CodePilot

Provider Configuration

Configure LLM providers to power CodePilot.

CodePilot supports multiple LLM providers. You can configure several providers simultaneously and use different models in different conversations.

Authentication Overview

CodePilot has two ways to obtain API credentials:

1. CLI Environment Authentication (Auto-Detected)

If you have the ANTHROPIC_API_KEY environment variable set in your shell, CodePilot automatically detects it on startup and uses it as a built-in provider.

export ANTHROPIC_API_KEY="sk-ant-..."

Note: Configurations changed via claude config set or Claude Code's /config command are not recognized by CodePilot. CodePilot only reads shell environment variables and does not share Claude Code CLI's internal configuration. If you switched accounts/keys in the CLI via cc switch or similar methods, you need to manually reconfigure the corresponding key in CodePilot's Settings > Providers.

After modifying environment variables, you need to restart CodePilot for changes to take effect.
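For example, to make the key survive new shell sessions (zsh shown here; use ~/.bashrc for bash):

```shell
# Append the export to the shell startup file so new sessions inherit it
echo 'export ANTHROPIC_API_KEY="sk-ant-..."' >> ~/.zshrc

# Load it into the current shell as well
source ~/.zshrc
```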

2. Manually Adding Providers

Manually add API keys in Settings > Providers. These credentials are stored in CodePilot's local database, independent of the CLI environment.

This is ideal for scenarios where you need multiple providers or non-Anthropic services.

Priority

When sending a message, CodePilot determines which provider to use in the following order:

  1. Conversation-specific — The provider manually selected in the conversation header
  2. Global default — The provider marked as "Default" in the provider list
  3. Environment variable — If no providers are configured, falls back to credentials from the shell environment

Supported Providers

Anthropic (Official)

Direct connection to the Anthropic API, using Claude models (Opus, Sonnet, Haiku).

  • Auth: API Key
  • Note: If you only use Anthropic, CLI environment authentication is sufficient; there is no need to add the provider manually

Anthropic (Third-Party Compatible)

Connect to third-party endpoints compatible with the Anthropic API format.

  • Auth: API Key or Auth Token, plus a custom Base URL. When adding the provider, you must select an authentication type:
    • API Key — Choose this if the key provided by the service starts with sk-, or the documentation explicitly labels it an API Key. Most providers use this method; it corresponds to the ANTHROPIC_API_KEY environment variable
    • Auth Token — Choose this if the service provides an OAuth token or another form of access token, typically not starting with sk-. Some subscription-based services (such as Kimi Coding Plan and 火山引擎 Ark) use this method; it corresponds to the ANTHROPIC_AUTH_TOKEN environment variable
    • If unsure, try API Key first; if authentication fails, switch to Auth Token
  • Model Mapping: Some third-party providers require their own model names (rather than Anthropic's original model names). If you encounter a model unavailable error, click More Options at the bottom of the configuration form and enter the provider's required model identifier in the Model Name field
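As a sanity check outside CodePilot, the two authentication types map to different request headers against an Anthropic-style Messages endpoint. The Base URL and model name below are placeholders; substitute your provider's actual values:

```shell
BASE_URL="https://api.example-provider.com"   # placeholder: your provider's endpoint

# API Key type: the key is sent as the x-api-key header
curl "$BASE_URL/v1/messages" \
  -H "x-api-key: $PROVIDER_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "provider-model-name", "max_tokens": 64,
       "messages": [{"role": "user", "content": "ping"}]}'

# Auth Token type: the token is sent as a Bearer token instead
curl "$BASE_URL/v1/messages" \
  -H "Authorization: Bearer $PROVIDER_TOKEN" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "provider-model-name", "max_tokens": 64,
       "messages": [{"role": "user", "content": "ping"}]}'
```

If the first request returns an authentication error and the second succeeds, configure the provider as Auth Token.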

Chinese Providers

CodePilot includes built-in configuration presets for major Chinese providers. After selecting one, the Base URL and default model are auto-filled:

Provider | Description | Auth Method
--- | --- | ---
智谱 GLM (Domestic) | 智谱 AI GLM series | API Key
智谱 GLM (International) | 智谱 AI international endpoint | API Key
Kimi Coding Plan | Moonshot Kimi coding edition | Auth Token
Moonshot | Moonshot API | API Key
MiniMax (Domestic) | MiniMax abab series | API Key
MiniMax (International) | MiniMax international endpoint | API Key
火山引擎 Ark | ByteDance Volcengine | Auth Token
阿里云百炼 Coding Plan | Alibaba Cloud Tongyi series | API Key

Auth Token type: Kimi Coding Plan and 火山引擎 Ark use ANTHROPIC_AUTH_TOKEN rather than ANTHROPIC_API_KEY for authentication. When adding them in CodePilot, the system automatically handles the authentication method — you just need to enter the key provided by the respective platform.
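In CLI-environment terms, an Auth Token provider corresponds to settings like the following (the Base URL is a hypothetical placeholder; use the endpoint from your platform):

```shell
# Auth Token providers authenticate via ANTHROPIC_AUTH_TOKEN,
# not ANTHROPIC_API_KEY (hypothetical Base URL shown)
export ANTHROPIC_BASE_URL="https://api.example-provider.com/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-token-here"
```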

OpenRouter

Access multiple model providers (Anthropic, OpenAI, Google, Meta, etc.) through OpenRouter's unified interface.

  • Auth: API Key
  • Advantage: One key to access multiple models, with automatic routing and failover

AWS Bedrock

Use Claude through AWS infrastructure.

  • Auth: Environment variables — requires AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION
  • Note: After adding in CodePilot, the system reads your AWS environment variables for authentication. No need to enter keys in the UI.
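A minimal sketch of the environment CodePilot expects for Bedrock (values are placeholders):

```shell
# Credentials CodePilot reads from the environment for Bedrock
export AWS_ACCESS_KEY_ID="AKIA..."         # placeholder
export AWS_SECRET_ACCESS_KEY="..."         # placeholder
export AWS_REGION="us-east-1"              # a region where Claude models are enabled
```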

Google Vertex

Use Claude and Gemini through Google Cloud.

  • Auth: Environment variables — requires Google Cloud service account credentials
  • Note: Similar to Bedrock, authenticates via environment variables
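One common way to supply service account credentials is Google's Application Default Credentials variable (the path below is an example):

```shell
# Point Google client libraries at a service account key file
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/service-account.json"  # example path
```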

Google Gemini (Image)

Gemini image generation API, used by the design Agent.

  • Auth: API Key
  • Note: This is a provider specifically for image generation, not for text conversations

Custom API (OpenAI-Compatible)

Any endpoint compatible with the OpenAI Chat Completions API.

  • Auth: API Key + Base URL
  • Use case: Connect local models (Ollama, LM Studio, vLLM) or other third-party proxies

LiteLLM

Unified proxy supporting 100+ LLM providers.

  • Auth: API Key + Base URL

Adding a Provider

  1. Open Settings > Providers
  2. Click Add Provider
  3. Select the provider type (or a Chinese provider preset)
  4. Enter credentials:
    • API Key type: Paste the key
    • Custom endpoint: Also enter the Base URL
    • Environment variable type (Bedrock / Vertex): Ensure environment variables are set
  5. Select a default model
  6. Click Save

Switching Providers

  • Select from the provider picker in the conversation header
  • Each conversation remembers the provider used
  • You can switch mid-conversation; subsequent messages will use the new provider
  • Click Set as Default in the provider list to set the global default

FAQ

Environment variables are set but CodePilot doesn't detect them

  • Confirm the environment variables are available in the shell environment when CodePilot starts
  • If set via .zshrc / .bashrc, make sure you restarted CodePilot (not just refreshed) after the change
  • Apps launched via macOS Launchpad may not inherit terminal environment variables — try launching from the terminal or manually adding the provider
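To diagnose this, inspect the environment of the shell you launch CodePilot from:

```shell
# Prints the variable if present; otherwise reports that it is missing
env | grep ANTHROPIC_API_KEY || echo "ANTHROPIC_API_KEY not set in this shell"
```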

API key is valid but requests fail

  • Check if the account has sufficient balance
  • Check if the key has model access permissions
  • For Chinese providers, check if the corresponding API endpoint is reachable from your network
  • For AWS Bedrock, check if IAM permissions include bedrock:InvokeModel

Conversation issues after switching providers

  • Different providers have different context window sizes; switching may cause errors if the context is too long
  • Some providers do not support all Claude Code features (such as tool use); certain operations may be unavailable after switching

How to use local models

Select the Custom API (OpenAI-Compatible) type, and enter the local service address as the Base URL (e.g., http://localhost:11434/v1). The API key can be any value (local services typically don't validate it).
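As a sketch, assuming a local Ollama instance with a pulled model (the model name below is an example), you can smoke-test the OpenAI-compatible endpoint with curl before pointing CodePilot at it:

```shell
# Ollama exposes an OpenAI-compatible API under /v1;
# the Bearer value is arbitrary since Ollama does not validate it
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ollama" \
  -d '{
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```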