This guide shows how to get Claude Code working with an [LLM gateway](https://code.claude.com/docs/en/llm-gateway) using [LiteLLM Proxy](https://docs.litellm.ai/docs/proxy/quick_start).
## Configuring the Proxy
Create a `config.yaml` for the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/configs). Models are grouped by provider prefix for clarity.
*For brevity, not all models are shown.*
```yaml
litellm_settings:
  drop_params: true
  cache: true
  cache_params:
    type: redis
    host: os.environ/REDIS_HOST
    port: os.environ/REDIS_PORT

general_settings:
  store_model_in_db: true
  master_key: os.environ/LITELLM_MASTER_KEY
  database_url: os.environ/DATABASE_URL
  forward_client_headers_to_llm_api: true

model_list:
  # ── Claude Max (cm/) — OAuth token forwarded from client ──────────────────
  - model_name: cm/claude-opus-4-6
    litellm_params:
      model: anthropic/claude-opus-4-6
    model_info:
      supports_function_calling: true
  - model_name: cm/claude-sonnet-4-6
    litellm_params:
      model: anthropic/claude-sonnet-4-6
    model_info:
      supports_function_calling: true
  - model_name: cm/claude-haiku-4-5
    litellm_params:
      model: anthropic/claude-haiku-4-5
    model_info:
      supports_function_calling: true
  # ── GitHub Copilot (gc/) ──────────────────────────────────────────────────
  - model_name: gc/claude-opus-4.6
    litellm_params:
      model: github_copilot/claude-opus-4.6
      extra_headers: {"Editor-Version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}
  - model_name: gc/claude-sonnet-4.6
    litellm_params:
      model: github_copilot/claude-sonnet-4.6
      extra_headers: {"Editor-Version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}
  - model_name: gc/claude-haiku-4.5
    litellm_params:
      model: github_copilot/claude-haiku-4.5
      extra_headers: {"Editor-Version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}
  - model_name: gc/gpt-5-mini
    litellm_params:
      model: github_copilot/gpt-5-mini
      extra_headers: {"Editor-Version": "vscode/1.85.1", "Copilot-Integration-Id": "vscode-chat"}
```
## Running with Docker Compose
The easiest way to run LiteLLM locally with Postgres and Redis is with Docker Compose. Create a `docker-compose.yml` alongside your `config.yaml`:
```yaml
services:
  litellm:
    image: docker.litellm.ai/berriai/litellm:main-latest
    command: ["--config", "/app/config.yaml", "--port", "4000"]
    ports:
      - "127.0.0.1:4000:4000"
    environment:
      LITELLM_MASTER_KEY: "${LITELLM_MASTER_KEY}"
      DATABASE_URL: "postgresql://postgres:${POSTGRES_PASSWORD}@postgres:5432/litellm"
      REDIS_HOST: redis
      REDIS_PORT: "6379"
    volumes:
      - ./config.yaml:/app/config.yaml:ro
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_healthy

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: litellm
      POSTGRES_PASSWORD: "${POSTGRES_PASSWORD}"
    volumes:
      - litellm-postgres:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "postgres"]
      interval: 5s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7-alpine
    volumes:
      - litellm-redis:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 3s
      retries: 5

volumes:
  litellm-postgres:
  litellm-redis:
```
Store secrets in a `.env` file (never commit this):
```ini
LITELLM_MASTER_KEY=sk-...your-master-key...
POSTGRES_PASSWORD=...strong-password...
```
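If you don't have secrets handy, one way to generate them (assuming `openssl` is on your PATH; LiteLLM expects the master key to start with `sk-`):

```zsh
# Append a freshly generated master key and Postgres password to .env
echo "LITELLM_MASTER_KEY=sk-$(openssl rand -hex 24)" >> .env
echo "POSTGRES_PASSWORD=$(openssl rand -hex 24)" >> .env
```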
Then start everything:
```zsh
docker compose up -d
```
The proxy will be available at `http://localhost:4000`, and the admin UI at <http://localhost:4000/ui>.
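To check that the stack came up correctly, you can hit the proxy's liveness endpoint and list the model aliases it exposes. This sketch assumes the compose stack is running, `LITELLM_MASTER_KEY` is exported in your shell, and `jq` is installed:

```zsh
# Liveness probe (no auth required)
curl -s http://localhost:4000/health/liveliness

# List registered model aliases (cm/... and gc/...) via the OpenAI-compatible endpoint
curl -s http://localhost:4000/v1/models \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" | jq -r '.data[].id'
```

If the second command prints the `cm/` and `gc/` names from `config.yaml`, the proxy loaded your model list.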
## Configuring Claude
Next, configure your Claude Code settings so requests go through the LiteLLM proxy. I set the following environment variables for Claude Code:
```zsh
# Read the LiteLLM proxy master key from the macOS Keychain
export ANTHROPIC_AUTH_TOKEN="$(security find-generic-password -s litellm-proxy -a claude-code -w)"
export ANTHROPIC_BASE_URL=http://localhost:4000
export ANTHROPIC_MODEL=cm/claude-sonnet-4-6
export ANTHROPIC_SMALL_FAST_MODEL=gc/gpt-5-mini
```
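Before launching Claude Code, you can sanity-check the same path it will use. LiteLLM exposes an Anthropic-style `/v1/messages` endpoint, so a quick request against one of the aliases from `config.yaml` should return a completion (this assumes the proxy is running and `ANTHROPIC_AUTH_TOKEN` is exported as above):

```zsh
# Send a minimal Anthropic-format request through the proxy
curl -s http://localhost:4000/v1/messages \
  -H "Authorization: Bearer $ANTHROPIC_AUTH_TOKEN" \
  -H "content-type: application/json" \
  -d '{"model": "cm/claude-haiku-4-5", "max_tokens": 64,
       "messages": [{"role": "user", "content": "ping"}]}'
```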
Once everything is set, run `claude` and your requests will route through the proxy!
## References & Links
- [Using Claude Code with GitHub Copilot: A Guide](https://blog.f12.no/wp/2025/09/22/using-claude-code-with-github-copilot-a-guide/)
- [claude-code-over-github-copilot](https://github.com/kjetiljd/claude-code-over-github-copilot)