Anthropic SDK
Pass-through endpoints for Anthropic - call the provider-specific endpoint in its native format (no translation).
| Feature | Supported | Notes | 
|---|---|---|
| Cost Tracking | ✅ | supports all models on the /messages endpoint | 
| Logging | ✅ | works across all integrations | 
| End-user Tracking | ✅ | disable Prometheus tracking via litellm.disable_end_user_cost_tracking_prometheus_only | 
| Streaming | ✅ | | 
Just replace https://api.anthropic.com with LITELLM_PROXY_BASE_URL/anthropic
Example Usage
- curl
- Anthropic Python SDK
curl --request POST \
  --url http://0.0.0.0:4000/anthropic/v1/messages \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header "Authorization: bearer sk-anything" \
  --data '{
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 1024,
        "messages": [
            {"role": "user", "content": "Hello, world"}
        ]
    }'
from anthropic import Anthropic
# Initialize client with proxy base URL
client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic", # <proxy-base-url>/anthropic
    api_key="sk-anything" # proxy virtual key
)
# Make a completion request
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, world"}
    ]
)
print(response)
Supports ALL Anthropic Endpoints (including streaming).
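Streaming goes over the same pass-through route. A minimal sketch with the Anthropic Python SDK, using the same illustrative proxy URL and key as above:
from anthropic import Anthropic

client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic", # <proxy-base-url>/anthropic
    api_key="sk-anything" # proxy virtual key
)

# Stream the response text as it is generated
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, world"}
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)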
Quick Start
Let's call the Anthropic /messages endpoint
- Add Anthropic API Key to your environment
export ANTHROPIC_API_KEY=""
- Start LiteLLM Proxy
litellm
# RUNNING on http://0.0.0.0:4000
- Test it!
Let's call the Anthropic /messages endpoint
curl http://0.0.0.0:4000/anthropic/v1/messages \
     --header "x-api-key: $LITELLM_API_KEY" \
     --header "anthropic-version: 2023-06-01" \
     --header "content-type: application/json" \
     --data \
    '{
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 1024,
        "messages": [
            {"role": "user", "content": "Hello, world"}
        ]
    }'
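Because this is a pass-through route, the response comes back in Anthropic's native /messages format. An abridged illustration of the shape (values below are placeholders, not real output):
{
    "id": "msg_...",
    "type": "message",
    "role": "assistant",
    "model": "claude-3-5-sonnet-20241022",
    "content": [
        {"type": "text", "text": "Hello! How can I help you today?"}
    ],
    "stop_reason": "end_turn",
    "usage": {"input_tokens": 10, "output_tokens": 12}
}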
Examples
Anything after http://0.0.0.0:4000/anthropic is treated as a provider-specific route, and handled accordingly.
Key Changes:
| Original Endpoint | Replace With | 
|---|---|
| https://api.anthropic.com | http://0.0.0.0:4000/anthropic (LITELLM_PROXY_BASE_URL="http://0.0.0.0:4000") | 
| bearer $ANTHROPIC_API_KEY | bearer anything (use bearer LITELLM_VIRTUAL_KEY if Virtual Keys are set up on the proxy) | 
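For the proxy calls below, it can help to export both values up front. The variable names mirror the ones the snippets reference; the values are illustrative, so adjust them for your deployment:
# Illustrative values - point these at your own proxy deployment
export LITELLM_PROXY_BASE_URL="http://0.0.0.0:4000"
export LITELLM_API_KEY="sk-anything" # or a virtual key, if Virtual Keys are set up on the proxy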
Example 1: Messages endpoint
LiteLLM Proxy Call
curl --request POST \
  --url http://0.0.0.0:4000/anthropic/v1/messages \
  --header "x-api-key: $LITELLM_API_KEY" \
    --header "anthropic-version: 2023-06-01" \
    --header "content-type: application/json" \
  --data '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Hello, world"}
    ]
  }'
Direct Anthropic API Call
curl https://api.anthropic.com/v1/messages \
     --header "x-api-key: $ANTHROPIC_API_KEY" \
     --header "anthropic-version: 2023-06-01" \
     --header "content-type: application/json" \
     --data \
    '{
        "model": "claude-3-5-sonnet-20241022",
        "max_tokens": 1024,
        "messages": [
            {"role": "user", "content": "Hello, world"}
        ]
    }'
Example 2: Token Counting API
LiteLLM Proxy Call
curl --request POST \
    --url http://0.0.0.0:4000/anthropic/v1/messages/count_tokens \
    --header "x-api-key: $LITELLM_API_KEY" \
    --header "anthropic-version: 2023-06-01" \
    --header "anthropic-beta: token-counting-2024-11-01" \
    --header "content-type: application/json" \
    --data \
    '{
        "model": "claude-3-5-sonnet-20241022",
        "messages": [
            {"role": "user", "content": "Hello, world"}
        ]
    }'
Direct Anthropic API Call
curl https://api.anthropic.com/v1/messages/count_tokens \
     --header "x-api-key: $ANTHROPIC_API_KEY" \
     --header "anthropic-version: 2023-06-01" \
     --header "anthropic-beta: token-counting-2024-11-01" \
     --header "content-type: application/json" \
     --data \
'{
    "model": "claude-3-5-sonnet-20241022",
    "messages": [
        {"role": "user", "content": "Hello, world"}
    ]
}'
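The same token count can be requested through the Anthropic Python SDK pointed at the proxy. This is a sketch and assumes an SDK version that exposes client.messages.count_tokens:
from anthropic import Anthropic

client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic",
    api_key="sk-anything"
)

# Count input tokens without creating a message
count = client.messages.count_tokens(
    model="claude-3-5-sonnet-20241022",
    messages=[
        {"role": "user", "content": "Hello, world"}
    ]
)
print(count.input_tokens)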
Example 3: Batch Messages
LiteLLM Proxy Call
curl --request POST \
    --url http://0.0.0.0:4000/anthropic/v1/messages/batches \
    --header "x-api-key: $LITELLM_API_KEY" \
    --header "anthropic-version: 2023-06-01" \
    --header "anthropic-beta: message-batches-2024-09-24" \
    --header "content-type: application/json" \
    --data \
'{
    "requests": [
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hello, world"}
                ]
            }
        },
        {
            "custom_id": "my-second-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hi again, friend"}
                ]
            }
        }
    ]
}'
Direct Anthropic API Call
curl https://api.anthropic.com/v1/messages/batches \
     --header "x-api-key: $ANTHROPIC_API_KEY" \
     --header "anthropic-version: 2023-06-01" \
     --header "anthropic-beta: message-batches-2024-09-24" \
     --header "content-type: application/json" \
     --data \
'{
    "requests": [
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hello, world"}
                ]
            }
        },
        {
            "custom_id": "my-second-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hi again, friend"}
                ]
            }
        }
    ]
}'
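The same batch submission can also go through the Anthropic Python SDK pointed at the proxy. A sketch, assuming an SDK version that exposes client.messages.batches:
from anthropic import Anthropic

client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic",
    api_key="sk-anything"
)

# Submit a batch of message requests
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hello, world"}]
            }
        },
        {
            "custom_id": "my-second-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hi again, friend"}]
            }
        }
    ]
)
print(batch.id) # poll this batch id later to retrieve results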
Advanced
Pre-requisites
Use this to avoid giving developers the raw Anthropic API key while still letting them use Anthropic endpoints.
Use with Virtual Keys
- Setup environment
export DATABASE_URL=""
export LITELLM_MASTER_KEY=""
export ANTHROPIC_API_KEY=""
litellm
# RUNNING on http://0.0.0.0:4000
- Generate virtual key
curl -X POST 'http://0.0.0.0:4000/key/generate' \
-H 'Authorization: Bearer sk-1234' \
-H 'Content-Type: application/json' \
-d '{}'
Expected Response
{
    ...
    "key": "sk-1234ewknldferwedojwojw"
}
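To script this step, one option is to capture the generated key straight into an environment variable (this assumes jq is installed; LITELLM_VIRTUAL_KEY is just an illustrative name):
export LITELLM_VIRTUAL_KEY=$(curl -s -X POST 'http://0.0.0.0:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{}' | jq -r '.key')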
- Test it!
curl --request POST \
  --url http://0.0.0.0:4000/anthropic/v1/messages \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header "Authorization: bearer sk-1234ewknldferwedojwojw" \
  --data '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Hello, world"}
    ]
  }'
Send litellm_metadata (tags, end-user cost tracking)
- curl
- Anthropic Python SDK
curl --request POST \
  --url http://0.0.0.0:4000/anthropic/v1/messages \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header "Authorization: bearer sk-anything" \
  --data '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Hello, world"}
    ],
    "litellm_metadata": {
        "tags": ["test-tag-1", "test-tag-2"], 
        "user": "test-user" # track end-user/customer cost
    }
  }'
from anthropic import Anthropic
client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic",
    api_key="sk-anything"
)
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, world"}
    ],
    extra_body={
        "litellm_metadata": {
            "tags": ["test-tag-1", "test-tag-2"], 
            "user": "test-user" # track end-user/customer cost
        }
    }, 
    ## OR ##
    metadata={ # anthropic native param - https://docs.anthropic.com/en/api/messages
        "user_id": "test-user" # track end-user/customer cost
    }
)
print(response)