Anthropic-Compatible API
AI Gateway provides Anthropic-compatible API endpoints, so you can use the Anthropic SDK and tools like Claude Code through a unified gateway with only a URL change.
The Anthropic-compatible API implements the same specification as the Anthropic Messages API.
For more on using AI Gateway with Claude Code, see the Claude Code instructions.
The Anthropic-compatible API is available at the following base URL:
https://ai-gateway.vercel.sh
The Anthropic-compatible API supports the same authentication methods as the main AI Gateway:
- API key: Use your AI Gateway API key with the `x-api-key` header or the `Authorization: Bearer <token>` header
- OIDC token: Use your Vercel OIDC token with the `Authorization: Bearer <token>` header
You only need to use one of these forms of authentication. If an API key is specified, it takes precedence over any OIDC token, even if the API key is invalid.
The AI Gateway supports the following Anthropic-compatible endpoint:
- `POST /v1/messages` - Create messages with support for streaming, tool calls, extended thinking, and file attachments
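For a quick illustration of the endpoint and authentication together, a direct request might look like the sketch below. The prompt and `max_tokens` value are placeholders, and the `anthropic-version` header is included only for parity with the Anthropic Messages API spec; the gateway may not require it. Swap `x-api-key` for `Authorization: Bearer <token>` when authenticating with an OIDC token.

```typescript
// Sketch: a raw call to the messages endpoint using an AI Gateway API key.
const response = await fetch('https://ai-gateway.vercel.sh/v1/messages', {
  method: 'POST',
  headers: {
    'content-type': 'application/json',
    // API-key auth; for OIDC, use `Authorization: Bearer <token>` instead.
    'x-api-key': process.env.AI_GATEWAY_API_KEY ?? '',
    // Sent for parity with the Anthropic Messages API; may not be required by the gateway.
    'anthropic-version': '2023-06-01',
  },
  body: JSON.stringify({
    model: 'anthropic/claude-sonnet-4.5',
    max_tokens: 256,
    messages: [{ role: 'user', content: 'Hello, world!' }],
  }),
});

console.log(await response.json());
```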
For advanced features, see:
- Advanced features - Extended thinking and web search
Claude Code is Anthropic's agentic coding tool. You can configure it to use Vercel AI Gateway, enabling you to:
- Route requests through multiple AI providers
- Monitor traffic and spend in your AI Gateway Overview
- View detailed traces in Vercel Observability under AI
- Use any model available through the gateway
Configure Claude Code to use the AI Gateway by setting these environment variables:
| Variable | Value |
| --- | --- |
| `ANTHROPIC_BASE_URL` | `https://ai-gateway.vercel.sh` |
| `ANTHROPIC_AUTH_TOKEN` | Your AI Gateway API key |
| `ANTHROPIC_API_KEY` | `""` (empty string) |

Setting `ANTHROPIC_API_KEY` to an empty string is important. Claude Code checks this variable first, and if it's set to a non-empty value, it will use that instead of `ANTHROPIC_AUTH_TOKEN`.

Add this alias to your `~/.zshrc` (or `~/.bashrc`):

```bash
alias claude-vercel='ANTHROPIC_BASE_URL="https://ai-gateway.vercel.sh" ANTHROPIC_AUTH_TOKEN="your-api-key-here" ANTHROPIC_API_KEY="" claude'
```

Then reload your shell:

```bash
source ~/.zshrc
```

For more flexibility (e.g., adding additional logic), create a wrapper script at `~/bin/claude-vercel`:

```bash
#!/usr/bin/env bash
# Routes Claude Code through Vercel AI Gateway
ANTHROPIC_BASE_URL="https://ai-gateway.vercel.sh" \
ANTHROPIC_AUTH_TOKEN="your-api-key-here" \
ANTHROPIC_API_KEY="" \
claude "$@"
```

Make it executable and ensure `~/bin` is in your PATH:

```bash
mkdir -p ~/bin
chmod +x ~/bin/claude-vercel
echo 'export PATH="$HOME/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
```

Run `claude-vercel` to start Claude Code with AI Gateway:

```bash
claude-vercel
```

Your requests will now be routed through Vercel AI Gateway.
You can override the default models that Claude Code uses by setting additional environment variables:
```bash
ANTHROPIC_DEFAULT_SONNET_MODEL="kwaipilot/kat-coder-pro-v1"
ANTHROPIC_DEFAULT_OPUS_MODEL="zai/glm-4.7"
ANTHROPIC_DEFAULT_HAIKU_MODEL="minimax/minimax-m2.1"
```

This allows you to use any model available through the AI Gateway while still using Claude Code's familiar interface.
Models vary widely in their support for tools, extended thinking, and other features that Claude Code relies on. Performance may differ significantly depending on the model and provider you select.
You can use the AI Gateway's Anthropic-compatible API with the official Anthropic SDK. Point your client to the AI Gateway's base URL and use your AI Gateway API key or OIDC token for authentication.
The examples and content in this section are not comprehensive. For complete documentation on available parameters, response formats, and advanced features, refer to the Anthropic Messages API documentation.
```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh',
});

const message = await anthropic.messages.create({
  model: 'anthropic/claude-sonnet-4.5',
  max_tokens: 1024,
  messages: [{ role: 'user', content: 'Hello, world!' }],
});
```

```python
import os
import anthropic

client = anthropic.Anthropic(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh',
)

message = client.messages.create(
    model='anthropic/claude-sonnet-4.5',
    max_tokens=1024,
    messages=[
        {'role': 'user', 'content': 'Hello, world!'}
    ],
)
```

The messages endpoint supports the following parameters:
- `model` (string): The model to use (e.g., `anthropic/claude-sonnet-4.5`)
- `max_tokens` (integer): Maximum number of tokens to generate
- `messages` (array): Array of message objects with `role` and `content` fields
- `stream` (boolean): Whether to stream the response. Defaults to `false`
- `temperature` (number): Controls randomness in the output. Range: 0-1
- `top_p` (number): Nucleus sampling parameter. Range: 0-1
- `top_k` (integer): Top-k sampling parameter
- `stop_sequences` (array): Stop sequences for the generation
- `tools` (array): Array of tool definitions for function calling
- `tool_choice` (object): Controls which tools are called
- `thinking` (object): Extended thinking configuration
- `system` (string or array): System prompt
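As a non-exhaustive sketch of how a few of these parameters combine, the example below uses the SDK's streaming helper together with `temperature` and `system`; the prompt and parameter values are placeholders.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh',
});

// Streaming request that also sets a temperature and a system prompt.
const stream = anthropic.messages.stream({
  model: 'anthropic/claude-sonnet-4.5',
  max_tokens: 512,
  temperature: 0.7,
  system: 'You are a concise assistant.',
  messages: [{ role: 'user', content: 'Summarize what an AI gateway does.' }],
});

// Print text deltas as they arrive, then inspect the completed message.
stream.on('text', (text) => process.stdout.write(text));

const finalMessage = await stream.finalMessage();
console.log('\nstop_reason:', finalMessage.stop_reason);
```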
The API returns standard HTTP status codes and error responses:
- `400 Bad Request`: Invalid request parameters
- `401 Unauthorized`: Invalid or missing authentication
- `403 Forbidden`: Insufficient permissions
- `404 Not Found`: Model or endpoint not found
- `429 Too Many Requests`: Rate limit exceeded
- `500 Internal Server Error`: Server error
```json
{
  "type": "error",
  "error": {
    "type": "invalid_request_error",
    "message": "Invalid request: missing required parameter 'max_tokens'"
  }
}
```
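For illustration, a minimal sketch of catching these errors with the official TypeScript SDK, using the same client setup as the example above; how you retry or surface them is up to your application.

```typescript
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh',
});

try {
  await anthropic.messages.create({
    model: 'anthropic/claude-sonnet-4.5',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello, world!' }],
  });
} catch (err) {
  if (err instanceof Anthropic.APIError) {
    // status corresponds to the HTTP codes listed above;
    // the message comes from the error response body shown above.
    console.error(err.status, err.message);
  } else {
    throw err;
  }
}
```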