Getting Started
This quickstart will walk you through making an AI model request with Vercel's AI Gateway. While this guide uses the AI SDK, you can also integrate with the OpenAI SDK, Anthropic SDK, OpenResponses API, or other community frameworks.
Start by creating a new directory using the `mkdir` command. Change into your new directory, then run `pnpm init` to create a `package.json`:

```shell
mkdir demo
cd demo
pnpm init
```

Install the AI SDK package, `ai`, along with the other necessary dependencies:

```shell
pnpm add ai dotenv @types/node tsx typescript
```

You can also use `npm install`, `yarn add`, or `bun add` with the same package list. `dotenv` is used to access environment variables (your AI Gateway API key) within your application. The `tsx` package is a TypeScript runner that lets you run your TypeScript code, `typescript` is the TypeScript compiler, and `@types/node` provides the TypeScript definitions for the Node.js API.

Go to the AI Gateway API Keys page in your Vercel dashboard and click Create key to generate a new API key.
Once you have the API key, create a `.env.local` file and save your API key there:

```shell
AI_GATEWAY_API_KEY=your_ai_gateway_api_key
```

Instead of using an API key, you can use OIDC tokens to authenticate your requests.
The AI Gateway provider will default to using the `AI_GATEWAY_API_KEY` environment variable.

Create an `index.ts` file in the root of your project and add the following code:

```typescript
import { streamText } from 'ai';
import 'dotenv/config';

async function main() {
  const result = streamText({
    model: 'openai/gpt-5.2',
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  for await (const textPart of result.textStream) {
    process.stdout.write(textPart);
  }

  console.log();
  console.log('Token usage:', await result.usage);
  console.log('Finish reason:', await result.finishReason);
}

main().catch(console.error);
```

Now, run your script:

```shell
pnpm tsx index.ts
```

You should see the AI model's response to your prompt.
Continue with the AI SDK documentation to learn about configuration options, provider and model routing with fallbacks, and integration examples.
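Model identifiers throughout this guide follow a `provider/model` convention (for example, `openai/gpt-5.2` above and `anthropic/claude-sonnet-4.5` below). If you switch models programmatically, a minimal sketch of working with these strings (the helper is illustrative, not part of the AI SDK — the Gateway does this routing for you):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a 'provider/model' identifier into its two parts.

    Illustrative helper only; the AI Gateway resolves these
    identifiers to the underlying provider itself.
    """
    provider, _, model = model_id.partition('/')
    return provider, model


provider, model = split_model_id('openai/gpt-5.2')
# provider is 'openai', model is 'gpt-5.2'
```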
The AI Gateway provides OpenAI-compatible API endpoints that let you use your existing OpenAI client libraries and tools.
The OpenAI-compatible API includes:
- Model Management: List and retrieve the available models
- Chat Completions: Create chat completions that support streaming, images, and file attachments
- Tool Calls: Call functions with automatic or explicit tool selection
- Existing Tool Integration: Use your existing OpenAI client libraries and tools without needing modifications
- Multiple Languages: Use the OpenAI SDK in TypeScript and Python, or any language via the REST API
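The chat completions request used in the examples below is a plain JSON payload; a minimal sketch of its shape (the builder function is illustrative, not part of any SDK — real requests go through the OpenAI SDK or an HTTP POST):

```python
def build_chat_request(model: str, user_message: str) -> dict:
    """Build a minimal OpenAI-style chat completions payload.

    Illustrative helper showing the request shape used in the
    examples below; it performs no network calls.
    """
    return {
        'model': model,
        'messages': [
            {'role': 'user', 'content': user_message},
        ],
    }


payload = build_chat_request(
    'anthropic/claude-sonnet-4.5',
    'Invent a new holiday and describe its traditions.',
)
```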
TypeScript:

```typescript
import OpenAI from 'openai';
import 'dotenv/config';

const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'anthropic/claude-sonnet-4.5',
    messages: [
      {
        role: 'user',
        content: 'Invent a new holiday and describe its traditions.',
      },
    ],
  });

  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```

Python:

```python
import os

from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

client = OpenAI(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh/v1',
)

response = client.chat.completions.create(
    model='anthropic/claude-sonnet-4.5',
    messages=[
        {
            'role': 'user',
            'content': 'Invent a new holiday and describe its traditions.',
        },
    ],
)

print(response.choices[0].message.content)
```

Learn more about using the OpenAI SDK with the AI Gateway in the OpenAI-Compatible API page.
The AI Gateway provides Anthropic-compatible API endpoints that let you use the Anthropic SDK and tools like Claude Code.
The Anthropic-compatible API includes:
- Messages API: Create messages with support for streaming and multi-turn conversations
- Tool Calls: Call functions with automatic or explicit tool selection
- Extended Thinking: Enable extended thinking for complex reasoning tasks
- File Attachments: Attach files and images to your messages
- Multiple Languages: Use the Anthropic SDK in TypeScript and Python, or any language via the REST API
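The Messages API payload is similar to the chat completions shape, with one notable difference visible in the examples below: an explicit `max_tokens` value is part of the request. A minimal sketch of the shape (the builder function is illustrative, not part of the Anthropic SDK):

```python
def build_messages_request(
    model: str, user_message: str, max_tokens: int = 1024
) -> dict:
    """Build a minimal Anthropic-style Messages API payload.

    Illustrative helper showing the request shape used in the
    examples below; it performs no network calls.
    """
    return {
        'model': model,
        'max_tokens': max_tokens,
        'messages': [
            {'role': 'user', 'content': user_message},
        ],
    }


payload = build_messages_request(
    'anthropic/claude-sonnet-4.5',
    'Invent a new holiday and describe its traditions.',
)
```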
TypeScript:

```typescript
import Anthropic from '@anthropic-ai/sdk';
import 'dotenv/config';

const client = new Anthropic({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh',
});

async function main() {
  const message = await client.messages.create({
    model: 'anthropic/claude-sonnet-4.5',
    max_tokens: 1024,
    messages: [
      {
        role: 'user',
        content: 'Invent a new holiday and describe its traditions.',
      },
    ],
  });

  console.log(message.content[0].text);
}

main().catch(console.error);
```

Python:

```python
import os

import anthropic
from dotenv import load_dotenv

load_dotenv()

client = anthropic.Anthropic(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh',
)

message = client.messages.create(
    model='anthropic/claude-sonnet-4.5',
    max_tokens=1024,
    messages=[
        {
            'role': 'user',
            'content': 'Invent a new holiday and describe its traditions.',
        },
    ],
)

print(message.content[0].text)
```

Learn more about using the Anthropic SDK with the AI Gateway in the Anthropic-Compatible API page.
The OpenResponses API is an open standard for AI model interactions that provides a unified, provider-agnostic interface with built-in support for streaming, tool calling, and reasoning.
The OpenResponses API includes:
- Text Generation: Generate text responses from prompts
- Streaming: Stream tokens as they're generated
- Tool Calling: Define tools the model can call
- Reasoning: Enable extended thinking for complex tasks
- Provider Options: Configure model fallbacks and provider-specific settings
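The OpenResponses request body differs from the chat completions shape in that it takes a typed `input` array rather than a `messages` array; a simple user turn is a `message` item, as the examples below show. A minimal sketch of the shape (the builder function is illustrative):

```python
def build_responses_request(model: str, user_message: str) -> dict:
    """Build a minimal OpenResponses API payload.

    Input items are typed; a plain user turn is an item with
    type 'message'. Illustrative helper; no network calls.
    """
    return {
        'model': model,
        'input': [
            {
                'type': 'message',
                'role': 'user',
                'content': user_message,
            },
        ],
    }


payload = build_responses_request(
    'anthropic/claude-sonnet-4.5',
    'Invent a new holiday and describe its traditions.',
)
```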
TypeScript:

```typescript
import 'dotenv/config';

async function main() {
  const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.AI_GATEWAY_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'anthropic/claude-sonnet-4.5',
      input: [
        {
          type: 'message',
          role: 'user',
          content: 'Invent a new holiday and describe its traditions.',
        },
      ],
    }),
  });

  const result = await response.json();
  console.log(result.output[0].content[0].text);
}

main().catch(console.error);
```

Python:

```python
import os

import requests
from dotenv import load_dotenv

load_dotenv()

response = requests.post(
    'https://ai-gateway.vercel.sh/v1/responses',
    headers={
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {os.getenv("AI_GATEWAY_API_KEY")}',
    },
    json={
        'model': 'anthropic/claude-sonnet-4.5',
        'input': [
            {
                'type': 'message',
                'role': 'user',
                'content': 'Invent a new holiday and describe its traditions.',
            },
        ],
    },
)

result = response.json()
print(result['output'][0]['content'][0]['text'])
```

cURL:

```shell
curl -X POST "https://ai-gateway.vercel.sh/v1/responses" \
  -H "Authorization: Bearer $AI_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4.5",
    "input": [
      {
        "type": "message",
        "role": "user",
        "content": "Invent a new holiday and describe its traditions."
      }
    ]
  }'
```

Learn more about the OpenResponses API in the OpenResponses API documentation.
AI Gateway works with any framework that supports the OpenAI API or AI SDK v5/v6, and also supports tools like Claude Code.
See the framework integrations section to learn more about using AI Gateway with community frameworks.
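Note that the examples in this guide point their clients at slightly different URLs depending on the protocol. A quick reference, taken directly from the snippets above:

```python
# Endpoints used by the examples in this guide.
AI_GATEWAY_ENDPOINTS = {
    # baseURL for OpenAI SDK clients
    'openai': 'https://ai-gateway.vercel.sh/v1',
    # baseURL for Anthropic SDK clients (no /v1 suffix)
    'anthropic': 'https://ai-gateway.vercel.sh',
    # full URL POSTed to directly for the OpenResponses API
    'openresponses': 'https://ai-gateway.vercel.sh/v1/responses',
}
```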