Provider Options
The OpenResponses API lets you configure AI Gateway behavior using providerOptions. The gateway namespace gives you control over provider routing, fallbacks, and restrictions.
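For orientation, the gateway options shown in the examples on this page can be summarized with a rough TypeScript shape. This is an illustrative sketch only, based on the fields used below (models, order, only); it is not an official type definition.

// Illustrative sketch of the gateway provider options used in the examples below.
// Field names are taken from this page; this is not an official type definition.
interface GatewayProviderOptions {
  models?: string[]; // fallback chain of models, e.g. 'openai/gpt-5.2'
  order?: string[]; // preferred provider order, e.g. 'openai'
  only?: string[]; // restrict routing to these providers only
}

interface ProviderOptions {
  gateway?: GatewayProviderOptions;
}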
Set up automatic fallbacks so that, if your primary model is unavailable, requests are routed to backup models in order. Use the models array to specify the fallback chain.
const apiKey = process.env.AI_GATEWAY_API_KEY;
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'anthropic/claude-sonnet-4.5',
    input: [{ type: 'message', role: 'user', content: 'Tell me a fun fact about octopuses.' }],
    providerOptions: {
      gateway: {
        // Fallback chain: models are tried in order if the primary model is unavailable
        models: ['anthropic/claude-sonnet-4.5', 'openai/gpt-5.2', 'google/gemini-3-flash'],
      },
    },
  }),
});

Control the order in which providers are tried using the order array. AI Gateway will attempt providers in the specified order until one succeeds.
const apiKey = process.env.AI_GATEWAY_API_KEY;
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'google/gemini-3-flash',
    input: [{ type: 'message', role: 'user', content: 'Explain quantum computing in one sentence.' }],
    providerOptions: {
      gateway: {
        // Providers are attempted in this order until one succeeds
        order: ['google', 'openai', 'anthropic'],
      },
    },
  }),
});

Restrict requests to specific providers using the only array. This ensures your requests only go to approved providers, which can be useful for compliance or cost control.
const apiKey = process.env.AI_GATEWAY_API_KEY;
const response = await fetch('https://ai-gateway.vercel.sh/v1/responses', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${apiKey}`,
  },
  body: JSON.stringify({
    model: 'zai/glm-4.7',
    input: [{ type: 'message', role: 'user', content: 'What makes a great cup of coffee?' }],
    providerOptions: {
      gateway: {
        // Requests will only be routed to these approved providers
        only: ['zai', 'deepseek'],
      },
    },
  }),
});
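In all of these examples, the response is read like any other fetch result. The sketch below assumes the body follows an OpenAI-style Responses format (an output array of message items containing output_text content parts); adjust the parsing to the shape you actually receive.

// Minimal sketch of reading the result. The body shape is assumed to follow
// an OpenAI-style Responses format (an `output` array of message items with
// `output_text` content parts); adapt this to the actual response.
if (!response.ok) {
  throw new Error(`Gateway request failed: ${response.status}`);
}

const data = await response.json();
const text = (data.output ?? [])
  .flatMap((item: any) => item.content ?? [])
  .filter((part: any) => part.type === 'output_text')
  .map((part: any) => part.text)
  .join('');

console.log(text);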