Vercel AI Gateway provides a unified endpoint for routing to multiple LLM providers with built-in observability, caching, and rate limiting.

Setup

Set your Vercel AI Gateway API key in your environment:

export AI_GATEWAY_API_KEY=...
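Before starting the tool, it can help to confirm the key is actually exported. A minimal sanity check (illustrative only, not part of the tool itself):

```shell
# Illustrative check: confirm AI_GATEWAY_API_KEY is visible to child processes.
if [ -z "${AI_GATEWAY_API_KEY:-}" ]; then
  echo "AI_GATEWAY_API_KEY is not set" >&2
else
  echo "AI_GATEWAY_API_KEY is configured"
fi
```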

Config

Reference the exported key in your provider configuration:

{
  "providers": {
    "vercel-ai-gateway": "${AI_GATEWAY_API_KEY}"
  }
}

Use it

Point an agent at a model served through the gateway, using the provider:model form:

{
  "agents": [
    { "name": "coder", "model": "vercel-ai-gateway:my-model" }
  ]
}
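The model string above pairs the provider ID with a model name, separated by the first colon. A small shell sketch of how such a reference splits (illustration only; the actual parsing happens inside the tool):

```shell
# Split a "provider:model" reference at the first colon (illustration).
ref="vercel-ai-gateway:my-model"
provider="${ref%%:*}"   # text before the first ":"
model="${ref#*:}"       # text after the first ":"
echo "$provider"        # vercel-ai-gateway
echo "$model"           # my-model
```

Model names containing colons still resolve correctly, since only the first colon separates provider from model.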

Provider Details

Provider ID: vercel-ai-gateway
Env variable: AI_GATEWAY_API_KEY
API type: Vercel AI Gateway

Notes

  • Available models depend on your Vercel AI Gateway configuration.
  • Useful if you’re already on the Vercel platform and want centralized LLM management.