Gateway (recommended)
The gateway is the primary way to configure LLM access. All built-in providers (Anthropic, OpenAI, Google, xAI, and 20+ more) are available through the gateway with no per-provider key setup.

Polpo managed (default)
On Polpo Cloud, the managed gateway is enabled by default. No configuration needed — deploy your agents and they work immediately.

Custom gateway
Use your own OpenAI-compatible gateway endpoint. Configure it in the dashboard or in project settings:

- Dashboard (cloud)
- polpo.json (self-hosted)
Go to LLM Keys in the sidebar, select Custom gateway, enter your gateway URL, API key, and optional headers.
Gateway configuration
| Field | Type | Description |
|---|---|---|
| url | string | Gateway endpoint URL |
| apiKey | string | API key (set via env var or BYOK, not in plaintext config) |
| headers | object | Custom headers sent with every LLM request |
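As a sketch, a custom-gateway block in polpo.json might look like the following. The field names come from the table above, but the top-level "gateway" key, the environment-variable interpolation syntax, and the x-org-id header are assumptions for illustration, not documented behavior:

```json
{
  "gateway": {
    "url": "https://ai-gateway.vercel.sh/v1",
    "apiKey": "${LLM_GATEWAY_API_KEY}",
    "headers": {
      "x-org-id": "acme"
    }
  }
}
```

Keeping apiKey as an environment reference (rather than a literal value) matches the table's note that keys should not live in plaintext config.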
Supported gateways
| Gateway | URL pattern | Notes |
|---|---|---|
| Vercel AI Gateway | https://ai-gateway.vercel.sh/v1 | Failover, cost tracking, caching |
| OpenRouter | https://openrouter.ai/api/v1 | 100+ models from a single key |
| LiteLLM | https://your-litellm-proxy.com/v1 | Self-hosted proxy, 100+ providers |
| Ollama | http://localhost:11434/v1 | Local models, no API key needed |
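For a local Ollama setup, the same configuration shape can omit the key entirely, since Ollama needs none (same caveat as above: the top-level "gateway" key in polpo.json is an assumption):

```json
{
  "gateway": {
    "url": "http://localhost:11434/v1"
  }
}
```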
Provider keys (BYOK)
For direct provider access — bypassing the gateway — you can set individual provider API keys. This is useful when you want to use your own provider accounts, need a specific provider feature not available through the gateway, or are running locally.

Set a key
- CLI
- Dashboard
- Environment variable
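For the environment-variable route, a minimal sketch using the providers' standard variable names (the key values are placeholders; the exact CLI and dashboard flows are covered by the other tabs):

```shell
# Standard provider environment variables, picked up at runtime
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export OPENAI_API_KEY="sk-placeholder"
```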
Supported providers
Polpo supports 20+ LLM providers natively, including Anthropic, OpenAI, Google, xAI, OpenRouter, Groq, Mistral, Cerebras, MiniMax, Amazon Bedrock, Azure OpenAI, Google Vertex, and Hugging Face. See the full list with slugs and model examples in the LLM & Providers reference.

Local development

For local development, set provider keys as environment variables.

Deploy syncs keys

When you run `polpo deploy`, local environment keys are detected and pushed to the cloud automatically.
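Putting the two steps together — assuming a key is already exported in the local shell, a deploy picks it up with no extra flags (`polpo deploy` is the command named above; the key value is a placeholder):

```shell
export ANTHROPIC_API_KEY="sk-ant-placeholder"
polpo deploy   # detects ANTHROPIC_API_KEY and pushes it to the cloud
```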
How it works at runtime
With gateway (default)
- Agent calls the LLM with its configured model (e.g. anthropic/claude-sonnet-4-5)
- The gateway routes the request to the correct provider
- Streaming response is returned to the agent
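The gateway speaks the OpenAI-compatible chat-completions shape, so the routing step can be illustrated by the request payload alone. A minimal sketch — the provider/model slug is from this page, while the payload layout follows the OpenAI convention and is an assumption here:

```python
import json

# Sketch of the request an agent sends through the gateway. The
# "provider/model" slug tells the gateway which upstream to route to.
GATEWAY_URL = "https://ai-gateway.vercel.sh/v1"  # example from the table above

payload = {
    "model": "anthropic/claude-sonnet-4-5",  # gateway routes on the prefix
    "stream": True,                          # response streams back to the agent
    "messages": [{"role": "user", "content": "Hello"}],
}

# The part before the slash is what routing keys on
provider = payload["model"].split("/", 1)[0]
print(provider)
print(json.dumps(payload, indent=2))
```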
With provider keys (BYOK)
- Agent calls the LLM with its configured model
- Polpo decrypts the provider key and injects it into the sandbox
- The LLM call uses your key directly — no proxy
- When the sandbox is destroyed, the decrypted key is discarded
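The BYOK steps above can be sketched as an environment-only key hand-off. This is an illustrative pattern, not Polpo's actual sandbox implementation, and the key value is a placeholder:

```python
import os
import subprocess
import sys

# Illustrative only: the decrypted key lives in the child process's
# environment, is never written to disk, and disappears with the process.
decrypted_key = "sk-ant-placeholder"  # stand-in for the decrypted provider key

child_env = {**os.environ, "ANTHROPIC_API_KEY": decrypted_key}
result = subprocess.run(
    [sys.executable, "-c", "import os; print('ANTHROPIC_API_KEY' in os.environ)"],
    env=child_env, capture_output=True, text=True,
)
print(result.stdout.strip())  # the child process saw the injected key

# Once the "sandbox" exits, the parent drops its references; nothing persisted.
del decrypted_key, child_env
```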