Agents need access to an LLM to work. Polpo routes all LLM calls through a gateway — a single endpoint that handles provider routing, failover, and cost tracking. You can use the Polpo managed gateway (default, zero config) or bring your own. The gateway is the primary way to configure LLM access. All built-in providers (Anthropic, OpenAI, Google, xAI, and 20+ more) are available through the gateway with no per-provider key setup.

Polpo managed (default)

On Polpo Cloud, the managed gateway is enabled by default. No configuration needed — deploy your agents and they work immediately.

Custom gateway

Use your own OpenAI-compatible gateway endpoint. Configure it in the dashboard or in project settings:
Go to LLM Keys in the sidebar, select Custom gateway, enter your gateway URL, API key, and optional headers.

Gateway configuration

| Field | Type | Description |
| --- | --- | --- |
| `url` | string | Gateway endpoint URL |
| `apiKey` | string | API key (set via env var or BYOK, not in plaintext config) |
| `headers` | object | Custom headers sent with every LLM request |
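Using the `settings.gateway` shape shown in the local-development example, a custom gateway configuration might look like this (the LiteLLM URL is the placeholder from the table below, and the header name is purely illustrative; the API key is supplied via the dashboard or an env var rather than written into the file):

```json
{
  "settings": {
    "gateway": {
      "url": "https://your-litellm-proxy.com/v1",
      "headers": {
        "x-request-source": "polpo"
      }
    }
  }
}
```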

Supported gateways

| Gateway | URL pattern | Notes |
| --- | --- | --- |
| Vercel AI Gateway | `https://ai-gateway.vercel.sh/v1` | Failover, cost tracking, caching |
| OpenRouter | `https://openrouter.ai/api/v1` | 100+ models from a single key |
| LiteLLM | `https://your-litellm-proxy.com/v1` | Self-hosted proxy, 100+ providers |
| Ollama | `http://localhost:11434/v1` | Local models, no API key needed |
Any OpenAI-compatible endpoint works as a gateway.

Provider keys (BYOK)

For direct provider access — bypassing the gateway — you can set individual provider API keys. This is useful when you want to use your own provider accounts, need a specific provider feature not available through the gateway, or are running locally.

Set a key

polpo byok set anthropic
You’ll be prompted for the key. Or pass it directly:
polpo byok set anthropic --key sk-ant-...
Keys are encrypted immediately with AES-256-GCM. They are never stored in plaintext and never returned from the API.
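The encryption scheme can be illustrated with a short sketch using Python's `cryptography` package. This is not Polpo's actual implementation — the nonce layout and key handling here are illustrative choices — but it shows what AES-256-GCM sealing of a provider key looks like:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_key(plaintext_key: str, master_key: bytes) -> bytes:
    """Seal a provider key with AES-256-GCM; prepend the random nonce."""
    nonce = os.urandom(12)  # 96-bit nonce, the standard size for GCM
    sealed = AESGCM(master_key).encrypt(nonce, plaintext_key.encode(), None)
    return nonce + sealed  # ciphertext includes a 16-byte auth tag

def decrypt_key(blob: bytes, master_key: bytes) -> str:
    """Open a sealed key; raises if the blob was tampered with."""
    nonce, sealed = blob[:12], blob[12:]
    return AESGCM(master_key).decrypt(nonce, sealed, None).decode()

master = AESGCM.generate_key(bit_length=256)  # 32-byte master key
blob = encrypt_key("sk-ant-example", master)
```

Because GCM is authenticated, any modification of the stored blob makes decryption fail rather than return garbage.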

Supported providers

Polpo supports 20+ LLM providers natively, including Anthropic, OpenAI, Google, xAI, OpenRouter, Groq, Mistral, Cerebras, MiniMax, Amazon Bedrock, Azure OpenAI, Google Vertex, and Hugging Face. See the full list with slugs and model examples in the LLM & Providers reference.

Local development

For local development, set provider keys as environment variables:
export ANTHROPIC_API_KEY=sk-ant-...
polpo start
Or configure a local gateway (e.g. Ollama for fully offline usage):
{
  "settings": {
    "gateway": {
      "url": "http://localhost:11434/v1"
    }
  }
}

Deploy syncs keys

When you run polpo deploy, local environment keys are detected and pushed to the cloud automatically:
$ polpo deploy

Detected: ANTHROPIC_API_KEY, XAI_API_KEY
Push LLM keys to cloud? (y/n): y

How it works at runtime

With gateway (default)

  1. Agent calls the LLM with its configured model (e.g. anthropic/claude-sonnet-4-5)
  2. The gateway routes the request to the correct provider
  3. Streaming response is returned to the agent
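Since the gateway speaks the OpenAI-compatible chat completions format, step 1 amounts to building a request like the one sketched below. The payload shape is the standard OpenAI format and the model slug is the one from the example above; the prompt text and gateway URL are placeholders:

```python
import json

GATEWAY_URL = "https://ai-gateway.vercel.sh/v1"  # any OpenAI-compatible gateway

# Step 1: the agent's call becomes an OpenAI-style chat completions request.
payload = {
    "model": "anthropic/claude-sonnet-4-5",  # provider-prefixed slug; the gateway routes on it
    "messages": [{"role": "user", "content": "Summarize today's tickets."}],
    "stream": True,  # step 3: the response streams back to the agent
}

# The request body would be POSTed to f"{GATEWAY_URL}/chat/completions"
# with an "Authorization: Bearer <gateway key>" header.
body = json.dumps(payload)
```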

With provider keys (BYOK)

  1. Agent calls the LLM with its configured model
  2. Polpo decrypts the provider key and injects it into the sandbox
  3. The LLM call uses your key directly — no proxy
  4. When the sandbox is destroyed, the decrypted key is discarded
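The BYOK steps above can be sketched as a key-injection lifecycle. This is a simplified model, not Polpo's sandbox code — the env-var name follows the provider convention, and the already-decrypted key value is made up:

```python
import os
import subprocess
import sys

def run_in_sandbox(cmd: list[str], provider_keys: dict[str, str]) -> str:
    """Run a command with decrypted provider keys injected into its env.

    The keys exist only in the child's environment; when the call returns
    (the sandbox is destroyed), the decrypted copies go out of scope and
    are discarded. The parent process never exports them.
    """
    env = {**os.environ, **provider_keys}
    result = subprocess.run(cmd, env=env, capture_output=True, text=True)
    return result.stdout

out = run_in_sandbox(
    [sys.executable, "-c", "import os; print(os.environ['ANTHROPIC_API_KEY'])"],
    {"ANTHROPIC_API_KEY": "sk-ant-decrypted"},  # illustrative decrypted key
)
```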