Agents need access to an LLM to work. Configure your provider API key once — Polpo encrypts it and injects it at runtime.

Set a key

polpo byok set anthropic
You’ll be prompted for the key. Or pass it directly:
polpo byok set anthropic --key sk-ant-...
The key is encrypted immediately with AES-256-GCM. It’s never stored in plaintext and never returned from the API.

Supported providers

Polpo supports 20+ LLM providers natively, including Anthropic, OpenAI, Google, xAI, OpenRouter, Groq, Mistral, Cerebras, MiniMax, Amazon Bedrock, Azure OpenAI, Google Vertex, and Hugging Face. See the full list with slugs and model examples in the LLM & Providers reference.

LLM Gateways

Polpo works with any OpenAI-compatible gateway. Configure the gateway as a custom provider in .polpo/polpo.json.

LiteLLM

{
  "providers": {
    "litellm": {
      "baseUrl": "https://my-litellm-proxy.com/v1",
      "api": "openai-completions"
    }
  }
}
"model": "litellm:gpt-4o"

Vercel AI Gateway

{
  "providers": {
    "vercel": {
      "baseUrl": "https://gateway.ai.vercel.app/v1",
      "api": "openai-completions"
    }
  }
}
"model": "vercel:anthropic/claude-sonnet-4-5"
The model ID is passed as-is to the gateway — no need to define every model in the config. See LLM & Providers reference for the full custom provider configuration.
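Putting the two pieces together, a minimal .polpo/polpo.json might look like this (a sketch assuming the model field sits at the top level alongside providers; check the LLM & Providers reference for the exact schema):

```json
{
  "providers": {
    "litellm": {
      "baseUrl": "https://my-litellm-proxy.com/v1",
      "api": "openai-completions"
    }
  },
  "model": "litellm:gpt-4o"
}
```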

Local development

For local development, set the key as an environment variable instead:
export ANTHROPIC_API_KEY=sk-ant-...
polpo start
Or add it to .polpo/.env:
ANTHROPIC_API_KEY=sk-ant-...
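The .env file uses plain KEY=value lines, so you can also source it into a shell session for quick checks (a sketch; the key value below is a placeholder, not a real credential):

```shell
# Create a sample .polpo/.env with a placeholder key
mkdir -p .polpo
printf 'ANTHROPIC_API_KEY=sk-ant-placeholder\n' > .polpo/.env

# Source it with allexport on, so child processes inherit the key
set -a
. ./.polpo/.env
set +a

echo "${ANTHROPIC_API_KEY:+key loaded}"
```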

Deploy syncs keys

When you run polpo deploy, local environment keys are detected and pushed to the cloud automatically:
$ polpo deploy

Detected: ANTHROPIC_API_KEY, XAI_API_KEY
Push LLM keys to cloud? (y/n): y
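Before deploying, you can preview which keys are exported in your current shell (this assumes detection looks for the *_API_KEY naming shown above; the key value is a placeholder):

```shell
# Export a placeholder key, then list provider-style keys in the environment
export ANTHROPIC_API_KEY=sk-ant-placeholder
env | grep '_API_KEY='
```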

How it works at runtime

  1. Agent calls the LLM with its configured model
  2. Polpo decrypts the provider key and injects it into the sandbox
  3. The LLM call uses your key directly — no proxy
  4. When the sandbox is destroyed, the decrypted key is discarded
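The per-sandbox lifetime in steps 2–4 behaves like an environment variable scoped to a single process: visible inside, gone once the process exits. A shell analogy (not Polpo's actual mechanism; `sh -c` stands in for the sandbox):

```shell
# Key injected only into the child process's environment
ANTHROPIC_API_KEY=sk-ant-placeholder sh -c 'echo "inside: ${ANTHROPIC_API_KEY:+present}"'

# Back in the parent shell, the variable was never set
echo "outside: ${ANTHROPIC_API_KEY:-absent}"
```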