Provider configuration lives in the providers section of your polpo.json file. This is where you set API keys, override base URLs, and register custom providers.
Basic Configuration
String Shorthand
The simplest form — just the API key:
{
"providers": {
"anthropic": "${ANTHROPIC_API_KEY}",
"openai": "sk-proj-..."
}
}
The ${ENV_VAR} syntax references an environment variable, so you don’t need to hardcode secrets in config files.
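The resolution of a `${ENV_VAR}` reference can be sketched as a small helper. This is an illustrative model of the behavior described above, not Polpo's actual implementation; the function name `resolveApiKey` is hypothetical.

```typescript
// Hypothetical sketch: resolve a "${ENV_VAR}" reference against an
// environment map, passing literal key values through unchanged.
function resolveApiKey(
  value: string,
  env: Record<string, string | undefined>,
): string {
  const match = value.match(/^\$\{([A-Za-z_][A-Za-z0-9_]*)\}$/);
  if (!match) return value; // direct value, use as-is
  const resolved = env[match[1]];
  if (resolved === undefined) {
    throw new Error(`Environment variable ${match[1]} is not set`);
  }
  return resolved;
}

// "${ANTHROPIC_API_KEY}" resolves from the environment at startup;
// a literal key like "sk-proj-..." is returned unchanged.
resolveApiKey("${ANTHROPIC_API_KEY}", { ANTHROPIC_API_KEY: "sk-ant-..." });
resolveApiKey("sk-proj-...", {}); // returns "sk-proj-..."
```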
Object Form
The full object form lets you set the API key, base URL, API compatibility mode, and custom model definitions:
{
"providers": {
"anthropic": {
"apiKey": "${ANTHROPIC_API_KEY}"
},
"openai": {
"apiKey": "${OPENAI_API_KEY}",
"baseUrl": "https://my-proxy.example.com/v1"
}
}
}
Configuration Reference
interface ProviderConfig {
/** API key (direct value or "${ENV_VAR}" reference). */
apiKey?: string;
/** Override base URL for the provider. */
baseUrl?: string;
/** API compatibility mode for custom endpoints. */
api?: "openai-completions" | "openai-responses" | "anthropic-messages";
/** Custom model definitions (for non-catalog providers). */
models?: CustomModelDef[];
}
apiKey
The API key to authenticate with the provider. Supports two formats:
- Direct value: "sk-ant-api03-..."
- Env reference: "${ANTHROPIC_API_KEY}" — resolved at startup from the environment
Resolution order:
1. providers.{id}.apiKey in polpo.json
2. Standard environment variable for the provider (see Providers)
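The two-step lookup above can be sketched as follows. This is an illustrative model under the stated precedence rules; `lookupApiKey` and the `STANDARD_ENV_VARS` map are hypothetical names, not Polpo internals.

```typescript
// Hypothetical sketch of key resolution: explicit config wins,
// then the provider's standard environment variable is consulted.
const STANDARD_ENV_VARS: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  groq: "GROQ_API_KEY",
};

function lookupApiKey(
  providerId: string,
  config: { apiKey?: string } | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  // 1. providers.{id}.apiKey from polpo.json takes precedence.
  if (config?.apiKey) return config.apiKey;
  // 2. Fall back to the provider's standard environment variable.
  const envVar = STANDARD_ENV_VARS[providerId];
  return envVar ? env[envVar] : undefined;
}
```

For example, `lookupApiKey("openai", undefined, { OPENAI_API_KEY: "sk-..." })` falls through to the environment, while a configured `apiKey` would shadow it.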
baseUrl
Override the default API endpoint. Use this for:
- Proxies — route through a corporate proxy or LiteLLM
- Self-hosted — point to Ollama, vLLM, or LM Studio
- Regional endpoints — use a specific region’s API endpoint
{
"providers": {
"openai": {
"apiKey": "${OPENAI_API_KEY}",
"baseUrl": "https://litellm.internal.company.com/v1"
}
}
}
api
API compatibility mode for custom endpoints. Required when the endpoint uses a different API format than the provider’s default:
| Value | Description | Use For |
|---|---|---|
| openai-completions | OpenAI Chat Completions API | Ollama, vLLM, LM Studio, LiteLLM, any OpenAI-compatible server |
| openai-responses | OpenAI Responses API | Newer OpenAI endpoints |
| anthropic-messages | Anthropic Messages API | Anthropic-compatible proxies |
{
"providers": {
"ollama": {
"baseUrl": "http://localhost:11434/v1",
"api": "openai-completions"
}
}
}
models
Custom model definitions for providers not in the built-in catalog. See Custom Providers for details.
Environment Variables
Every provider has a standard environment variable. If you only need to set API keys via env vars, you don’t need the providers section at all — Polpo reads them automatically.
# These are auto-detected, no polpo.json config needed
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
export GEMINI_API_KEY=AIza...
export GROQ_API_KEY=gsk_...
The providers section is only needed when you want to:
- Use ${ENV_VAR} references with non-standard variable names
- Set baseUrl for custom endpoints
- Register custom providers with api and models
- Override keys that differ from the standard env var
Proxy Configuration
LiteLLM
Route all providers through a LiteLLM proxy:
{
"providers": {
"anthropic": {
"apiKey": "${LITELLM_API_KEY}",
"baseUrl": "https://litellm.internal.company.com"
},
"openai": {
"apiKey": "${LITELLM_API_KEY}",
"baseUrl": "https://litellm.internal.company.com"
}
}
}
Corporate Proxy
For environments that require HTTP proxies, set the standard environment variables:
export HTTP_PROXY=http://proxy.company.com:8080
export HTTPS_PROXY=http://proxy.company.com:8080
export NO_PROXY=localhost,127.0.0.1
Validation
Polpo validates provider configuration at startup:
- API key check — warns when an agent references a model whose provider has no API key configured
- Base URL validation — checks that custom base URLs are reachable (when possible)
- Model catalog check — verifies that referenced models exist in the provider’s catalog (custom models are always valid)
Validation results are logged at startup:
[polpo] Provider validation:
anthropic: ANTHROPIC_API_KEY ✓
openai: OPENAI_API_KEY ✓
groq: GROQ_API_KEY ✗ (set GROQ_API_KEY or add to providers config)
Missing API keys emit warnings but don’t block startup; the error surfaces only when an agent actually tries to use that provider. Check the startup validation log to catch these issues early.
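The warn-don't-block behavior can be sketched like this. It is a minimal model of the check described above, assuming agents use `provider:model-id` strings as shown in the examples; `validateAgents` is a hypothetical name, not Polpo's actual API.

```typescript
// Hypothetical sketch: collect warnings for agents whose provider
// has no API key, instead of failing startup.
interface AgentDef {
  name: string;
  model: string; // "provider:model-id" as used elsewhere in this doc
}

function validateAgents(
  agents: AgentDef[],
  keys: Record<string, string | undefined>,
): string[] {
  const warnings: string[] = [];
  for (const agent of agents) {
    const provider = agent.model.split(":")[0];
    if (!keys[provider]) {
      warnings.push(`${provider}: missing API key (agent "${agent.name}")`);
    }
  }
  return warnings; // logged at startup, not thrown -- startup continues
}
```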
Complete Example
{
"providers": {
"anthropic": "${ANTHROPIC_API_KEY}",
"openai": {
"apiKey": "${OPENAI_API_KEY}"
},
"google": "${GEMINI_API_KEY}",
"groq": "${GROQ_API_KEY}",
"ollama": {
"baseUrl": "http://localhost:11434/v1",
"api": "openai-completions",
"models": [
{
"id": "qwen2.5-coder:32b",
"name": "Qwen 2.5 Coder 32B",
"reasoning": false,
"input": ["text"],
"contextWindow": 131072,
"maxTokens": 8192,
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 }
}
]
}
},
"agents": [
{ "name": "coder", "model": "anthropic:claude-sonnet-4-6" },
{ "name": "reviewer", "model": "openai:o3" },
{ "name": "fast-worker", "model": "groq:llama-3.3-70b-versatile" },
{ "name": "local", "model": "ollama:qwen2.5-coder:32b" }
],
"settings": {
"orchestratorModel": {
"primary": "anthropic:claude-sonnet-4-6",
"fallbacks": ["openai:gpt-4o", "google:gemini-2.5-pro"]
}
}
}