Accessing multiple AI providers through a single unified API simplifies your Openclaw configuration while expanding model options.
- One API key unlocks dozens of models from Anthropic, OpenAI, Google, and more without separate accounts.
- Developers often struggle with provider-specific configurations when switching between models.
- This guide walks through a streamlined setup that routes all inference through OpenRouter’s aggregation layer, with automatic fallback handling.
OpenRouter acts as a universal gateway for AI models. Instead of managing multiple API keys and endpoints, you configure once and access everything.

Start by obtaining your API key from the OpenRouter dashboard. The key follows the format sk-or-…
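Before configuring anything, it can help to confirm the key you exported actually carries the expected prefix. A minimal sketch, using a hypothetical placeholder value (a real key comes from the dashboard):

```shell
# Hypothetical placeholder key, for illustration only
OPENROUTER_API_KEY="sk-or-v1-xxxxxxxx"

# Sanity-check the sk-or- prefix before passing the key to openclaw
case "$OPENROUTER_API_KEY" in
  sk-or-*) echo "key prefix OK" ;;
  *)       echo "unexpected key prefix" >&2 ;;
esac
```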
Step 1: Configure via CLI
Run the onboarding wizard with the OpenRouter provider option:
openclaw onboard --auth-choice apiKey --token-provider openrouter --token "$OPENROUTER_API_KEY"
Step 2: Manual Configuration
For persistent settings, edit your ~/.openclaw/openclaw.json:
{
  "env": { "OPENROUTER_API_KEY": "sk-or-..." },
  "agents": {
    "defaults": {
      "model": { "primary": "openrouter/anthropic/claude-sonnet-4-5" }
    }
  }
}
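A stray comma or quote in a hand-edited config is an easy way to break the provider setup, so it is worth validating the JSON after editing. A sketch, writing a sample copy of the config above to a temp path for illustration (substitute your real ~/.openclaw/openclaw.json):

```shell
# Sample config mirroring the snippet above, written to a temp file
cat > /tmp/openclaw-sample.json <<'EOF'
{
  "env": { "OPENROUTER_API_KEY": "sk-or-..." },
  "agents": {
    "defaults": {
      "model": { "primary": "openrouter/anthropic/claude-sonnet-4-5" }
    }
  }
}
EOF

# json.tool exits non-zero on any syntax error, so "config OK" only
# prints when the file parses cleanly
python3 -m json.tool /tmp/openclaw-sample.json > /dev/null && echo "config OK"
```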
Available Models
OpenRouter supports numerous models; IDs combine the openrouter/ prefix with OpenRouter’s own provider/model slug, for example:
- openrouter/anthropic/claude-sonnet-4-5
- openrouter/openai/gpt-4o
- openrouter/google/gemini-pro
- openrouter/meta/llama-3.3-70b
Troubleshooting & Best Practices
- Always prefix model IDs with openrouter/ to ensure proper routing.
- Monitor your OpenRouter dashboard for usage and rate limits.
- Use environment variables for API keys in production environments.
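The prefix rule above can be enforced mechanically. A small illustrative helper (not part of openclaw) that prepends openrouter/ only when it is missing:

```shell
# Hypothetical helper: ensure a model ID carries the openrouter/ prefix
# before it is written into openclaw.json
normalize_model_id() {
  case "$1" in
    openrouter/*) printf '%s\n' "$1" ;;            # already prefixed: pass through
    *)            printf 'openrouter/%s\n' "$1" ;; # bare slug: add the prefix
  esac
}

normalize_model_id "anthropic/claude-sonnet-4-5"   # → openrouter/anthropic/claude-sonnet-4-5
normalize_model_id "openrouter/openai/gpt-4o"      # → openrouter/openai/gpt-4o (unchanged)
```

Because the function is idempotent, it is safe to run over IDs that may or may not already be prefixed.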
OpenRouter integration centralizes your model access, reducing configuration overhead while maintaining flexibility to switch between providers instantly.
