Connecting OpenClaw to OpenAI’s API unlocks access to industry-leading models like GPT-4o and GPT-5, bringing powerful reasoning and code generation capabilities to your workflow.
- Direct API access avoids intermediary latency and gives you OpenAI's full model lineup as it evolves.
- First-time setup most often trips developers up on two points: authentication methods and model naming conventions.
- Once connected, agent requests route through OpenAI's infrastructure, with usage tracking and billing visible in your OpenAI account.
The fastest way to get started involves running the interactive onboarding wizard, which handles credential storage and model selection in one step.

```shell
openclaw onboard --auth-choice openai-api-key
```
For automated deployments or CI pipelines, skip the interactive prompts by passing your key directly:
```shell
openclaw onboard --openai-api-key "$OPENAI_API_KEY"
```
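In a pipeline it can help to fail fast when the secret was never injected, rather than letting onboarding fail later with a confusing error. A minimal Python sketch of such a pre-flight check (the `require_openai_key` helper and its `sk-` prefix test are our own heuristics, not part of OpenClaw or a guaranteed key format):

```python
import os


def require_openai_key() -> str:
    """Fail fast in CI when the API key is absent or obviously malformed.

    The "sk-" prefix check is a heuristic: OpenAI keys conventionally
    start with "sk-", but this is not a contractual format guarantee.
    """
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key.startswith("sk-"):
        raise RuntimeError(
            "OPENAI_API_KEY is missing or malformed; refusing to onboard"
        )
    return key
```

Run this guard as the first step of the job, before invoking `openclaw onboard`, so a missing secret fails the build immediately.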
Manual configuration through `openclaw.json` gives you granular control over model selection and environment variables:
```json
{
  "env": { "OPENAI_API_KEY": "sk-..." },
  "agents": {
    "defaults": {
      "model": { "primary": "openai/gpt-5.4" }
    }
  }
}
```
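Since a typo in the `provider/model` string only surfaces at request time, it can be worth sanity-checking the config up front. A small Python sketch (the `validate_model_ref` helper is hypothetical, not an OpenClaw utility):

```python
import json


def validate_model_ref(config: dict) -> str:
    """Check that the default model reference uses the provider/model form.

    Returns the provider prefix so callers can confirm routing.
    """
    model = config["agents"]["defaults"]["model"]["primary"]
    provider, sep, name = model.partition("/")
    if not sep or not provider or not name:
        raise ValueError(f"model reference {model!r} is not in provider/model form")
    return provider


# Parse the same shape of config shown above (placeholder key, not a real one).
config = json.loads("""
{
  "env": { "OPENAI_API_KEY": "sk-placeholder" },
  "agents": { "defaults": { "model": { "primary": "openai/gpt-5.4" } } }
}
""")
print(validate_model_ref(config))  # → openai
```

The same check works for any provider prefix, which matters for the Codex variant below.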
When using the Codex-specific provider, prefix model references with `openai-codex/` so requests route correctly:
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "openai-codex/gpt-5.4" }
    }
  }
}
```
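The provider prefix is what selects the route, which is why `openai/` and `openai-codex/` configs behave differently even with the same model name. A toy Python sketch of prefix-based dispatch (the `ROUTES` table and `route` helper are illustrative only, not OpenClaw internals):

```python
# Illustrative routing table: provider prefix -> human-readable destination.
ROUTES = {
    "openai": "standard OpenAI provider",
    "openai-codex": "Codex-specific provider",
}


def route(model_ref: str) -> str:
    """Dispatch a provider/model reference by its prefix."""
    provider = model_ref.split("/", 1)[0]
    if provider not in ROUTES:
        raise KeyError(f"unknown provider prefix: {provider!r}")
    return ROUTES[provider]


print(route("openai-codex/gpt-5.4"))  # → Codex-specific provider
```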
OpenAI integration positions OpenClaw at the forefront of commercially available language models, with automatic access to improvements as they roll out.
