Adding Mistral models to your OpenClaw setup gives you access to efficient European LLMs. Here’s what you need to know:
- Mistral delivers strong performance with lower latency and competitive pricing compared to many frontier models.
- Developers frequently miss the environment-variable naming convention, which causes authentication to fail silently.
- You will learn how to configure your Mistral API key and set it as your primary model in OpenClaw.
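The naming pitfall in the list above is easy to guard against in your own shell wrappers. The sketch below is illustrative (OpenClaw does not ship this helper); it simply checks that the exact variable name is set and non-empty, so a typo like `MISTRAL_APIKEY` fails loudly instead of silently:

```shell
#!/bin/sh
# Illustrative guard: verify the exact variable name before launching.
# A misspelling such as MISTRAL_APIKEY would otherwise authenticate
# with an empty key and fail silently downstream.
check_mistral_key() {
  if [ -z "${MISTRAL_API_KEY:-}" ]; then
    echo "error: MISTRAL_API_KEY is not set" >&2
    return 1
  fi
  echo "MISTRAL_API_KEY is set"
}
```

Source this before calling `openclaw` in scripts to turn a silent failure into an immediate, descriptive error.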

Mistral AI offers a family of capable language models through a straightforward API. Their models are known for efficiency and solid reasoning capabilities.
Obtain your API key from the Mistral console and configure OpenClaw. The interactive wizard handles this smoothly:

```shell
openclaw onboard --auth-choice mistral-api-key
```
For scripting environments, use the non-interactive flag:

```shell
openclaw onboard --mistral-api-key "$MISTRAL_API_KEY"
```
Then set Mistral as your default provider in openclaw.json:

```json
{
  "env": { "MISTRAL_API_KEY": "sk-..." },
  "agents": {
    "defaults": {
      "model": { "primary": "mistral/mistral-large-latest" }
    }
  }
}
```
The base URL defaults to https://api.mistral.ai/v1. Restart your gateway, and Mistral models are ready for inference.
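Before restarting the gateway, a quick sanity check on openclaw.json can catch a misplaced key or model name. This standalone Python sketch is an assumption on my part (OpenClaw provides no such validator, and the file path is assumed to be openclaw.json in the working directory); it only reads the fields shown in the snippet above:

```python
import json

# Illustrative validator: confirm the fields used in the openclaw.json
# snippet above are present and plausibly shaped.
def check_config(path="openclaw.json"):
    with open(path) as f:
        cfg = json.load(f)
    key = cfg.get("env", {}).get("MISTRAL_API_KEY", "")
    model = (cfg.get("agents", {})
                .get("defaults", {})
                .get("model", {})
                .get("primary", ""))
    problems = []
    if not key:
        problems.append("env.MISTRAL_API_KEY is missing or empty")
    if not model.startswith("mistral/"):
        problems.append("agents.defaults.model.primary is not a mistral/ model")
    return problems

if __name__ == "__main__":
    for problem in check_config():
        print("config problem:", problem)
```

An empty result means the two fields this guide sets are in place; it does not prove the key itself is valid.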
