Mistral AI API
The Mistral AI API serves Mistral Large, Mistral Small, Mistral NeMo, and Codestral: European open-source and hybrid LLMs, often billed as the European answer to OpenAI.
About this API
Mistral AI is a French LLM company founded in 2023 (its founding team came from Meta and DeepMind) that reached unicorn status within three months. Hailed as the "European OpenAI," it is actively supported by EU governments as a pillar of Europe's AI-sovereignty strategy. Product line: (1) Mistral 7B (open-sourced in 2023, once the best small open model); (2) Mixtral 8x7B and Mixtral 8x22B (mixture-of-experts open models with performance near GPT-3.5); (3) Mistral Large (flagship closed model, quality approaching but below GPT-4); (4) Codestral (code-generation specialist); (5) Mistral NeMo (open 12B model built in partnership with NVIDIA). The API is practical and modern: function calling, JSON mode, and streaming are all supported. European enterprises favor Mistral because its datacenters are in the EU (GDPR-friendly). It sees less use from Chinese customers; its market is mainly European.
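The JSON-mode and streaming options mentioned above can be sketched as request-body flags. This is a minimal sketch assuming the OpenAI-style field names (`stream`, `response_format`) that Mistral's chat API advertises; verify the exact fields in the official docs.

```python
def chat_payload(messages, model="mistral-large-latest",
                 stream=False, json_mode=False):
    """Assemble a chat-completions request body.

    `stream` and `response_format` follow the OpenAI-style
    conventions; confirm field names against Mistral's docs.
    """
    body = {"model": model, "messages": messages, "stream": stream}
    if json_mode:
        # Ask the model to return a valid JSON object
        body["response_format"] = {"type": "json_object"}
    return body
```

Setting `stream=True` makes the server return incremental chunks instead of one final response; `json_mode` constrains output to a parseable JSON object.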
What you can build
- European compliance (data stays in the EU)
- Function calling and code generation
- Mid-size model deployment on your own GPUs (Mistral 7B / Mixtral 8x7B are fully open-source)
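For the function-calling use case above, a request attaches tool definitions alongside the messages. A minimal sketch assuming Mistral's OpenAI-compatible `tools` format; the `get_weather` tool is a hypothetical example, not part of the API.

```python
def with_tool(messages, model="mistral-large-latest"):
    """Build a chat request carrying one tool definition in the
    OpenAI-compatible `tools` format used for function calling."""
    tool = {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example tool
            "description": "Look up current weather for a city.",
            "parameters": {  # JSON Schema for the tool's arguments
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
    return {"model": model, "messages": messages, "tools": [tool]}
```

When the model decides to call the tool, the response contains a tool-call object with JSON arguments matching the schema; your code executes the function and sends the result back as a follow-up message.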
Strengths & limitations
Strengths
- Several models fully open-source (Mistral 7B, Mixtral 8x7B, Mistral NeMo)
- European datacenters (GDPR-friendly)
- Mid-range pricing (Mistral Large: $2 input / $6 output per million tokens)
Limitations
- Strongest model, Mistral Large, still lags GPT-4 / Claude 3.5
- API ecosystem weaker than OpenAI
Example request
curl https://api.mistral.ai/v1/chat/completions \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "mistral-large-latest", "messages": [{"role": "user", "content": "Hello"}]}'
# Some providers use X-Api-Key instead; verify in the docs.
Getting started
Sign up at console.mistral.ai for an API key, then POST to https://api.mistral.ai/v1/chat/completions with model: "mistral-large-latest" and a messages array.
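The getting-started steps above can be sketched in Python using only the standard library. A minimal sketch: the endpoint and model name come from this page, and the request assumes your key is in the MISTRAL_API_KEY environment variable.

```python
import json
import os
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_chat_request(prompt, model="mistral-large-latest"):
    """Build headers and body for a Mistral chat completion."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('MISTRAL_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body

# To actually send (requires a valid key and network access):
# headers, body = build_chat_request("Hello")
# req = urllib.request.Request(API_URL, data=json.dumps(body).encode(),
#                              headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same payload works with the official python SDK or any HTTP client; only the transport changes.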
FAQ
How do I use the open-source models?
Mistral 7B and the other open models run locally with ollama or Hugging Face transformers. Mistral Large is closed-source and API-only.
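For the local route above, ollama exposes an HTTP API on localhost once a model is pulled (e.g. `ollama pull mistral`). A minimal sketch assuming ollama's default port and `/api/chat` endpoint; verify against the ollama docs for your version.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # ollama's default local endpoint

def local_chat_request(prompt, model="mistral"):
    """Build the request body for a locally served Mistral 7B
    via ollama; no API key is needed for local inference."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return one complete response
    }

# To actually send (requires ollama running locally):
# req = urllib.request.Request(
#     OLLAMA_URL, data=json.dumps(local_chat_request("Hello")).encode(),
#     headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["message"]["content"])
```

Since the request shape mirrors the hosted chat API, switching between local and hosted inference is mostly a matter of changing the URL and auth header.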
How good is Chinese-language support?
Chinese output quality is average (it lags domestic Chinese models like Qwen). For multilingual tasks, consider Cohere or GPT-4 instead.
Technical details
- Auth type
- api_key
- Pricing
- paid
- Rate limit
- free trial: 1 RPM; paid tiers raise this to 100+ RPM
- Protocols
- REST
- SDKs
- python, typescript, javascript
- Response time
- 609 ms
- Last health check
- 5/12/2026, 7:37:53 AM