Mistral AI Python SDK
Version 1.12.4 · verified May 12 · install: stale · quickstart: stale
The official Python SDK for the Mistral AI API. It has gone through two major breaking rewrites: v0 → v1 (MistralClient removed) and v1 → v2 (in pre-release on GitHub, not yet on PyPI). Most LLM-generated code still references v0 patterns, which no longer work.
pip install mistralai

Common errors
error ModuleNotFoundError: No module named 'mistralai.models.chat_completion' ↓
cause The module path mistralai.models.chat_completion existed only in v0 (0.4.x and earlier). The v1 rewrite removed it, along with the ChatMessage class it contained, so any v0-era import fails on a current install.
fix Upgrade with pip install --upgrade mistralai, then drop the ChatMessage import entirely: v1 accepts plain dicts such as {'role': 'user', 'content': '...'} as messages.

error AttributeError: 'MistralClient' object has no attribute 'chat_stream' ↓
cause In v1, MistralClient itself was replaced by the Mistral class, and the flat client.chat() / client.chat_stream() methods moved under the chat attribute.
fix Instantiate Mistral and call client.chat.stream(...) for streaming or client.chat.complete(...) for non-streaming responses.

error TypeError: 'type' object is not subscriptable ↓
cause This error often occurs when running the `mistralai` library with older Python versions (specifically Python 3.8 or below) where type hints like `type[CustomPydanticModel]` used within the library's internal `extra` modules are not supported. This indicates a compatibility issue with the Python version and the library's type hinting syntax.
fix Upgrade to Python 3.9 or newer. If you must stay on 3.8, pin an older mistralai release that still supports it.

error TypeError: 'async for' requires an object with __aiter__ method, got coroutine ↓
cause async for needs an async iterator (an object with __aiter__), but the SDK's async streaming helpers such as client.chat.stream_async() are coroutines that must be awaited first; iterating the bare coroutine raises this error.
fix Await the call to obtain the iterable, then loop: stream = await client.chat.stream_async(...); async for chunk in stream: .... Check the SDK docs for the exact streaming entry point in your version.

error AttributeError: 'ChatMessage' object has no attribute 'model_dump' ↓
cause This error signifies that the `ChatMessage` object, likely from an older or incompatible version of the `mistralai` models, does not have the `model_dump` method. `model_dump` is a Pydantic V2 feature, and if `ChatMessage` is based on Pydantic V1 or a different serialization method, this attribute will be missing.
fix Upgrade both packages: pip install --upgrade mistralai pydantic. If the error persists, the ChatMessage in your environment is Pydantic V1-based and serializes via .dict() / .json() instead of model_dump(). Better still, drop ChatMessage and pass messages as plain dicts, which is what v1 expects.

Warnings
breaking MistralClient and MistralAsyncClient are removed. Use Mistral class for both sync and async. ↓
fix Replace MistralClient(api_key=...) with Mistral(api_key=...). Same constructor signature.
breaking client.chat() call signature removed. Now client.chat.complete() for sync, client.chat.complete_async() for async. ↓
fix Change client.chat(model=m, messages=msgs) to client.chat.complete(model=m, messages=msgs)
breaking Streaming chunk structure changed. chunk.choices[0] no longer works — must use chunk.data.choices[0]. ↓
fix Add .data between chunk and .choices in all streaming handlers
breaking ChatMessage class removed from mistralai.models.chat_completion. Use plain dicts or new typed message classes. ↓
fix Replace ChatMessage(role='user', content='...') with {'role': 'user', 'content': '...'}
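Message construction after the ChatMessage removal is plain Python; a sketch (roles and keys follow the v1 message schema):

```python
# No ChatMessage import: v1 takes lists of plain dicts.
messages = [
    {'role': 'system', 'content': 'You are a terse assistant.'},
    {'role': 'user', 'content': 'Hello'},
]

# Extending the conversation is ordinary list manipulation.
messages.append({'role': 'assistant', 'content': 'Hi.'})
messages.append({'role': 'user', 'content': 'Translate "bonjour".'})

# Every entry carries at least 'role' and 'content'.
assert all({'role', 'content'} <= m.keys() for m in messages)
```

v1 also ships typed message classes as an alternative, but plain dicts are the least version-sensitive form.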
breaking tool_call_id is now mandatory in messages with role 'tool'. Omitting it causes API error. ↓
fix Always include tool_call_id in tool response messages
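A sketch of a tool-result message; the id value here is a placeholder, since in real code it comes from the assistant message's tool_calls entry:

```python
import json

# Placeholder for the id the API returned on the assistant's tool call,
# i.e. response.choices[0].message.tool_calls[0].id in real code.
call_id = 'tc_placeholder'

tool_message = {
    'role': 'tool',
    'name': 'get_weather',                      # illustrative tool name
    'content': json.dumps({'temperature_c': 21}),
    'tool_call_id': call_id,                    # mandatory for role 'tool'
}

assert tool_message['tool_call_id']             # omitting it -> API error
```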
gotcha pip install mistralai installs v1 (stable). GitHub main branch shows v2 docs which are NOT on PyPI yet. Don't follow GitHub README — it documents unreleased v2. ↓
fix Use PyPI docs at pypi.org/project/mistralai for v1 reference. GitHub main = v2 pre-release.
gotcha Agents API requires Python 3.10+ and mistralai[agents] extra. Base install does not include it. ↓
fix pip install 'mistralai[agents]' and ensure Python 3.10+
gotcha Azure and GCP deployments use separate packages, mistralai-azure and mistralai-gcp, published alongside the base SDK — a plain mistralai install does not include those clients. ↓
fix pip install mistralai-azure or pip install mistralai-gcp, then use the provider-specific client (e.g. MistralAzure) as described in Mistral's deployment docs.
Install

uv add mistralai
pip install 'mistralai[agents]'
pip install mistralai-azure
pip install mistralai-gcp

Install compatibility — stale, last tested: 2026-05-12
python os / libc variant status wheel install import disk
3.9 alpine (musl) agents - - 2.75s 136.1M
3.9 alpine (musl) mistralai - - 2.98s 109.9M
3.9 alpine (musl) mistralai-azure - - - -
3.9 alpine (musl) mistralai-gcp - - - -
3.9 slim (glibc) agents - - 2.44s 207M
3.9 slim (glibc) mistralai - - 2.59s 182M
3.9 slim (glibc) mistralai-azure - - - -
3.9 slim (glibc) mistralai-gcp - - - -
3.10 alpine (musl) agents - - - -
3.10 alpine (musl) mistralai - - - -
3.10 alpine (musl) mistralai-azure - - - -
3.10 alpine (musl) mistralai-gcp - - - -
3.10 slim (glibc) agents - - - -
3.10 slim (glibc) mistralai - - - -
3.10 slim (glibc) mistralai-azure - - - -
3.10 slim (glibc) mistralai-gcp - - - -
3.11 alpine (musl) agents - - - -
3.11 alpine (musl) mistralai - - - -
3.11 alpine (musl) mistralai-azure - - - -
3.11 alpine (musl) mistralai-gcp - - - -
3.11 slim (glibc) agents - - - -
3.11 slim (glibc) mistralai - - - -
3.11 slim (glibc) mistralai-azure - - - -
3.11 slim (glibc) mistralai-gcp - - - -
3.12 alpine (musl) agents - - - -
3.12 alpine (musl) mistralai - - - -
3.12 alpine (musl) mistralai-azure - - - -
3.12 alpine (musl) mistralai-gcp - - - -
3.12 slim (glibc) agents - - - -
3.12 slim (glibc) mistralai - - - -
3.12 slim (glibc) mistralai-azure - - - -
3.12 slim (glibc) mistralai-gcp - - - -
3.13 alpine (musl) agents - - - -
3.13 alpine (musl) mistralai - - - -
3.13 alpine (musl) mistralai-azure - - - -
3.13 alpine (musl) mistralai-gcp - - - -
3.13 slim (glibc) agents - - - -
3.13 slim (glibc) mistralai - - - -
3.13 slim (glibc) mistralai-azure - - - -
3.13 slim (glibc) mistralai-gcp - - - -
Imports
- Mistral
  wrong   from mistralai.client import MistralClient
  correct from mistralai import Mistral
- chat completion
  wrong   client.chat(model=..., messages=[...])
  correct client.chat.complete(model=..., messages=[...])
- streaming chunk
  wrong   chunk.choices[0].delta.content
  correct chunk.data.choices[0].delta.content
- ChatMessage
  wrong   from mistralai.models.chat_completion import ChatMessage
  correct {'role': 'user', 'content': '...'}
Quickstart — stale, last tested: 2026-05-12
import os
from mistralai import Mistral
client = Mistral(api_key=os.environ['MISTRAL_API_KEY'])
response = client.chat.complete(
model='mistral-large-latest',
messages=[{'role': 'user', 'content': 'Hello'}]
)
print(response.choices[0].message.content)