LangChain MistralAI Integration
langchain-mistralai is an integration package connecting Mistral AI's models with the LangChain framework. It provides classes for interacting with Mistral chat models (`ChatMistralAI`) and embedding models (`MistralAIEmbeddings`). The library is actively maintained as part of the broader LangChain ecosystem and, at the time of writing, is at version 1.1.2.
Warnings
- gotcha API Key Management: Always ensure your `MISTRAL_API_KEY` is correctly set as an environment variable or passed explicitly to the model constructor. Forgetting to set it or providing an invalid key will result in authentication errors.
- breaking LangChain Modularization and Import Paths: LangChain underwent significant architectural changes, particularly with versions 0.1.0 and 1.0. Older tutorials or code snippets might use deprecated import paths (e.g., `from langchain.llms import MistralAI`). Always use the dedicated integration package for imports, like `from langchain_mistralai import ChatMistralAI`.
- gotcha Mistral API 400 Errors for Chat/Embeddings: An `httpx.HTTPStatusError: Error response 400` can indicate several issues, including invalid request parameters, rate limiting, or malformed input. For chat models, ensure that messages alternate between 'human' and 'assistant', that the conversation does not end with an 'assistant' or 'system' message, and that assistant messages do not carry both content and tool_calls.
- gotcha Silent Dropping of Citation Metadata in ChatMistralAI: When using Mistral's native citation feature (e.g., `citations=True` in the raw API), `ChatMistralAI` might currently treat the response content as a plain string, potentially dropping detailed citation metadata (references, source mapping).
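The message-ordering constraints behind the 400 errors above can be checked before calling the API. The sketch below is a hypothetical pre-flight helper (not part of langchain-mistralai), operating on simplified role dicts rather than LangChain message objects:

```python
# Hypothetical pre-flight check mirroring Mistral's chat-message constraints.
# Messages are simplified dicts: {"role": ..., "content": ..., "tool_calls": [...]}.

def validate_mistral_messages(messages):
    """Return a list of constraint violations (empty list means OK)."""
    problems = []
    if not messages:
        return ["message list is empty"]
    # The conversation must not end on an 'assistant' or 'system' turn.
    if messages[-1]["role"] in ("assistant", "system"):
        problems.append("conversation ends with %r" % messages[-1]["role"])
    prev_role = None
    for i, msg in enumerate(messages):
        role = msg["role"]
        # Human and assistant turns must alternate (no two in a row).
        if role == prev_role and role in ("human", "assistant"):
            problems.append("messages %d and %d both have role %r" % (i - 1, i, role))
        # An assistant turn may carry content OR tool_calls, not both.
        if role == "assistant" and msg.get("content") and msg.get("tool_calls"):
            problems.append("assistant message %d has both content and tool_calls" % i)
        prev_role = role
    return problems

# Demo: this history breaks two rules (ends with assistant; content + tool_calls).
print(validate_mistral_messages([
    {"role": "human", "content": "Hi"},
    {"role": "assistant", "content": "Hello", "tool_calls": [{"name": "f"}]},
]))
```

Running such a check before `chat.invoke(...)` turns an opaque HTTP 400 into an actionable error message.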
Install
pip install langchain-mistralai
Imports
- ChatMistralAI
from langchain_mistralai import ChatMistralAI
- MistralAIEmbeddings
from langchain_mistralai import MistralAIEmbeddings
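`MistralAIEmbeddings` follows the standard LangChain Embeddings interface (`embed_query`, `embed_documents`). The downstream comparison step can be sketched without calling the API, using stand-in vectors in plain Python (cosine similarity is what most vector stores compute; the 4-dimensional vectors here are illustrative, whereas the mistral-embed model returns 1024-dimensional vectors):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stand-in vectors, as if returned by embed_query / embed_documents.
query_vec = [0.1, 0.3, 0.5, 0.1]
doc_vecs = {
    "doc_a": [0.1, 0.3, 0.5, 0.1],  # same direction as the query
    "doc_b": [0.9, 0.1, 0.0, 0.0],
}
best = max(doc_vecs, key=lambda k: cosine_similarity(query_vec, doc_vecs[k]))
print(best)  # doc_a
```

With real embeddings, `query_vec` would come from `embeddings.embed_query(...)` and `doc_vecs` from `embeddings.embed_documents(...)`.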
Quickstart
import os
from langchain_mistralai import ChatMistralAI
from langchain_core.messages import HumanMessage, SystemMessage

# Ensure MISTRAL_API_KEY is set in your environment, e.g.:
#   os.environ["MISTRAL_API_KEY"] = "your_mistral_api_key"
# The empty-string default keeps the snippet importable in CI/CD environments;
# in production, ensure the key is properly set.
mistral_api_key = os.environ.get("MISTRAL_API_KEY", "")

if not mistral_api_key:
    print("Warning: MISTRAL_API_KEY environment variable not set. Please set it to run the example.")
else:
    chat = ChatMistralAI(
        model="mistral-large-latest",
        temperature=0,
        mistral_api_key=mistral_api_key,  # pass the key directly or rely on the env var
    )
    messages = [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="I love programming."),
    ]
    try:
        response = chat.invoke(messages)
        print(response.content)
    except Exception as e:
        print(f"An error occurred: {e}")
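Hosted APIs intermittently return rate-limit and transient server errors, so wrapping `chat.invoke` in a retry loop is a common pattern. Below is a minimal exponential-backoff sketch (a hypothetical helper, demonstrated with a stub in place of a live `ChatMistralAI` call):

```python
import time

def invoke_with_backoff(invoke, messages, retries=3, base_delay=1.0):
    """Call invoke(messages), retrying with exponential backoff on errors."""
    for attempt in range(retries):
        try:
            return invoke(messages)
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries; surface the original error
            time.sleep(base_delay * (2 ** attempt))

# Demo with a stub that fails twice (as a rate limit would), then succeeds.
calls = {"n": 0}
def flaky_invoke(messages):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("HTTP 429: rate limited")
    return "Bonjour"

print(invoke_with_backoff(flaky_invoke, [], base_delay=0.01))  # Bonjour
```

With a real model, you would pass `chat.invoke` as the `invoke` argument; in production you would typically retry only on retryable status codes rather than every exception.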