{"id":4602,"library":"langchain-mistralai","title":"LangChain MistralAI Integration","description":"langchain-mistralai is an integration package connecting Mistral AI's models with the LangChain framework. It provides classes for interacting with Mistral chat models (`ChatMistralAI`) and embedding models (`MistralAIEmbeddings`). The library is actively maintained as part of the broader LangChain ecosystem and is currently at version 1.1.2.","status":"active","version":"1.1.2","language":"en","source_language":"en","source_url":"https://github.com/langchain-ai/langchain","tags":["LLM","Mistral AI","LangChain","AI","integration","chat","embeddings","generative AI"],"install":[{"cmd":"pip install langchain-mistralai","lang":"bash","label":"Install latest version"}],"dependencies":[{"reason":"All LangChain integration packages depend on langchain-core for core functionality and abstractions.","package":"langchain-core","optional":false},{"reason":"This package is an integration layer and directly uses the official Mistral AI SDK.","package":"mistralai","optional":false}],"imports":[{"note":"Following LangChain's modularization (post v0.1.0 and v1.0), chat models are imported directly from their respective integration packages, not the top-level `langchain` package.","wrong":"from langchain.chat_models import ChatMistralAI","symbol":"ChatMistralAI","correct":"from langchain_mistralai import ChatMistralAI"},{"note":"Like chat models, embedding models are imported from their dedicated integration packages after LangChain's modular updates.","wrong":"from langchain.embeddings import MistralAIEmbeddings","symbol":"MistralAIEmbeddings","correct":"from langchain_mistralai import MistralAIEmbeddings"}],"quickstart":{"code":"import os\nfrom langchain_mistralai import ChatMistralAI\nfrom langchain_core.messages import HumanMessage, SystemMessage\n\n# Ensure MISTRAL_API_KEY is set in your environment, e.g.:\n# os.environ['MISTRAL_API_KEY'] = 'your_mistral_api_key'\n\n# Read the key explicitly so the example can warn instead of failing\n# with an authentication error when the key is missing (e.g. in CI/CD).\nmistral_api_key = os.environ.get('MISTRAL_API_KEY', '')\n\nif not mistral_api_key:\n    print(\"Warning: MISTRAL_API_KEY environment variable not set. Please set it to run the example.\")\nelse:\n    chat = ChatMistralAI(\n        model=\"mistral-large-latest\",\n        temperature=0,\n        mistral_api_key=mistral_api_key,  # Pass the key directly or rely on the env var\n    )\n\n    messages = [\n        SystemMessage(content=\"You are a helpful assistant that translates English to French.\"),\n        HumanMessage(content=\"I love programming.\"),\n    ]\n\n    try:\n        response = chat.invoke(messages)\n        print(response.content)\n    except Exception as e:\n        print(f\"An error occurred: {e}\")","lang":"python","description":"This quickstart demonstrates how to initialize `ChatMistralAI` and use it to invoke a chat completion. It emphasizes setting the `MISTRAL_API_KEY` environment variable for authentication, which is required for interacting with the Mistral API."},"warnings":[{"fix":"Set `os.environ['MISTRAL_API_KEY'] = 'your_key'` before initialization, or pass `mistral_api_key='your_key'` to the `ChatMistralAI` or `MistralAIEmbeddings` constructor.","message":"API Key Management: Always ensure your `MISTRAL_API_KEY` is correctly set as an environment variable or passed explicitly to the model constructor. Forgetting to set it or providing an invalid key will result in authentication errors.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Update import statements to reflect the modular structure of LangChain, importing directly from `langchain_mistralai` for Mistral-specific components. Refer to the official LangChain documentation for the correct import paths.","message":"LangChain Modularization and Import Paths: LangChain underwent significant architectural changes, particularly with versions 0.1.0 and 1.0. Older tutorials or code snippets may use deprecated import paths (e.g., `from langchain.llms import MistralAI`). Always import from the dedicated integration package, e.g. `from langchain_mistralai import ChatMistralAI`.","severity":"breaking","affected_versions":"Versions < 0.1.0 (for old `langchain` package imports), 0.1.0 onwards (for modular imports)"},{"fix":"Review API key validity, check for rate limits, validate input data format, and ensure message history adheres to the conversational structure Mistral AI models expect. For embeddings, consider batching documents when processing large volumes.","message":"Mistral API 400 Errors for Chat/Embeddings: Encountering `httpx.HTTPStatusError: Error response 400` can indicate various issues, including invalid request parameters, rate limiting, or malformed input. For chat models, ensure messages alternate between 'human' and 'assistant' roles, that the history does not end with an 'assistant' or 'system' message, and that assistant messages do not carry both content and tool_calls.","severity":"gotcha","affected_versions":"All versions"},{"fix":"If citation details are critical for RAG, consider accessing the Mistral AI SDK directly or implementing a post-processing step to re-parse the raw API response until an official fix is released in `langchain-mistralai`. Monitor issue #36427 on GitHub for updates.","message":"Silent Dropping of Citation Metadata in ChatMistralAI: When using Mistral's native citation feature (e.g., `citations=True` in the raw API), `ChatMistralAI` may currently treat the response content as a plain string, dropping detailed citation metadata (references, source mapping).","severity":"gotcha","affected_versions":"Versions prior to a fix for #36427 (as of April 2026, this is an open issue)."}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}