LangChain MistralAI Integration

1.1.2 · active · verified Sun Apr 12

langchain-mistralai is an integration package connecting Mistral AI's powerful models with the LangChain framework. It provides classes for interacting with Mistral chat models (`ChatMistralAI`) and embedding models (`MistralAIEmbeddings`). The library is actively maintained as part of the broader LangChain ecosystem and is currently at version 1.1.2.

Install

pip install -U langchain-mistralai

Imports

from langchain_mistralai import ChatMistralAI, MistralAIEmbeddings

Quickstart

This quickstart demonstrates how to initialize `ChatMistralAI` and use it to invoke a chat completion. The example reads the `MISTRAL_API_KEY` environment variable for authentication; requests to the Mistral API will fail without it.

import os
from langchain_mistralai import ChatMistralAI
from langchain_core.messages import HumanMessage, SystemMessage

# Ensure MISTRAL_API_KEY is set in your environment
# Example: os.environ['MISTRAL_API_KEY'] = 'your_mistral_api_key'

# Read the API key from the environment; fall back to an empty string so the
# script can run (and warn) even when no key is configured.
mistral_api_key = os.environ.get('MISTRAL_API_KEY', '')

if not mistral_api_key:
    print("Warning: MISTRAL_API_KEY environment variable not set. Please set it to run the example.")
else:
    chat = ChatMistralAI(
        model="mistral-large-latest",
        temperature=0,
        mistral_api_key=mistral_api_key  # Pass key directly or rely on env var
    )

    messages = [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="I love programming."),
    ]

    try:
        response = chat.invoke(messages)
        print(response.content)
    except Exception as e:
        print(f"An error occurred: {e}")
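The package's embedding class follows the same pattern as the chat model. A minimal sketch using `MistralAIEmbeddings` with Mistral's `mistral-embed` model, guarded by the same environment-variable check as the quickstart:

```python
import os

# MistralAIEmbeddings also authenticates via MISTRAL_API_KEY.
mistral_api_key = os.environ.get('MISTRAL_API_KEY', '')

if not mistral_api_key:
    print("Warning: MISTRAL_API_KEY not set; skipping embeddings example.")
else:
    from langchain_mistralai import MistralAIEmbeddings

    embeddings = MistralAIEmbeddings(model="mistral-embed")

    # Embed a single query string; returns a list of floats.
    vector = embeddings.embed_query("I love programming.")
    print(len(vector))  # dimensionality of the embedding vector
```

For batches of texts, `embed_documents(["text one", "text two"])` returns one vector per input, which is the form typically passed to a vector store.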
