LlamaIndex Mistral AI LLM Integration
This library integrates Mistral AI's Large Language Models (LLMs) into the LlamaIndex framework, letting LlamaIndex applications use Mistral AI models for tasks like text completion and chat. The current version is 0.10.1; as part of the LlamaIndex ecosystem it follows a relatively rapid release cadence, often tracking LlamaIndex core updates.
Common errors
-
ModuleNotFoundError: No module named 'mistralai'
cause: The underlying `mistralai` Python client library, a dependency of `llama-index-llms-mistralai`, is not installed.
fix: Install the `mistralai` package: `pip install mistralai`.
-
ValueError: MISTRAL_API_KEY environment variable not set. Please set it to your Mistral AI API key.
cause: The `MistralAI` LLM reads the API key from the `MISTRAL_API_KEY` environment variable by default. This error occurs when the variable is missing or empty.
fix: Set `MISTRAL_API_KEY` to a valid Mistral AI API key, e.g. `export MISTRAL_API_KEY='YOUR_KEY_HERE'` in your shell, or pass it directly: `MistralAI(api_key="YOUR_KEY")`.
-
mistralai.exceptions.MistralException: Model 'non-existent-model' not found.
cause: The model name passed to the `MistralAI` constructor does not exist or is not valid for the Mistral AI API.
fix: Check the Mistral AI documentation for the current list of model identifiers (e.g., `mistral-tiny`, `mistral-small`, `mistral-medium`, `mistral-large-latest`).
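The first two errors above can be caught before constructing the LLM at all. Below is a minimal sketch of a preflight helper; `check_mistral_setup` is a hypothetical name introduced here for illustration, not part of the library.

```python
import importlib.util
import os


def check_mistral_setup() -> list:
    """Return a list of setup problems; an empty list means ready to go.

    Mirrors the common errors above: a missing `mistralai` package and an
    unset MISTRAL_API_KEY environment variable.
    """
    problems = []
    # find_spec returns None when the package cannot be imported
    if importlib.util.find_spec("mistralai") is None:
        problems.append("mistralai package not installed (pip install mistralai)")
    if not os.environ.get("MISTRAL_API_KEY"):
        problems.append("MISTRAL_API_KEY environment variable not set")
    return problems
```

Calling this at startup turns a late `ModuleNotFoundError` or `ValueError` into one readable report of everything that still needs fixing.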
Warnings
- breaking LlamaIndex's rapid development cycle can lead to breaking changes in core components that might affect integrations. While `llama-index-llms-mistralai` aims to keep up, ensure compatibility with your `llama-index-core` version.
- gotcha The `MISTRAL_API_KEY` environment variable must be set for the MistralAI LLM to authenticate correctly. Forgetting to set it or using an incorrect variable name is a common oversight.
- gotcha Mistral AI offers various models (e.g., `mistral-tiny`, `mistral-small`, `mistral-medium`, `mistral-large-latest`). The `model` parameter expects specific string identifiers, and using an incorrect or deprecated model name will result in an API error.
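The model-name gotcha above can also be caught early with a local guard. This is a sketch only: the set of identifiers below reflects the names mentioned in this document, while the authoritative, current list lives in the Mistral AI documentation and changes over time. `validate_model_name` is a hypothetical helper, not a library API.

```python
# Model identifiers mentioned in this document (assumption: the live list
# in the Mistral AI docs is authoritative and may differ).
KNOWN_MODELS = {
    "mistral-tiny",
    "mistral-small",
    "mistral-medium",
    "mistral-large-latest",
}


def validate_model_name(model: str) -> str:
    """Fail fast with a clear message instead of a late API error."""
    if model not in KNOWN_MODELS:
        raise ValueError(
            f"Unknown Mistral model '{model}'. Expected one of: "
            + ", ".join(sorted(KNOWN_MODELS))
        )
    return model
```

A guard like this turns a deprecated or misspelled model name into an immediate, local `ValueError` rather than an API round-trip failure.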
Install
-
pip install llama-index-llms-mistralai
Imports
- MistralAI
# Current import path (LlamaIndex >= 0.10):
from llama_index.llms.mistralai import MistralAI
# Legacy pre-0.10 path, no longer valid with llama-index-core 0.10+:
# from llama_index.llms import MistralAI
Quickstart
import os

from llama_index.core import Settings
from llama_index.core.llms import ChatMessage
from llama_index.llms.mistralai import MistralAI

# Ensure MISTRAL_API_KEY is set in your environment.
# For local testing, you might do: os.environ["MISTRAL_API_KEY"] = "your_api_key_here"
api_key = os.environ.get("MISTRAL_API_KEY")
if not api_key:
    raise ValueError(
        "MISTRAL_API_KEY environment variable not set. "
        "Please set it to your Mistral AI API key."
    )

# Initialize the MistralAI LLM
llm = MistralAI(
    model="mistral-tiny",  # e.g. 'mistral-small', 'mistral-medium', 'mistral-large-latest'
    api_key=api_key,
)

# Optionally set as the default LLM for LlamaIndex
Settings.llm = llm

# Use the LLM for completion
response = llm.complete("What is the capital of France?")
print(f"MistralAI LLM Response: {response.text}")

# Use the LLM for chat (if supported by the model).
# Messages must be ChatMessage objects, not plain dicts.
chat_response = llm.chat([
    ChatMessage(role="user", content="What is your favorite color?"),
    ChatMessage(role="assistant", content="As an AI, I don't have preferences. What's yours?"),
    ChatMessage(role="user", content="Mine is blue."),
])
print(f"MistralAI Chat Response: {chat_response.message.content}")