LlamaIndex Mistral AI LLM Integration

0.10.1 · active · verified Fri Apr 17

This library provides the integration for Mistral AI's Large Language Models (LLMs) within the LlamaIndex framework. It allows LlamaIndex applications to leverage Mistral AI's powerful models for tasks like text completion and chat. The current version is 0.10.1, and being part of the LlamaIndex ecosystem, it follows a relatively rapid release cadence, often aligning with LlamaIndex core updates.

Install

pip install llama-index-llms-mistralai
Quickstart

This quickstart demonstrates how to initialize the Mistral AI LLM, set the API key, and perform both text completion and chat interactions using the `llama-index-llms-mistralai` integration. It highlights the importance of setting the `MISTRAL_API_KEY` environment variable and shows how to select a specific Mistral model.

import os
from llama_index.llms.mistralai import MistralAI
from llama_index.core import Settings

# Ensure MISTRAL_API_KEY is set in your environment
# For local testing, you might do: os.environ["MISTRAL_API_KEY"] = "your_api_key_here"
api_key = os.environ.get('MISTRAL_API_KEY')

if not api_key:
    raise ValueError("MISTRAL_API_KEY environment variable not set. Please set it to your Mistral AI API key.")

# Initialize the MistralAI LLM
llm = MistralAI(
    model="mistral-tiny", # Choose your desired model, e.g., 'mistral-small', 'mistral-medium', 'mistral-large-latest'
    api_key=api_key
)

# Optionally set as default LLM for LlamaIndex
Settings.llm = llm

# Use the LLM for completion
response = llm.complete("What is the capital of France?")
print(f"MistralAI LLM Response: {response.text}")

# Use the LLM for chat. LlamaIndex's chat API expects ChatMessage
# objects rather than plain role/content dicts.
from llama_index.core.llms import ChatMessage

chat_response = llm.chat([
    ChatMessage(role="user", content="What is your favorite color?"),
    ChatMessage(role="assistant", content="As an AI, I don't have preferences. What's yours?"),
    ChatMessage(role="user", content="Mine is blue."),
])
print(f"MistralAI Chat Response: {chat_response.message.content}")
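LlamaIndex LLMs also expose a streaming interface, which is often preferable for interactive applications. A minimal sketch, assuming the same `MISTRAL_API_KEY` environment variable is set and a network connection to the Mistral API is available:

```python
import os
from llama_index.llms.mistralai import MistralAI

llm = MistralAI(model="mistral-tiny", api_key=os.environ["MISTRAL_API_KEY"])

# stream_complete yields partial responses as they arrive;
# each chunk's .delta holds only the newly generated text.
for chunk in llm.stream_complete("Name three French cities."):
    print(chunk.delta, end="", flush=True)
print()
```

The same pattern applies to chat via `llm.stream_chat`, which takes the same list of `ChatMessage` objects as `llm.chat`.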
