OpenTelemetry Mistral AI Instrumentation

0.58.0 · active · verified Fri Apr 10

This library provides OpenTelemetry auto-instrumentation for the official Mistral AI Python SDK, enabling tracing of LLM calls and capturing latency, token usage, and other observability data. It is part of the broader OpenLLMetry project and sees frequent releases, often weekly or bi-weekly, to keep up with changes in semantic conventions and the underlying LLM SDKs.

Warnings

Install

Install the instrumentation package from PyPI (published as part of the OpenLLMetry project):

pip install opentelemetry-instrumentation-mistralai

Imports

The instrumentor is exposed as `MistralAiInstrumentor`:

from opentelemetry.instrumentation.mistralai import MistralAiInstrumentor

Quickstart

This quickstart demonstrates how to initialize the OpenTelemetry SDK with a console exporter and then apply the `MistralAiInstrumentor` to automatically trace calls made through the Mistral AI Python SDK. Ensure `MISTRAL_API_KEY` is set in your environment for successful API calls. The `ConsoleSpanExporter` prints telemetry data directly to the console.

import os
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
from opentelemetry.instrumentation.mistralai import MistralAiInstrumentor
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# Configure OpenTelemetry SDK
resource = Resource.create({"service.name": "mistralai-app"})
tracer_provider = TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(tracer_provider)

# Instrument the Mistral AI library
MistralAiInstrumentor().instrument()

# Set your Mistral AI API key (replace with your actual key or env var)
mistral_api_key = os.environ.get("MISTRAL_API_KEY", "YOUR_MISTRAL_API_KEY")

if mistral_api_key == "YOUR_MISTRAL_API_KEY":
    print("Warning: MISTRAL_API_KEY environment variable not set. Using a placeholder.")
    print("Please set MISTRAL_API_KEY to run a real Mistral AI call.")
else:
    client = MistralClient(api_key=mistral_api_key)

    try:
        # Example Mistral AI API call
        chat_response = client.chat(
            model="mistral-tiny",
            messages=[
                ChatMessage(role="user", content="What is the capital of France?")
            ]
        )
        print(f"Mistral AI Response: {chat_response.choices[0].message.content}")
    except Exception as e:
        print(f"Error making Mistral AI call: {e}")

print("Instrumentation initialized. Check console for OpenTelemetry spans.")
