OpenLit

1.40.3 · active · verified Sat Apr 11

OpenLit is an OpenTelemetry-native auto-instrumentation library for monitoring LLM applications and GPUs, making it straightforward to add observability to GenAI projects. It provides automatic tracing, metrics, and evaluations for over 50 LLM providers, frameworks, and vector databases, and is actively maintained with frequent releases.

Warnings

Call `openlit.init()` before instantiating any LLM client; clients created before initialization will not be auto-instrumented.

Install
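
OpenLit is distributed on PyPI as `openlit`; the quickstart below also uses the OpenAI client. Assuming both package names, install with pip:

```shell
pip install openlit openai
```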

Imports
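
The quickstart below needs only these imports (shown again in context there):

```python
import os

import openlit
from openai import OpenAI
```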

Quickstart

This quickstart demonstrates how to initialize OpenLit for automatic instrumentation of an OpenAI LLM call. Ensure `openlit.init()` is called before any LLM client instantiation. By default, if `OTEL_EXPORTER_OTLP_ENDPOINT` is not set, traces will be printed to the console for development purposes. For production, configure the `OTEL_EXPORTER_OTLP_ENDPOINT` and authentication headers.

import os
import openlit
from openai import OpenAI

# Configure OpenLIT (either via env vars or direct arguments to init).
# For local development, remove the OTEL_EXPORTER_OTLP_ENDPOINT line:
# without an endpoint, traces are printed to the console.
os.environ.setdefault('OPENLIT_APPLICATION_NAME', 'my-genai-app')
os.environ.setdefault('OTEL_EXPORTER_OTLP_ENDPOINT', 'http://127.0.0.1:4318')
os.environ.setdefault('OPENAI_API_KEY', 'YOUR_OPENAI_API_KEY')  # replace with a real key or set the env var

# Initialize OpenLIT for auto-instrumentation.
# This call must happen *before* any LLM client is instantiated.
openlit.init()

# Example with OpenAI
client = OpenAI()

try:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "What is OpenTelemetry?"}]
    )
    print(response.choices[0].message.content)
except Exception as e:
    print(f"An error occurred: {e}")
    print("Ensure OPENAI_API_KEY is set and OTLP endpoint is reachable if not using console output.")
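
As the comment in the quickstart notes, configuration can also be passed directly to `openlit.init()` instead of via environment variables. A minimal sketch, assuming the `application_name` and `otlp_endpoint` parameter names (verify against the OpenLit documentation for the full signature):

```python
import openlit

# Equivalent to the environment-variable setup above; parameter names
# are assumptions to check against the OpenLit docs.
openlit.init(
    application_name="my-genai-app",
    otlp_endpoint="http://127.0.0.1:4318",
)
```

Direct arguments take precedence over the corresponding environment variables and keep configuration co-located with the code, which can be easier to audit in small projects.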
