LLM Tracekit
Version 2.8.1 · verified Fri May 01
Meta-package for LLM Tracekit: OpenTelemetry instrumentations for LLM providers. Current version: 2.8.1. Releases are frequent and follow semantic versioning.
Install: pip install llm-tracekit

Common errors
error: ModuleNotFoundError: No module named 'llm_tracekit'
cause: Package not installed, or a naming mix-up: the pip name is llm-tracekit, the import name is llm_tracekit.
fix: Install with pip install llm-tracekit; the import uses underscores: import llm_tracekit
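Because the pip name (llm-tracekit, with a dash) and the import name (llm_tracekit, with underscores) differ, a quick stdlib check can confirm whether the import name actually resolves before debugging further. This is a generic sketch; is_installed is a hypothetical helper, not part of the package.

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """True if module_name can be imported. Use the import name
    (underscores), not the pip name (dashes)."""
    return importlib.util.find_spec(module_name) is not None

# pip install llm-tracekit  ->  import llm_tracekit
print(is_installed("llm_tracekit"))
```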
error: ImportError: cannot import name 'Tracekit' from 'llm_tracekit'
cause: Using the old v1.x API (from llm_tracekit.core import Tracekit) against v2.x.
fix: Use: from llm_tracekit import Tracekit
error: TypeError: Tracekit() missing 1 required positional argument: 'service_name'
cause: service_name has been required since v2.5.0.
fix: Pass service_name explicitly or set the LLM_TRACEKIT_SERVICE_NAME environment variable.
Warnings
gotcha: Installing llm-tracekit without opentelemetry-sdk may make console export fail silently.
fix: Install it explicitly: pip install opentelemetry-sdk
breaking: The Tracekit class moved from llm_tracekit.core to llm_tracekit in v2.0, so the old import breaks.
fix: Change from llm_tracekit.core import Tracekit to from llm_tracekit import Tracekit
gotcha: Setting enable_console=True can produce very high log volume in production.
fix: Use it only for debugging; prefer setting OTEL_TRACES_EXPORTER=console via the environment for consistency.
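One way to follow that advice is to derive the console toggle from OTEL_TRACES_EXPORTER rather than hard-coding enable_console=True. A minimal sketch, assuming the standard OpenTelemetry environment variable; console_enabled is a hypothetical helper:

```python
import os

# Production stays quiet unless the exporter is explicitly set to console.
def console_enabled() -> bool:
    return os.environ.get("OTEL_TRACES_EXPORTER", "") == "console"

os.environ.pop("OTEL_TRACES_EXPORTER", None)  # simulate an unset environment
print(console_enabled())  # -> False
```

The resulting value can then be passed as the enable_console argument in the Quickstart below.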
Imports
- Tracekit
  wrong:   from llm_tracekit import LLMTracekit
  correct: from llm_tracekit import Tracekit
- enable_instrumentations
  wrong:   from llm_tracekit.instrument import enable
  correct: from llm_tracekit import enable_instrumentations
Quickstart
import os
from llm_tracekit import Tracekit

tk = Tracekit(
    service_name="my-llm-app",
    otlp_endpoint=os.environ.get("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4317"),
    enable_console=True,  # debugging only; noisy in production
)
tk.start()
# Use with OpenAI, Anthropic, etc.