Laminar Python SDK
The Laminar Python SDK is the Python client for Laminar, an open-source platform for engineering LLM products. It offers capabilities to trace, evaluate, annotate, and analyze LLM data, giving developers the observability into their LLM interactions needed to bring AI applications to production with confidence. The library is currently at version 0.7.47 and is actively maintained.
Warnings
- gotcha Mixing automatic instrumentation (e.g., via `instrumentModules` or the default behavior) with manual span management (e.g., `lmnr_context` or `start_as_current_span`) can produce unpredictable or duplicated traces. Choose one instrumentation style per function or module for clarity and correct tracing.
- gotcha Laminar's context management for tracing relies on `contextvars.ContextVar`, which may not propagate context correctly across native Python threads without explicit handling. This can result in incomplete or broken traces in multi-threaded applications.
- gotcha Automatic instrumentation for popular LLM, Vector DB, database, and requests libraries is enabled by default if the `instruments` argument is omitted during `Laminar.initialize()`. To explicitly control or fully disable auto-instrumentation, pass the `instruments` argument.
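The thread-propagation gotcha above can be reproduced with the standard library alone. This sketch (plain Python, not Laminar code) uses a `ContextVar` as a stand-in for trace context and shows that a value set in the main thread is invisible to a new `threading.Thread` unless the context is carried over explicitly with `contextvars.copy_context()`:

```python
import contextvars
import threading

# A context variable standing in for Laminar's internal trace context.
trace_id: contextvars.ContextVar[str] = contextvars.ContextVar("trace_id", default="<unset>")

results = {}

def worker(label: str) -> None:
    # Record what this thread observes for trace_id.
    results[label] = trace_id.get()

trace_id.set("trace-123")

# Plain thread: runs in a fresh context, so the value set above is not visible.
t1 = threading.Thread(target=worker, args=("plain",))
t1.start(); t1.join()

# Explicit handling: snapshot the current context and run the worker inside it.
ctx = contextvars.copy_context()
t2 = threading.Thread(target=ctx.run, args=(worker, "copied"))
t2.start(); t2.join()

print(results["plain"])   # <unset>  — context did not propagate
print(results["copied"])  # trace-123 — context carried over explicitly
```

The same `copy_context().run(...)` pattern applies when handing traced work to a thread pool, so child spans land under the correct parent.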
Install
- pip install 'lmnr[all]'
- pip install 'lmnr[anthropic,openai]'
- pip install -U 'lmnr[all]' openai python-dotenv
Imports
- Laminar
from lmnr import Laminar
- observe
from lmnr import observe
- wrap_llm_call
from lmnr import wrap_llm_call
- lmnr_context
from lmnr import lmnr_context
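For intuition about what a decorator like `observe` does conceptually — open a named span around a call, track the parent span via context, and record timing when the call finishes — here is a minimal, hypothetical sketch. The `toy_observe` name and the tuple format are inventions for illustration; this is not Laminar's implementation:

```python
import contextvars
import functools
import time

# Stack top of active span names; a toy stand-in for real trace context.
_current_span: contextvars.ContextVar[str] = contextvars.ContextVar("span", default="root")
recorded_spans = []  # (span_name, parent_name, duration_seconds)

def toy_observe(name=None):
    """Record a span around each call of the decorated function."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            span_name = name or fn.__name__
            parent = _current_span.get()
            token = _current_span.set(span_name)  # enter the span
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                _current_span.reset(token)  # restore the parent span
                recorded_spans.append((span_name, parent, time.perf_counter() - start))
        return wrapper
    return decorator

@toy_observe(name="outer")
def outer():
    return inner() + 1

@toy_observe()
def inner():
    return 41

print(outer())  # 42
print([(n, p) for n, p, _ in recorded_spans])  # [('inner', 'outer'), ('outer', 'root')]
```

Note that nesting falls out of the context variable: `inner` sees `outer` as its parent because the wrapper set the span before calling through.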
Quickstart
import os

import openai
from dotenv import load_dotenv
from lmnr import Laminar, observe

# Recommended install: pip install -U 'lmnr[all]' openai python-dotenv
# Load LMNR_PROJECT_API_KEY and OPENAI_API_KEY from a .env file, if present.
load_dotenv()

# Initialize Laminar. project_api_key is read from the LMNR_PROJECT_API_KEY
# environment variable if not explicitly passed. With the `instruments` argument
# omitted, popular libraries (including OpenAI) are auto-instrumented by default.
Laminar.initialize(project_api_key=os.environ.get('LMNR_PROJECT_API_KEY', ''))

# Create the OpenAI client *after* Laminar.initialize() so the module is already
# instrumented. The OpenAI API key is read from the OPENAI_API_KEY environment variable.
client = openai.OpenAI(api_key=os.environ.get('OPENAI_API_KEY', ''))

@observe(name="poem_generator")
def generate_poem(topic: str) -> str:
    """Generates a short poem using OpenAI and traces the call with Laminar."""
    print(f"Calling OpenAI to generate a poem about: {topic}")
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "user", "content": f"Write a short poem about {topic}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Ensure LMNR_PROJECT_API_KEY and OPENAI_API_KEY are set in your environment
    # or in a .env file loaded with python-dotenv.
    if not os.environ.get('LMNR_PROJECT_API_KEY') or not os.environ.get('OPENAI_API_KEY'):
        print("Please set LMNR_PROJECT_API_KEY and OPENAI_API_KEY environment variables.")
        print("You can get LMNR_PROJECT_API_KEY from your Laminar dashboard.")
        raise SystemExit(1)

    print("Generating a poem with Laminar tracing enabled...")
    try:
        poem = generate_poem("a starry night")
        print("\nGenerated Poem:")
        print(poem)
        print("\nCheck your Laminar dashboard for the trace of this operation!")
    except Exception as e:
        print(f"An error occurred: {e}")
        print("Ensure your API keys are correct and you have network connectivity.")
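Since the quickstart reads both keys from the environment and the install line includes python-dotenv, a `.env` file in the working directory is a convenient way to supply them. The values below are placeholders, not real keys:

```shell
# .env — picked up by python-dotenv; keep it out of version control
LMNR_PROJECT_API_KEY=your-laminar-project-api-key
OPENAI_API_KEY=your-openai-api-key
```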