LangSmith
Official Python SDK for LangSmith — LangChain's observability, tracing, and evaluation platform. Instruments LLM calls, chains, agents, and arbitrary functions with the @traceable decorator or wrap_openai(). Traces are sent asynchronously to LangSmith cloud (or self-hosted). Also provides dataset management and evaluation (evaluate()) APIs. Usable standalone without LangChain. Releases multiple times per week. SECURITY NOTE: CVE-2026-25528 — SSRF via baggage header injection, fixed in 0.6.3. Any version 0.4.10–0.6.2 is vulnerable.
Warnings
- breaking CVE-2026-25528: SSRF via baggage header injection. The distributed tracing feature is vulnerable — attackers can inject arbitrary api_url values via the baggage HTTP header, causing the SDK to exfiltrate trace data to attacker-controlled endpoints. Affects versions 0.4.10–0.6.2.
- breaking LANGCHAIN_TRACING_V2 and LANGCHAIN_API_KEY are the legacy env var names. Current docs use LANGSMITH_TRACING and LANGSMITH_API_KEY. Both still work, but mixing old and new names in the same environment causes confusing precedence issues.
- breaking langsmith 0.5+ has a strict version conflict with older langchain versions. langchain 0.3.x requires langsmith<0.2.0. Installing langsmith>=0.5 in a langchain 0.3 environment causes a pip resolver conflict that breaks the entire environment.
- gotcha LANGSMITH_TRACING must be set BEFORE LangChain is imported. LangChain reads the env var at import time. Setting os.environ['LANGSMITH_TRACING'] = 'true' after 'from langchain...' has already run produces zero traces with no error.
- gotcha LANGSMITH_TRACING='true' must be a string, not a boolean. In some YAML/env file parsers, setting LANGSMITH_TRACING: true (no quotes) passes the Python boolean True, which the SDK does not recognize. Traces silently drop.
- gotcha Traces are sent asynchronously in a background thread. In short-lived scripts, the process may exit before all traces are flushed, resulting in missing or partial traces in the UI.
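Three of the gotchas above can be handled defensively in startup code: set the env vars before any LangChain import, coerce the flag to a lowercase string, and flush pending traces on exit. A minimal sketch — it assumes `Client.flush` is the SDK's blocking flush method (check your installed version), and `enable_tracing` is a hypothetical helper, not part of the SDK:

```python
import atexit
import os

def enable_tracing(project: str, api_key: str, enabled: bool = True) -> None:
    """Set LangSmith env vars BEFORE any langchain import runs.

    The flag must be the string 'true'; a Python bool is silently ignored.
    """
    os.environ['LANGSMITH_TRACING'] = str(enabled).lower()
    os.environ['LANGSMITH_API_KEY'] = api_key
    os.environ['LANGSMITH_PROJECT'] = project

enable_tracing('my-project', 'ls_...', enabled=True)

# Short-lived scripts: drain the background trace queue before the
# interpreter exits so the last spans are not dropped.
try:
    from langsmith import Client
    atexit.register(Client().flush)
except Exception:  # langsmith not installed / not configured here
    pass
```

Registering the flush with `atexit` means long-running services pay no cost, while one-shot scripts block briefly at exit instead of losing traces.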
Install
- pip install langsmith
- pip install 'langsmith[otel]'
- pip install 'langsmith[pytest]'
Imports
- traceable
from langsmith import traceable
- wrap_openai
from langsmith.wrappers import wrap_openai
- Client
from langsmith import Client
Quickstart
import os
os.environ['LANGSMITH_TRACING'] = 'true'
os.environ['LANGSMITH_API_KEY'] = 'ls_...'
os.environ['LANGSMITH_PROJECT'] = 'my-project'
# Option 1: Wrap OpenAI client (auto-traces all calls)
import openai
from langsmith.wrappers import wrap_openai
client = wrap_openai(openai.OpenAI())
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
# Option 2: @traceable decorator for arbitrary functions
from langsmith import traceable
@traceable
def my_pipeline(query: str) -> str:
    # any code here is traced as a span
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[{'role': 'user', 'content': query}]
    )
    return response.choices[0].message.content

result = my_pipeline('What is 2+2?')
# Option 3: LangChain auto-tracing (no decorator needed)
# Just set env vars — all LangChain/LangGraph calls are traced automatically
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model='gpt-4o')
llm.invoke('Hello!') # automatically traced
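The `Client` import listed above also covers the dataset side mentioned in the intro. A sketch of uploading a question/answer dataset — `create_dataset` and `create_examples` are the Client methods I believe the SDK exposes for this (verify against your version), and `upload_qa_dataset` is a hypothetical helper:

```python
from typing import Iterable, Tuple

def upload_qa_dataset(name: str, pairs: Iterable[Tuple[str, str]]):
    """Create a LangSmith dataset and bulk-add question/answer examples."""
    from langsmith import Client  # deferred: needs langsmith + an API key
    client = Client()
    dataset = client.create_dataset(dataset_name=name)
    client.create_examples(
        inputs=[{'question': q} for q, _ in pairs],
        outputs=[{'answer': a} for _, a in pairs],
        dataset_id=dataset.id,
    )
    return dataset

# Usage (requires LANGSMITH_API_KEY to be set):
# upload_qa_dataset('math-qa', [('What is 2+2?', '4')])
```

Datasets built this way are the input to the `evaluate()` API noted at the top of this page.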