LangSmith

0.7.7 · active · verified Sat Feb 28

Official Python SDK for LangSmith — LangChain's observability, tracing, and evaluation platform. Instruments LLM calls, chains, agents, and arbitrary functions with the @traceable decorator or wrap_openai(). Traces are sent asynchronously to LangSmith cloud (or self-hosted). Also provides dataset management and evaluation (evaluate()) APIs. Usable standalone without LangChain. Releases multiple times per week. SECURITY NOTE: CVE-2026-25528 — SSRF via baggage header injection, fixed in 0.6.3. Any version 0.4.10–0.6.2 is vulnerable.

Warnings

Env vars must be set BEFORE importing LangChain/LangSmith — they are read at import time. Setting them after import has no effect. LANGSMITH_TRACING must be the string 'true', not a boolean.

Install

pip install langsmith

Imports

from langsmith import traceable
from langsmith.wrappers import wrap_openai

Quickstart
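The same configuration can be exported in the shell before launching Python, which sidesteps the import-order pitfall entirely (values are placeholders):

```shell
export LANGSMITH_TRACING=true     # must be the literal string 'true'
export LANGSMITH_API_KEY=ls_...   # your workspace API key
export LANGSMITH_PROJECT=my-project
```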

import os
os.environ['LANGSMITH_TRACING'] = 'true'
os.environ['LANGSMITH_API_KEY'] = 'ls_...'
os.environ['LANGSMITH_PROJECT'] = 'my-project'

# Option 1: Wrap OpenAI client (auto-traces all calls)
import openai
from langsmith.wrappers import wrap_openai

client = wrap_openai(openai.OpenAI())
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)

# Option 2: @traceable decorator for arbitrary functions
from langsmith import traceable

@traceable
def my_pipeline(query: str) -> str:
    # any code here is traced as a span
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[{'role': 'user', 'content': query}]
    )
    return response.choices[0].message.content

result = my_pipeline('What is 2+2?')

# Option 3: LangChain auto-tracing (no decorator needed)
# Just set env vars — all LangChain/LangGraph calls are traced automatically
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model='gpt-4o')
llm.invoke('Hello!')  # automatically traced
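The card also mentions dataset management and evaluate(). A minimal sketch of a custom evaluator plus an evaluation run — the dataset name 'qa-dataset' and the target stub are placeholders, and the evaluator signature assumes the keyword-based convention of recent SDK versions:

```python
def exact_match(outputs: dict, reference_outputs: dict) -> dict:
    # custom evaluator: score 1.0 when the target's answer matches the reference
    return {"key": "exact_match",
            "score": float(outputs["answer"] == reference_outputs["answer"])}

def target(inputs: dict) -> dict:
    # stand-in for a real pipeline (e.g. my_pipeline above)
    return {"answer": "4"}

def run_eval():
    # deferred import so the functions above stay importable without langsmith
    from langsmith import evaluate
    # assumes a dataset named 'qa-dataset' already exists in your workspace
    return evaluate(
        target,
        data="qa-dataset",
        evaluators=[exact_match],
        experiment_prefix="baseline",
    )
```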
