LangSmith
0.7.7 · verified Tue May 12 · auth: yes · python install: verified · quickstart: stale
Official Python SDK for LangSmith — LangChain's observability, tracing, and evaluation platform. Instruments LLM calls, chains, agents, and arbitrary functions with the @traceable decorator or wrap_openai(). Traces are sent asynchronously to LangSmith cloud (or self-hosted). Also provides dataset management and evaluation (evaluate()) APIs. Usable standalone without LangChain. Releases multiple times per week. SECURITY NOTE: CVE-2026-25528 — SSRF via baggage header injection, fixed in 0.6.3. Any version 0.4.10–0.6.2 is vulnerable.
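Given the advisory above, the affected range can be checked mechanically. A minimal sketch, assuming plain X.Y.Z version strings (the helper names are illustrative, not part of the SDK):

```python
def _parse(v: str) -> tuple:
    # Split 'X.Y.Z' into an integer tuple for comparison; pre-release
    # suffixes are not handled in this sketch.
    return tuple(int(p) for p in v.split(".")[:3])

def is_vulnerable_to_cve_2026_25528(version: str) -> bool:
    """True if the given langsmith version is in the affected 0.4.10-0.6.2 range."""
    return _parse("0.4.10") <= _parse(version) <= _parse("0.6.2")
```

For the installed package, pass importlib.metadata.version('langsmith') to the check.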
pip install langsmith

Common errors

error langsmith.utils.LangSmithAuthError: Authentication failed ↓
cause Typically an incorrect LangSmith API key, the wrong environment variable name (e.g., LANGSMITH_API_KEY vs. the legacy LANGCHAIN_API_KEY), an invalid project name in LANGCHAIN_PROJECT, or insufficient permissions for the API key.
fix Ensure the LANGCHAIN_API_KEY (or LANGSMITH_API_KEY) environment variable is set to a valid LangSmith API key (current keys start with lsv2_). Verify that LANGCHAIN_PROJECT matches an existing project name in your LangSmith dashboard and that LANGCHAIN_TRACING_V2 is set to 'true'. Regenerate your API key on the LangSmith settings page if unsure.

error TypeError: Failed to fetch (LangSmith Studio / langgraph dev connection issue) ↓
cause LangSmith Studio (an HTTPS site) cannot reach a local development server (HTTP localhost), usually because Chrome 142+ enforces the Private Network Access (PNA) specification or because a browser extension interferes.
fix In Chrome, navigate to https://smith.langchain.com, click the lock icon in the address bar, find 'Local network access', change it to 'Allow', and reload the page. Alternatively, temporarily disable browser extensions (especially AI-related ones), or run langgraph dev --tunnel and use the provided tunnel URL in Studio's 'Connect to a local server' option.

error No traces appearing in LangSmith UI when using @traceable decorator ↓
cause Traces do not appear if the tracing environment variables (LANGCHAIN_TRACING_V2 or LANGSMITH_TRACING) are not set to 'true', if the LangSmith API key or endpoint is incorrect, or if the Python process exits before the background thread flushes pending traces.
fix Set LANGCHAIN_TRACING_V2=true (or LANGSMITH_TRACING=true). Double-check LANGCHAIN_API_KEY and LANGCHAIN_ENDPOINT for correctness. For short-lived scripts, flush explicitly before exit, e.g. langsmith.Client().flush() or, when using LangChain, wait_for_all_tracers() from langchain_core.tracers.langchain.

error ModuleNotFoundError: No module named 'langsmith' ↓
cause The langsmith package is not installed in the currently active Python environment, or it is not correctly bundled/installed in the deployment environment (Docker container, cloud platform, etc.).
fix Install it with pip install -U langsmith. If you use a virtual environment, ensure it is activated. For deployment, list langsmith explicitly in requirements.txt or pyproject.toml and verify that the deployment pipeline installs all dependencies.

Warnings
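The auth and missing-trace errors above usually come down to environment configuration. A small diagnostic sketch (the function name is ours; it checks both the current LANGSMITH_* and legacy LANGCHAIN_* names):

```python
import os

def check_tracing_env() -> list:
    """Return a list of likely LangSmith tracing misconfigurations."""
    problems = []
    # Either the current or the legacy API key variable must be set.
    if not (os.environ.get("LANGSMITH_API_KEY") or os.environ.get("LANGCHAIN_API_KEY")):
        problems.append("no API key: set LANGSMITH_API_KEY (or legacy LANGCHAIN_API_KEY)")
    # The tracing flag must be the literal string 'true'.
    flag = os.environ.get("LANGSMITH_TRACING") or os.environ.get("LANGCHAIN_TRACING_V2")
    if flag != "true":
        problems.append("tracing flag missing or not the string 'true'")
    return problems
```

Run it at startup and log the result; an empty list means the basics are in place.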
breaking CVE-2026-25528: SSRF via baggage header injection. The distributed tracing feature is vulnerable — attackers can inject arbitrary api_url values via the baggage HTTP header, causing the SDK to exfiltrate trace data to attacker-controlled endpoints. Affects versions 0.4.10–0.6.2. ↓
fix Upgrade immediately: pip install 'langsmith>=0.6.3'. Do not run affected versions in environments that process untrusted HTTP requests.
breaking LANGCHAIN_TRACING_V2 and LANGCHAIN_API_KEY are the legacy env var names. Current docs use LANGSMITH_TRACING and LANGSMITH_API_KEY. Both still work, but mixing old and new names in the same environment causes confusing precedence issues. ↓
fix Standardize on LANGSMITH_TRACING and LANGSMITH_API_KEY in all new code and CI configs. Remove LANGCHAIN_TRACING_V2 from environments.
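One way to enforce that standard is to promote legacy values at startup and drop the old names, so only one set takes effect. A sketch; the mapping below covers the commonly documented variable pairs and the helper name is ours:

```python
import os

# Legacy LANGCHAIN_* tracing variables and their current LANGSMITH_* names.
LEGACY_TO_CURRENT = {
    "LANGCHAIN_TRACING_V2": "LANGSMITH_TRACING",
    "LANGCHAIN_API_KEY": "LANGSMITH_API_KEY",
    "LANGCHAIN_ENDPOINT": "LANGSMITH_ENDPOINT",
    "LANGCHAIN_PROJECT": "LANGSMITH_PROJECT",
}

def normalize_langsmith_env() -> None:
    """Move legacy values to the current names and remove the legacy vars."""
    for legacy, current in LEGACY_TO_CURRENT.items():
        value = os.environ.pop(legacy, None)
        # Never overwrite an explicitly set current-style variable.
        if value is not None and current not in os.environ:
            os.environ[current] = value
```

Call this once, before any LangChain/LangSmith import, in the process entry point.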
breaking langsmith 0.5+ has a strict version conflict with older langchain versions. langchain 0.3.x requires langsmith<0.2.0. Installing langsmith>=0.5 in a langchain 0.3 environment causes a pip resolver conflict that breaks the entire environment. ↓
fix Upgrade langchain to the latest version before upgrading langsmith. If you must stay on langchain 0.3.x, pin langsmith<0.2.0 to match its requirement.
gotcha LANGSMITH_TRACING must be set BEFORE LangChain is imported. LangChain reads the env var at import time. Setting os.environ['LANGSMITH_TRACING'] = 'true' after 'from langchain...' has already run produces zero traces with no error. ↓
fix Set all LANGSMITH_* env vars at the very top of your entry point, before any LangChain/LangSmith imports. Use .env files loaded via python-dotenv before other imports.
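A minimal entry-point layout that respects this ordering (the key value is a placeholder):

```python
# main.py: set tracing configuration FIRST, before any framework import.
import os

os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "..."  # placeholder: your real key

# Only now import LangChain; it reads these variables at import time.
# from langchain_openai import ChatOpenAI
```

The same rule applies to load_dotenv(): call it above the framework imports, not below.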
gotcha LANGSMITH_TRACING='true' must be a string, not a boolean. In some YAML/env file parsers, setting LANGSMITH_TRACING: true (no quotes) passes the Python boolean True, which the SDK does not recognize. Traces silently drop. ↓
fix Always quote the value: LANGSMITH_TRACING='true' (shell) or LANGSMITH_TRACING: 'true' (YAML).
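When the tracing setting flows through a YAML or config parser, coerce it back to the string form before exporting it. A sketch (the helper name is ours):

```python
import os

def export_tracing_flag(value) -> str:
    """Coerce a possibly-boolean parsed config value to the 'true'/'false'
    string form the SDK expects, and export it as LANGSMITH_TRACING."""
    if isinstance(value, bool):
        flag = "true" if value else "false"
    else:
        flag = str(value).strip().lower()
    os.environ["LANGSMITH_TRACING"] = flag
    return flag
```

This makes a YAML `true`, the string "True", and a shell "true" all land as the same value.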
gotcha Traces are sent asynchronously in a background thread. In short-lived scripts, the process may exit before all traces are flushed, resulting in missing or partial traces in the UI. ↓
fix Call wait_for_all_tracers() (from langchain_core.tracers.langchain) when using LangChain, or langsmith.Client().flush(), at the end of short-lived scripts to ensure all traces are sent before exit.
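To make the flush automatic even when a script exits early, the call can be registered as an exit hook. A sketch assuming a zero-argument flush callable such as a client's flush method (the exact flush API varies by SDK version):

```python
import atexit

def register_trace_flush(flush):
    """Run `flush` once at interpreter exit, swallowing errors so a
    failed flush does not add noise to an otherwise clean shutdown."""
    def _safe_flush():
        try:
            flush()
        except Exception:
            pass  # tracing is best-effort; never fail shutdown over it
    atexit.register(_safe_flush)
    return _safe_flush  # returned so it can also be invoked manually
```

Long-running services do not need this; the background thread flushes continuously.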
gotcha wrap_openai imports the openai package, but langsmith does not declare it as a dependency; environments that install only langsmith raise ModuleNotFoundError: No module named 'openai'. ↓
fix Install it: pip install openai. Ensure every external dependency your application relies on is explicitly listed and installed.
Install

pip install 'langsmith[otel]'
pip install 'langsmith[pytest]'

Install compatibility: verified (last tested 2026-05-12)
python  os / libc      variant    status  wheel  install  disk
3.9     alpine (musl)  otel       -       -      1.21s    60.0M
3.9     alpine (musl)  pytest     -       -      1.05s    72.1M
3.9     alpine (musl)  langsmith  -       -      1.04s    51.8M
3.9     slim (glibc)   otel       -       -      1.07s    68M
3.9     slim (glibc)   pytest     -       -      0.91s    81M
3.9     slim (glibc)   langsmith  -       -      0.95s    60M
3.10    alpine (musl)  otel       -       -      1.49s    63.4M
3.10    alpine (musl)  pytest     -       -      1.32s    73.6M
3.10    alpine (musl)  langsmith  -       -      1.32s    55.1M
3.10    slim (glibc)   otel       -       -      1.08s    71M
3.10    slim (glibc)   pytest     -       -      0.95s    82M
3.10    slim (glibc)   langsmith  -       -      0.94s    63M
3.11    alpine (musl)  otel       -       -      1.96s    68.3M
3.11    alpine (musl)  pytest     -       -      1.74s    79.7M
3.11    alpine (musl)  langsmith  -       -      1.74s    59.2M
3.11    slim (glibc)   otel       -       -      1.64s    76M
3.11    slim (glibc)   pytest     -       -      1.44s    88M
3.11    slim (glibc)   langsmith  -       -      1.44s    67M
3.12    alpine (musl)  otel       -       -      1.89s    59.5M
3.12    alpine (musl)  pytest     -       -      1.67s    70.6M
3.12    alpine (musl)  langsmith  -       -      1.66s    50.5M
3.12    slim (glibc)   otel       -       -      1.81s    68M
3.12    slim (glibc)   pytest     -       -      1.70s    79M
3.12    slim (glibc)   langsmith  -       -      1.69s    58M
3.13    alpine (musl)  otel       -       -      1.81s    59.1M
3.13    alpine (musl)  pytest     -       -      1.61s    70.4M
3.13    alpine (musl)  langsmith  -       -      1.60s    50.2M
3.13    slim (glibc)   otel       -       -      1.80s    67M
3.13    slim (glibc)   pytest     -       -      1.58s    79M
3.13    slim (glibc)   langsmith  -       -      1.63s    58M
Imports

- traceable: from langsmith import traceable
- wrap_openai: from langsmith.wrappers import wrap_openai
- Client: from langsmith import Client
Quickstart (stale; last tested 2026-05-12)
import os
os.environ['LANGSMITH_TRACING'] = 'true'
os.environ['LANGSMITH_API_KEY'] = 'ls_...'
os.environ['LANGSMITH_PROJECT'] = 'my-project'
# Option 1: Wrap OpenAI client (auto-traces all calls)
import openai
from langsmith.wrappers import wrap_openai
client = wrap_openai(openai.OpenAI())
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
# Option 2: @traceable decorator for arbitrary functions
from langsmith import traceable
@traceable
def my_pipeline(query: str) -> str:
    # any code here is traced as a span
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[{'role': 'user', 'content': query}]
    )
    return response.choices[0].message.content

result = my_pipeline('What is 2+2?')
# Option 3: LangChain auto-tracing (no decorator needed)
# Just set env vars — all LangChain/LangGraph calls are traced automatically
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model='gpt-4o')
llm.invoke('Hello!') # automatically traced