OpenRouter Python SDK
0.7.11 · verified Tue May 12 · auth: no · python · install: verified · quickstart: verified
Official Python SDK for the OpenRouter API — a unified gateway to 300+ LLM models across providers (OpenAI, Anthropic, Google, Meta, Mistral, etc.) via a single OpenAI-compatible endpoint. Auto-generated from OpenRouter's OpenAPI spec and updated on every API change. Fully typed with Pydantic. SDK is explicitly in beta — breaking changes can occur between minor versions without a major version bump. IMPORTANT: There are multiple competing 'openrouter' packages on PyPI (openrouter, openrouter-client, python-open-router). Only 'openrouter' (by OpenRouter) is the official SDK.
pip install openrouter

Common errors

error: ModuleNotFoundError: No module named 'openrouter'
cause: The official `openrouter` Python SDK is not installed, or a different, unofficial package with a similar name is installed instead.
fix: Install the *official* openrouter package by OpenRouter: pip install openrouter

error: AuthenticationError: OpenrouterException - {"error":{"message":"No auth credentials found","code":401}}
cause: This typically occurs when the OpenRouter API key is missing or incorrectly configured, often due to using the wrong environment variable name (e.g., `OPENAI_API_KEY` instead of `OPENROUTER_API_KEY`) or an invalid key.
fix: Set your OpenRouter API key in an environment variable named OPENROUTER_API_KEY. If you use python-dotenv, make sure the .env file is loaded before the client is created. Alternatively, pass api_key explicitly when constructing the client.
```python
import os
from openrouter import OpenRouter

# Ensure OPENROUTER_API_KEY is set in your environment or .env file
client = OpenRouter(api_key=os.getenv("OPENROUTER_API_KEY"))

# Or, if integrating with libraries like LiteLLM, ensure your .env has:
# OPENROUTER_API_KEY="your_api_key_here"
# OPENAI_API_BASE="https://openrouter.ai/api/v1"
```

error: pydantic_ai.exceptions.UnexpectedModelBehavior: Invalid response from OpenAI chat completions endpoint: 1 validation error for ChatCompletion choices.0.finish_reason Input should be 'stop', 'length', 'tool_calls', 'content_filter' or 'function_call' [type=literal_error, input_value='error', input_type=str]
cause: OpenRouter sometimes returns a `finish_reason` value (like 'error') that is not part of the OpenAI-compatible schema pydantic-ai validates against, leading to a validation failure.
fix: This usually requires a fix inside the pydantic-ai library or explicit handling of OpenRouter's error responses. Immediate workarounds: try a different model, adjust request parameters (e.g., streaming settings), or implement custom error parsing if you call the OpenRouter API directly without pydantic-ai's strict validation. Check for newer versions of pydantic-ai or openrouter that may have addressed the incompatibility. If available, use OpenRouterModel from pydantic_ai.models.openrouter, or catch UnexpectedModelBehavior and inspect the raw response.

error: {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}
cause: The model you selected does not support the 'tool use' (function calling) feature your request is attempting to leverage.
fix: Choose an OpenRouter model that explicitly supports tool use. Check the OpenRouter documentation or model listings to identify models compatible with function calling.
```python
import os
from openrouter import OpenRouter

client = OpenRouter(api_key=os.getenv("OPENROUTER_API_KEY"))
response = client.chat.completions.create(
    model="openrouter/your-tool-use-compatible-model",  # e.g. 'openai/gpt-4o' or another model listed as supporting tools
    messages=[
        {"role": "user", "content": "What's the weather like in Paris?"}
    ],
    tools=[
        # Your tool definitions here
    ]
)
```

Warnings
breaking: SDK is explicitly in beta. The PyPI page states: 'breaking changes between versions without a major version update'. The SDK jumped from 0.0.x to 0.1.x to 0.6.0 to 0.7.x with no major version bump, and each jump contained breaking API changes.
fix: Pin to an exact version: pip install openrouter==0.7.11. Do not use unpinned installs in production.
gotcha: Multiple competing packages on PyPI share similar names: 'openrouter' (official), 'openrouter-client' (unofficial FastAPI wrapper), 'python-open-router' (unofficial async client). Running pip install openrouter-client or pip install python-open-router installs the wrong package with a different API.
fix: Only install 'openrouter' (the official SDK by OpenRouter). Verify with pip show openrouter: the author should be 'OpenRouter' and the license 'Apache-2.0'.
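If you want this check to run programmatically (for example in CI), the standard library's importlib.metadata can read an installed distribution's Author field. A minimal sketch; installed_author is a hypothetical helper name, not part of any SDK:

```python
from importlib import metadata

def installed_author(dist_name: str):
    """Return the Author metadata field of an installed distribution, or None."""
    try:
        return metadata.metadata(dist_name).get("Author")
    except metadata.PackageNotFoundError:
        return None

# After a correct install, installed_author("openrouter") should report 'OpenRouter'.
```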
gotcha: Not using OpenRouter as a context manager leaves HTTPX client connections open (file handles, sockets). In long-running services, or scripts that repeatedly instantiate OpenRouter(), this causes connection-pool exhaustion.
fix: Always use 'with OpenRouter(api_key=...) as client:' (sync) or 'async with OpenRouter(api_key=...) as client:' (async).
gotcha: OpenRouter model strings include the provider prefix: 'anthropic/claude-sonnet-4-20250514', not just 'claude-sonnet-4-20250514'. Using the bare model name may fail or route to an unexpected provider.
fix: Always use the 'provider/model-name' format. Browse available models at openrouter.ai/models.
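A quick sanity check on model ids can catch the bare-name mistake before a request is sent. has_provider_prefix is a hypothetical helper, not part of the SDK:

```python
def has_provider_prefix(model: str) -> bool:
    """True when the model id follows OpenRouter's 'provider/model-name' form."""
    provider, sep, name = model.partition("/")
    return bool(provider and sep and name)

# has_provider_prefix('anthropic/claude-sonnet-4-20250514') -> True
# has_provider_prefix('claude-sonnet-4-20250514') -> False
```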
gotcha: The OpenAI SDK approach (setting base_url='https://openrouter.ai/api/v1') is often simpler and more stable than the native openrouter package, especially for teams already using the openai SDK. The OpenRouter API is OpenAI-compatible by design.
fix: For simple use cases, consider: from openai import OpenAI; client = OpenAI(api_key=OPENROUTER_API_KEY, base_url='https://openrouter.ai/api/v1'). No extra package needed.
gotcha: The SDK requires an API key, typically provided via the `OPENROUTER_API_KEY` environment variable. Accessing os.environ['OPENROUTER_API_KEY'] raises a KeyError when that variable is unset.
fix: Set `OPENROUTER_API_KEY` in your execution environment before running the application, e.g. `export OPENROUTER_API_KEY='YOUR_KEY_HERE'`, or inject it via your deployment system.
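To fail fast with a readable message instead of a bare KeyError deep inside client code, a small startup check helps. require_env is a hypothetical helper, sketched here:

```python
import os

def require_env(name: str) -> str:
    """Return the named environment variable, or raise a descriptive error if unset or empty."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# At application startup:
# api_key = require_env("OPENROUTER_API_KEY")
```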
Install

pip install openrouter==0.7.11

Install compatibility (verified; last tested 2026-05-12)
| python | os / libc | variant | wheel | install | import | disk |
|---|---|---|---|---|---|---|
| 3.9 | alpine (musl) | openrouter | - | - | 0.57s | 34.9M |
| 3.9 | alpine (musl) | openrouter==0.7.11 | - | - | 0.56s | 34.5M |
| 3.9 | slim (glibc) | openrouter | - | - | 0.52s | 35M |
| 3.9 | slim (glibc) | openrouter==0.7.11 | - | - | 0.51s | 34M |
| 3.10 | alpine (musl) | openrouter | - | - | 0.57s | 35.6M |
| 3.10 | alpine (musl) | openrouter==0.7.11 | - | - | 0.59s | 35.1M |
| 3.10 | slim (glibc) | openrouter | - | - | 0.42s | 35M |
| 3.10 | slim (glibc) | openrouter==0.7.11 | - | - | 0.44s | 35M |
| 3.11 | alpine (musl) | openrouter | - | - | 0.85s | 39.0M |
| 3.11 | alpine (musl) | openrouter==0.7.11 | - | - | 0.82s | 38.5M |
| 3.11 | slim (glibc) | openrouter | - | - | 0.71s | 39M |
| 3.11 | slim (glibc) | openrouter==0.7.11 | - | - | 0.70s | 38M |
| 3.12 | alpine (musl) | openrouter | - | - | 0.95s | 30.4M |
| 3.12 | alpine (musl) | openrouter==0.7.11 | - | - | 0.92s | 30.0M |
| 3.12 | slim (glibc) | openrouter | - | - | 0.93s | 30M |
| 3.12 | slim (glibc) | openrouter==0.7.11 | - | - | 0.93s | 30M |
| 3.13 | alpine (musl) | openrouter | - | - | 0.95s | 30.1M |
| 3.13 | alpine (musl) | openrouter==0.7.11 | - | - | 0.94s | 29.6M |
| 3.13 | slim (glibc) | openrouter | - | - | 0.90s | 30M |
| 3.13 | slim (glibc) | openrouter==0.7.11 | - | - | 0.87s | 29M |
Imports

- OpenRouter
  - wrong: import openrouter; openrouter.client()
  - correct: from openrouter import OpenRouter
Quickstart (verified; last tested 2026-05-12)
```python
import os
from openrouter import OpenRouter

# Sync usage — use as context manager
with OpenRouter(api_key=os.environ['OPENROUTER_API_KEY']) as client:
    # Standard completion
    response = client.chat.send(
        model='anthropic/claude-sonnet-4-20250514',
        messages=[{'role': 'user', 'content': 'Hello!'}]
    )
    print(response)

    # With provider routing preferences
    response = client.chat.send(
        model='openai/gpt-4o',
        messages=[{'role': 'user', 'content': 'Hello!'}],
        provider={
            'sort': 'price',  # or 'latency', 'throughput'
            'zdr': True       # Zero Data Retention
        }
    )

# Async usage
import asyncio

async def main():
    async with OpenRouter(api_key=os.environ['OPENROUTER_API_KEY']) as client:
        response = await client.chat.send_async(
            model='anthropic/claude-sonnet-4-20250514',
            messages=[{'role': 'user', 'content': 'Hello!'}]
        )
        print(response)

asyncio.run(main())

# OpenAI SDK approach (no openrouter package needed)
from openai import OpenAI

client = OpenAI(
    api_key=os.environ['OPENROUTER_API_KEY'],
    base_url='https://openrouter.ai/api/v1'
)
response = client.chat.completions.create(
    model='anthropic/claude-sonnet-4-20250514',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
```