LangChain OpenRouter Integration
An integration package connecting OpenRouter API with LangChain, providing a ChatOpenAI-compatible chat model and embeddings. Current version 0.2.3, requires Python >=3.10, <4.0.0. Depends on langchain-core (any version). Release cadence is irregular.
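Since the package requires Python >=3.10, <4.0.0, a quick interpreter check can catch an unsupported environment before installation. This is a hypothetical sketch; the `supported` helper is not part of the package:

```python
import sys

def supported(version_info=sys.version_info) -> bool:
    """Return True if the interpreter satisfies >=3.10, <4.0."""
    major, minor = version_info[0], version_info[1]
    return (major, minor) >= (3, 10) and major < 4

if not supported():
    print("Warning: Python", sys.version.split()[0],
          "is outside the supported range >=3.10,<4.0")
```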
pip install langchain-openrouter

Common errors
error ModuleNotFoundError: No module named 'langchain_openrouter'
cause Package not installed, or installed in a different environment.
fix Run `pip install langchain-openrouter` in your active Python environment.

error OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
cause API key not provided, or set under the wrong environment variable name.
fix Set the `OPENAI_API_KEY` environment variable, or pass `openai_api_key='your-key'` to `ChatOpenAI`.

error ValidationError: 1 validation error for ChatOpenAI model Field required
cause The `model` parameter is required but was not provided.
fix Pass the `model` parameter: `ChatOpenAI(model='openai/gpt-4o-mini', openai_api_key='...')`.

Warnings
gotcha The parameter name for the API key in `ChatOpenAI` is `openai_api_key`, not `api_key` or `openrouter_api_key`.
fix Use `openai_api_key='your-key'` when initializing ChatOpenAI.

gotcha The package exports the model class under the alias `ChatOpenRouter`, but the correct class name is `ChatOpenAI`. Using `ChatOpenRouter` may confuse readers looking for the actual class.
fix Import `ChatOpenAI` from langchain_openrouter; do not use `ChatOpenRouter` in code.

deprecated The `model_name` parameter is deprecated in favor of `model`. This applies to the underlying LangChain ChatOpenAI class.
fix Use `model='openai/gpt-4o-mini'` instead of `model_name='openai/gpt-4o-mini'`.
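For code that still passes the deprecated keyword, a small migration shim can rewrite it before construction. The `migrate_kwargs` helper below is illustrative, not part of the package:

```python
import warnings

def migrate_kwargs(kwargs: dict) -> dict:
    """Rewrite the deprecated `model_name` keyword to `model`.

    If both keys are present, the explicit `model` value wins and
    `model_name` is dropped.
    """
    out = dict(kwargs)
    if "model_name" in out:
        warnings.warn("`model_name` is deprecated; use `model`",
                      DeprecationWarning, stacklevel=2)
        out.setdefault("model", out.pop("model_name"))
    return out

# ChatOpenAI(**migrate_kwargs({"model_name": "openai/gpt-4o-mini"}))
# is then equivalent to ChatOpenAI(model="openai/gpt-4o-mini").
```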
Imports
- ChatOpenAI
  wrong: from langchain_openrouter.chat_models import ChatOpenAI
  correct: from langchain_openrouter import ChatOpenAI
- OpenRouterEmbeddings
  from langchain_openrouter import OpenRouterEmbeddings
Quickstart
from langchain_openrouter import ChatOpenAI
llm = ChatOpenAI(model="openai/gpt-4o-mini")
response = llm.invoke("Hello")
print(response.content)
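The quickstart fails fast on the two most common errors above: the package missing from the current environment, or the API key variable unset. A preflight check, sketched here with a hypothetical `preflight` helper (not part of the package), can surface both before the chat model is constructed:

```python
import importlib.util
import os

def preflight(package: str = "langchain_openrouter",
              key_var: str = "OPENAI_API_KEY") -> list[str]:
    """Return a list of setup problems, empty if everything looks OK."""
    problems = []
    # Mirrors the ModuleNotFoundError cause: package not importable here.
    if importlib.util.find_spec(package) is None:
        problems.append(f"'{package}' is not installed; "
                        "run pip install langchain-openrouter")
    # Mirrors the OpenAIError cause: API key variable unset or empty.
    if not os.environ.get(key_var):
        problems.append(f"{key_var} is not set")
    return problems

for problem in preflight():
    print("setup issue:", problem)
```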