OpenAI Integration for LangChain
An integration package connecting OpenAI and LangChain, providing classes for chat models (`ChatOpenAI`) and embeddings (`OpenAIEmbeddings`). It is actively maintained as part of the broader LangChain ecosystem, with minor versions typically every few months and more frequent patch releases.
Warnings
- breaking LangChain v1 introduced a new package structure. Most OpenAI-related classes moved from the monolithic `langchain` package (e.g., `langchain.chat_models.ChatOpenAI`) to the dedicated `langchain-openai` partner package (e.g., `langchain_openai.ChatOpenAI`). Code using the old import paths will break.
- gotcha Compatibility with `langchain-core` is crucial. The `langchain-openai` package frequently bumps its minimum required `langchain-core` version. Running `langchain-openai` against an outdated `langchain-core` can lead to `AttributeError`s or unexpected behavior due to schema drift or missing fields (see the 1.1.12 release notes).
- breaking For Azure OpenAI, `ChatOpenAI` now supports the v1 API directly using `base_url` and `api_key` parameters since `langchain-openai>=1.0.1`. The legacy `AzureChatOpenAI` class is still available but `ChatOpenAI` is the recommended unified approach for the v1 API.
- gotcha When using `OpenAIEmbeddings` against non-OpenAI but OpenAI-compatible APIs (e.g., OpenRouter, Ollama, vLLM via `base_url`), the default `check_embedding_ctx_length=True` tokenizes inputs client-side with tiktoken and sends token IDs, which many such servers reject. Pass `check_embedding_ctx_length=False` to send raw strings instead.
- deprecated When interacting with the OpenAI Responses API, `langchain-openai` now defaults to storing response items in message content. To restore the previous (v0) behavior, users must explicitly opt in.
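For the embeddings gotcha above, a minimal sketch of pointing `OpenAIEmbeddings` at an OpenAI-compatible server; the local URL, model name, and API key are placeholder assumptions (here, a local Ollama instance), not values from this package:

```python
from langchain_openai import OpenAIEmbeddings

# Disable client-side tokenization so raw strings are sent to the server;
# many OpenAI-compatible backends reject pre-tokenized (token ID) input.
embeddings = OpenAIEmbeddings(
    model="nomic-embed-text",              # assumed model name on the local server
    base_url="http://localhost:11434/v1",  # assumed local Ollama endpoint
    api_key="unused",                      # placeholder; local servers often ignore it
    check_embedding_ctx_length=False,
)

vector = embeddings.embed_query("hello world")
print(len(vector))
```

With `check_embedding_ctx_length=False`, no tiktoken tokenization happens client-side, so the server receives plain text it can tokenize itself.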
Install
pip install langchain-openai
Imports
- ChatOpenAI
from langchain_openai import ChatOpenAI
- OpenAIEmbeddings
from langchain_openai import OpenAIEmbeddings
Quickstart
import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage
# Set your OpenAI API key as an environment variable (recommended),
# or pass it directly: ChatOpenAI(api_key="your_api_key")
if not os.environ.get("OPENAI_API_KEY"):
    # For interactive use, prompt for the key instead of hardcoding it
    import getpass
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
# Instantiate the ChatOpenAI model
chat_model = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)
# Invoke the model
response = chat_model.invoke([HumanMessage(content="Tell me a short story about a brave knight.")])
print(response.content)
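Responses can also be streamed token-by-token rather than returned in one piece. A minimal sketch using the standard `.stream()` method, assuming `OPENAI_API_KEY` is already set as in the Quickstart:

```python
from langchain_openai import ChatOpenAI

chat_model = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

# stream() yields message chunks as tokens arrive from the API
for chunk in chat_model.stream("Tell me a short story about a brave knight."):
    print(chunk.content, end="", flush=True)
print()
```

Streaming is useful for interactive applications where showing partial output reduces perceived latency.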