OpenAI Integration for LangChain
An integration package connecting OpenAI and LangChain, providing classes for chat models (`ChatOpenAI`) and embeddings (`OpenAIEmbeddings`). It is actively maintained as part of the broader LangChain ecosystem, typically with minor releases every few months and more frequent patch releases.
Common errors
- `ModuleNotFoundError: No module named 'langchain_openai'`
  Cause: The `langchain-openai` integration package is not installed in your Python environment. It is a separate package and is not installed automatically with `langchain` itself.
  Fix: Install it with pip: `pip install langchain-openai`
- `AuthenticationError: Incorrect API key provided: sk-...`
  Cause: The OpenAI API key is missing, incorrect, expired, or not visible in the environment where `langchain-openai` runs. This often happens when the `OPENAI_API_KEY` environment variable is unset or set to an invalid key, or when a key is hardcoded incorrectly.
  Fix: Set the key in an environment variable named `OPENAI_API_KEY` (e.g., `export OPENAI_API_KEY="your_key_here"` in your shell, or in a `.env` file loaded with `python-dotenv`). Alternatively, pass it directly to the constructor: `ChatOpenAI(api_key="your_key_here")` or `OpenAIEmbeddings(api_key="your_key_here")`.
- `ValidationError: 1 validation error for ChatOpenAI __root__` -- `openai` has no `ChatCompletion` attribute (try `pip install --upgrade openai`)
  Cause: An outdated `openai` library (older than 1.0.0) is incompatible with the installed `langchain-openai`; the `openai` library underwent a major refactor in version 1.0.0.
  Fix: Upgrade both packages: `pip install --upgrade openai langchain-openai`.
- `ImportError: cannot import name 'ChatOpenAI' from partially initialized module 'langchain_openai' (most likely due to a circular import)`
  Cause: This usually means a local file named `openai.py` exists in your project directory. Python's import mechanism prioritizes local files, so when `langchain_openai` imports `openai` it gets your file instead of the installed package, producing a circular import.
  Fix: Rename your local `openai.py` to something else (e.g., `my_openai_utils.py`) so it no longer shadows the official `openai` package.
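When the last two errors above are hard to pin down, a quick stdlib check of where Python resolves `openai` from, and which versions are installed, narrows things down. This is a diagnostic sketch; `module_origin` and `installed_version` are illustrative helper names, not part of langchain-openai:

```python
import importlib.util
from importlib.metadata import PackageNotFoundError, version


def module_origin(name: str):
    """Return the file path a module would be imported from, or None."""
    spec = importlib.util.find_spec(name)
    return getattr(spec, "origin", None)


def installed_version(dist: str):
    """Return the installed distribution's version string, or None."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


# If this path points inside your project directory instead of
# site-packages, a local openai.py is shadowing the installed package
# (the circular-import error above).
print("openai resolves to:", module_origin("openai"))

# openai older than 1.0.0 is incompatible with current langchain-openai
# (the ValidationError above).
print("openai version:", installed_version("openai"))
print("langchain-openai version:", installed_version("langchain-openai"))
```

If the printed path is inside your project, rename the shadowing file; if the `openai` version starts with `0.`, upgrade it.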
Warnings
- breaking LangChain v1 introduced a new package structure. Most OpenAI-related classes moved from the monolithic `langchain` package (e.g., `langchain.chat_models.ChatOpenAI`) to the dedicated `langchain-openai` partner package (e.g., `langchain_openai.ChatOpenAI`). Code using old import paths will break.
- gotcha Compatibility with `langchain-core` is crucial: `langchain-openai` frequently bumps its minimum required `langchain-core` version, and running it against an outdated `langchain-core` can produce `AttributeError`s or unexpected behavior from schema drift or missing fields (see the 1.1.12 release notes).
- breaking For Azure OpenAI, as of `langchain-openai>=1.0.1`, `ChatOpenAI` supports the v1 API directly via the `base_url` and `api_key` parameters. The legacy `AzureChatOpenAI` class remains available, but `ChatOpenAI` is the recommended unified approach for the v1 API.
- gotcha When using `OpenAIEmbeddings` with non-OpenAI compatible APIs via `base_url` (e.g., OpenRouter, Ollama, vLLM), the default `check_embedding_ctx_length` behavior might cause errors; passing `check_embedding_ctx_length=False` skips the client-side tokenization step that triggers them.
- deprecated When interacting with the OpenAI Responses API, `langchain-openai` now defaults to storing response items in message content. To restore the previous (v0) behavior, users must explicitly opt in.
- gotcha OpenAI API calls require an API key for authentication. If none is provided, either via the `OPENAI_API_KEY` environment variable or the `api_key` parameter when instantiating `ChatOpenAI` or `OpenAIEmbeddings`, requests fail with an `AuthenticationError`.
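The key lookup described in the warnings (an explicit `api_key` argument wins; otherwise the `OPENAI_API_KEY` environment variable is used) can be sketched with a minimal stdlib helper. `resolve_api_key` is a hypothetical name for illustration, not a langchain-openai function:

```python
import os


def resolve_api_key(explicit_key=None):
    """Return the API key to use: explicit argument first, then OPENAI_API_KEY."""
    key = explicit_key or os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "No API key found: set OPENAI_API_KEY or pass api_key=... explicitly"
        )
    return key


os.environ["OPENAI_API_KEY"] = "sk-env-demo"  # demo value only, not a real key
print(resolve_api_key())               # falls back to the env var: sk-env-demo
print(resolve_api_key("sk-explicit"))  # explicit argument wins: sk-explicit
```

The same precedence is why a stale `OPENAI_API_KEY` in your shell can silently override nothing, but never overrides a key passed in the constructor.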
Install
pip install langchain-openai
Imports
- ChatOpenAI
from langchain_openai import ChatOpenAI  # current import path
from langchain.chat_models import ChatOpenAI  # legacy path, breaks in v1
- OpenAIEmbeddings
from langchain_openai import OpenAIEmbeddings  # current import path
from langchain.embeddings import OpenAIEmbeddings  # legacy path, breaks in v1
Quickstart
import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage
# Set your OpenAI API key as an environment variable (recommended),
# or pass it directly: ChatOpenAI(api_key="your_api_key")
if not os.environ.get("OPENAI_API_KEY"):
    import getpass
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
# Instantiate the ChatOpenAI model
chat_model = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)
# Invoke the model
response = chat_model.invoke([HumanMessage(content="Tell me a short story about a brave knight.")])
print(response.content)