OpenAI Integration for LangChain

1.1.12 · active · verified Sat Mar 28

An integration package connecting OpenAI and LangChain, providing classes for chat models (`ChatOpenAI`) and embeddings (`OpenAIEmbeddings`). It is actively maintained as part of the broader LangChain ecosystem, typically with minor versions every few months and more frequent patch releases.

Install

Imports

Quickstart

This quickstart demonstrates how to initialize `ChatOpenAI` and send a basic message. It assumes the `OPENAI_API_KEY` environment variable is set.

import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

# Set your OpenAI API key as an environment variable (recommended),
# or pass it directly: ChatOpenAI(api_key="your_api_key")

# Prompt for the key if it is not already set in the environment
if not os.environ.get("OPENAI_API_KEY"):
    import getpass
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")

# Instantiate the ChatOpenAI model
chat_model = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)

# Invoke the model
response = chat_model.invoke([HumanMessage(content="Tell me a short story about a brave knight.")])

print(response.content)
