LangChain Hub Client
LangChain Hub (PyPI package `langchainhub`) is an API client for the LangChain Hub, a centralized platform for sharing and discovering high-quality artifacts such as prompts, chains, and agents for building LLM applications. It lets users upload, browse, retrieve, and manage these components. The current version is 0.1.21. It is actively developed as part of the broader LangChain ecosystem, with releases typically tied to LangChain updates.
Warnings
- breaking In LangChain v0.3+, the `hub` module for programmatic prompt management was refactored. Use direct imports such as `from langchain import hub` or `from langchain.prompts import load_prompt`; older patterns may require installing `langchain-classic`.
- breaking The `initialize_agent()` function was removed in LangChain v0.3. When loading agents whose prompts come from LangChain Hub, use factory constructors such as `create_react_agent` or another agent-specific constructor instead.
- gotcha LangChain Hub artifacts (prompts, chains, agents) are referenced using `lc://` URIs or `owner/repo` handles. A misspelled or incorrect path in these references raises a loading error at pull time.
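Because a bad reference only fails when the pull is attempted, it can help to sanity-check references up front. The helper below is a hypothetical convenience (not part of `langchainhub`) that checks whether a string looks like an `owner/repo` handle or an `lc://` URI before you try to load it:

```python
import re

# Hypothetical helper, not part of langchainhub: fail fast on
# obviously malformed Hub references before attempting a pull.
HANDLE_RE = re.compile(r"^[\w.-]+/[\w.-]+(?::[\w.-]+)?$")

def looks_like_hub_ref(ref: str) -> bool:
    """Return True for 'owner/repo[:commit]' handles or non-empty 'lc://' URIs."""
    if ref.startswith("lc://"):
        return len(ref) > len("lc://")
    return bool(HANDLE_RE.match(ref))

print(looks_like_hub_ref("hwchase17/react"))         # True
print(looks_like_hub_ref("lc://prompts/qa/prompt"))  # True
print(looks_like_hub_ref("hwchase17 react"))         # False
```

This only catches shape errors (spaces, missing slashes), not typos in an otherwise well-formed handle; those still surface as loading errors from the Hub.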
Install
pip install langchainhub langchain
Imports
- hub
from langchain import hub
- load_prompt
from langchain.prompts import load_prompt
Quickstart
from langchain import hub
from langchain_openai import ChatOpenAI
# Requires OPENAI_API_KEY to be set in your environment
# Pull a prompt from the LangChain Hub
prompt = hub.pull("hwchase17/q-and-a-prompt")
# Initialize a chat model (reads OPENAI_API_KEY from the environment;
# requires langchain-openai to be installed)
llm = ChatOpenAI()
# Create a simple chain
chain = prompt | llm
# Invoke the chain
response = chain.invoke({"question": "What is the capital of France?", "context": "Paris is the capital of France."})
print(response.content)