LangGraph Prebuilt Agents

Version 1.0.8 · verified Tue May 12 · auth: no · python install: stale · quickstart: stale

LangGraph Prebuilt is a Python library offering high-level APIs for creating and executing LangGraph agents and tools. It simplifies building complex agentic workflows by providing pre-packaged components and architectures, reducing the need for low-level graph construction. Currently at version 1.0.8, this library is part of the broader LangGraph ecosystem, which undergoes frequent updates, with `langgraph-prebuilt` releases typically aligning with major feature additions to its high-level components. [2, 18]

pip install langgraph-prebuilt langchain-openai tavily-python
error ModuleNotFoundError: No module named 'langgraph.prebuilt'
cause This error most commonly occurs when a Python file in your project is named `langgraph.py`, which shadows the installed `langgraph` package, or when `langgraph-prebuilt` is not properly installed or detected in the environment.
fix
Rename any local file named `langgraph.py` to something else, and delete any stale compiled copy in `__pycache__`, so it no longer shadows the installed package. Ensure `langgraph` and `langgraph-prebuilt` are installed in your active virtual environment: `pip install langgraph langgraph-prebuilt`. If issues persist, create a fresh virtual environment.
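A stdlib-only way to diagnose the shadowing case is to ask where `import langgraph` would actually resolve from. This sketch (the helper names are illustrative, not part of any library) works for any module name:

```python
import importlib.util
from pathlib import Path

def locate_module(name: str):
    """Return the file that `import name` would load, or None if unresolvable."""
    spec = importlib.util.find_spec(name)
    return Path(spec.origin) if spec and spec.origin else None

def is_shadowed_by_cwd(name: str) -> bool:
    """True when the module resolves to a file in the current working
    directory -- the classic symptom of a local `langgraph.py` shadowing
    the installed package."""
    path = locate_module(name)
    return path is not None and path.parent == Path.cwd()

# Diagnose where `import langgraph` would come from:
# print(locate_module("langgraph"))        # a site-packages path when healthy
# print(is_shadowed_by_cwd("langgraph"))   # True means a local file shadows it
```

If `locate_module` returns `None`, the package is simply not installed in the active environment.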
error ImportError: cannot import name 'create_react_agent' from 'langgraph.prebuilt'
cause This indicates that the `create_react_agent` function (or other specific components) cannot be found within `langgraph.prebuilt`, often due to a version mismatch between `langgraph` and `langgraph-prebuilt`, or an outdated installation.
fix
Ensure both `langgraph` and `langgraph-prebuilt` are updated to their latest compatible versions. It is recommended to install or upgrade them together: `pip install --upgrade langgraph langgraph-prebuilt`. If the problem persists, confirm the function's availability in that version's documentation.
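Before upgrading blindly, it can help to see exactly which versions of the two packages are installed. A minimal stdlib sketch using `importlib.metadata` (the `installed_version` helper is illustrative):

```python
from importlib import metadata

def installed_version(dist: str):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return metadata.version(dist)
    except metadata.PackageNotFoundError:
        return None

# Compare the two versions side by side to spot a mismatch:
for dist in ("langgraph", "langgraph-prebuilt"):
    print(dist, "->", installed_version(dist) or "not installed")
```

Note that `metadata.version` takes the distribution name as published on PyPI (with hyphens), not the importable module name.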
error ImportError: cannot import name 'ServerInfo' from 'langgraph.runtime'
cause `langgraph-prebuilt` imports components like `ServerInfo` from `langgraph.runtime`, and this error occurs when your installed `langgraph` version is older and does not export these required names, indicating an incompatibility.
fix
Upgrade your `langgraph` package to a version compatible with your `langgraph-prebuilt` installation, or update both packages simultaneously: `pip install --upgrade langgraph langgraph-prebuilt`. Pinning compatible versions together is also a solution (e.g., `langgraph==1.1.x` for `langgraph-prebuilt==1.0.x`).
error AttributeError: 'HuggingFaceEndpoint' object has no attribute 'bind_tools'
cause This error arises when using `langgraph-prebuilt` agents (like `create_react_agent`) with LLM models that do not support the `bind_tools` method, which is a required interface for tool calling within LangGraph agents.
fix
Ensure the LLM you are using supports tool calling and implements the `bind_tools` method. If your current LLM does not, switch to a compatible model (e.g., OpenAI's Chat models, Anthropic's Claude 3, or other models with explicit tool-calling support) or integrate it via a custom wrapper that exposes `bind_tools`.
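Since the `AttributeError` only surfaces once the agent tries to bind tools, a fail-fast duck-typing check before constructing the agent gives a clearer error. This is an illustrative sketch; the guard function and the stand-in classes are not part of LangChain or LangGraph:

```python
def assert_tool_calling_support(model) -> None:
    """Fail fast with a clear message instead of a late AttributeError."""
    if not callable(getattr(model, "bind_tools", None)):
        raise TypeError(
            f"{type(model).__name__} does not implement bind_tools(); "
            "use a tool-calling capable chat model instead."
        )

# Stand-in classes to illustrate the check (not real LangChain models):
class PlainLLM:
    pass

class ToolCallingLLM:
    def bind_tools(self, tools):
        return self

assert_tool_calling_support(ToolCallingLLM())  # passes silently
# assert_tool_calling_support(PlainLLM())      # would raise TypeError
```

In a real script you would call `assert_tool_calling_support(llm)` immediately after constructing the model, before `create_react_agent`.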
gotcha LangGraph is a low-level framework, while `langgraph-prebuilt` offers higher-level abstractions. Mixing low-level graph construction (e.g., `StateGraph`, `add_node`) with prebuilt agents (like `create_react_agent`) without understanding LangGraph's state management can lead to unexpected behavior or difficult-to-debug issues. Stick to one paradigm or thoroughly understand how state is managed across both. [3, 5, 6]
fix For complex customizations, consider extending or modifying the prebuilt agent's internal graph if the documentation provides a clear path, or build a custom graph from scratch using `langgraph.graph.StateGraph` for full control. For simpler cases, leverage the provided prebuilt agent parameters and hooks. Consult the official LangGraph documentation for advanced patterns. [5, 8, 9]
gotcha Prebuilt agents often rely on external services (LLMs, search tools). Failing to set required API keys as environment variables (e.g., `OPENAI_API_KEY`, `TAVILY_API_KEY`) or passing incorrect values will cause agents to fail silently or with authentication errors. [1, 6, 13]
fix Always verify that all necessary API keys are correctly set in your environment or explicitly passed to the respective LLM/tool constructors before running the agent. Test credentials independently if an agent is failing. Use a `.env` file and `python-dotenv` for local development. [1, 6]
deprecated The LangGraph ecosystem, including `langgraph-prebuilt`, is actively developed. Older patterns for agents or tool integration, while functional, might be superseded by more efficient, flexible, or recommended approaches in newer releases. [7]
fix Regularly consult the official LangChain/LangGraph documentation and quickstarts for the latest recommended practices and agent architectures. Pay attention to release notes for potential API changes or new high-level abstractions. [1, 7, 10]
gotcha The `langchain_community` package, which provides various tools and utilities (e.g., `TavilySearchResults`), is a common dependency for agents utilizing external services. If it's not explicitly installed, importing modules from it will result in a `ModuleNotFoundError`.
fix Ensure `langchain_community` is listed in your project's `requirements.txt` or explicitly installed via `pip install langchain-community` in your environment before running agents that rely on it. Verify your environment setup, especially in containerized or CI/CD contexts, to ensure all necessary dependencies are installed.
breaking A `ModuleNotFoundError` for packages like `langchain_community` indicates that required dependencies are not installed in the execution environment. Prebuilt agents and tools often rely on specific packages that must be explicitly added.
fix Ensure all necessary Python packages are installed in the environment. This typically involves running `pip install <package_name>` for each required package, or using `pip install -r requirements.txt` if a `requirements.txt` file is present. Verify that the correct virtual environment is activated if applicable.
python  os / libc      status  install  import  disk
3.9     alpine (musl)  wheel   -        -       86.2M
3.9     alpine (musl)  -       -        -       -
3.9     slim (glibc)   wheel   11.9s    -       94M
3.9     slim (glibc)   -       -        -       -
3.10    alpine (musl)  wheel   -        -       89.9M
3.10    alpine (musl)  -       -        -       -
3.10    slim (glibc)   wheel   10.6s    -       98M
3.10    slim (glibc)   -       -        -       -
3.11    alpine (musl)  wheel   -        -       96.9M
3.11    alpine (musl)  -       -        -       -
3.11    slim (glibc)   wheel   9.4s     -       105M
3.11    slim (glibc)   -       -        -       -
3.12    alpine (musl)  wheel   -        -       87.5M
3.12    alpine (musl)  -       -        -       -
3.12    slim (glibc)   wheel   7.7s     -       95M
3.12    slim (glibc)   -       -        -       -
3.13    alpine (musl)  wheel   -        -       87.2M
3.13    alpine (musl)  -       -        -       -
3.13    slim (glibc)   wheel   7.6s     -       95M
3.13    slim (glibc)   -       -        -       -

This quickstart demonstrates how to set up and use `create_react_agent` from `langgraph-prebuilt`. It initializes an OpenAI LLM, a Tavily search tool, and then creates an agent capable of using these tools. The agent is then invoked with a query, showcasing its ability to use tools for information retrieval and simple calculations. Ensure `OPENAI_API_KEY` and `TAVILY_API_KEY` are set in your environment for the example to run. [8, 13]

import os
from langchain_openai import ChatOpenAI
from langchain_community.tools.tavily_search import TavilySearchResults
from langgraph.prebuilt import create_react_agent
# Set API keys (replace with actual keys or set as environment variables)
os.environ["OPENAI_API_KEY"] = os.environ.get("OPENAI_API_KEY", "YOUR_OPENAI_API_KEY")
os.environ["TAVILY_API_KEY"] = os.environ.get("TAVILY_API_KEY", "YOUR_TAVILY_API_KEY")

# Note: no custom state schema is needed here; create_react_agent
# manages its own message-based state internally.

# Initialize LLM and tools
llm = ChatOpenAI(model="gpt-4o-mini") # or any other tool-calling capable LLM
search_tool = TavilySearchResults(max_results=3)
tools = [search_tool]

# Create the prebuilt ReAct agent
agent_executor = create_react_agent(llm, tools=tools)

# Example usage
# Ensure OPENAI_API_KEY and TAVILY_API_KEY are set
if os.environ["OPENAI_API_KEY"] == "YOUR_OPENAI_API_KEY" or os.environ["TAVILY_API_KEY"] == "YOUR_TAVILY_API_KEY":
    print("Please set OPENAI_API_KEY and TAVILY_API_KEY environment variables or replace placeholders.")
else:
    response = agent_executor.invoke(
        {"messages": [("human", "What is the weather in London and what is 123 + 456?")]}
    )
    print(response["messages"][-1].content)