LangGraph API
langgraph-api provides a convenient API layer for serving LangGraph agents as RESTful or RPC endpoints. It simplifies deploying complex, stateful LLM agents by integrating with FastAPI and Pydantic, making it easy to expose agent functionality over HTTP. It is currently at version 0.7.98 and sits in the rapidly evolving LangChain ecosystem, so expect frequent updates.
Warnings
- breaking The library is in active development (currently in `0.x.x` versions) and does not strictly adhere to semantic versioning. Breaking changes can occur in minor or patch releases, so pin an exact version (e.g. `langgraph-api==0.7.98`) in production and review release notes before upgrading.
- gotcha By default, `langgraph-api` manages agent state per request/session. Without careful configuration (e.g., providing a custom `state_getter`), concurrent users or requests might not have isolated state or might interact with the agent's state in unexpected ways, leading to cross-talk or incorrect responses.
- gotcha The API layer itself does not provide built-in authentication or authorization mechanisms. Deploying a `langgraph-api` endpoint without external security measures can expose your agent to unauthorized access.
- gotcha `create_langgraph_api` is specifically designed to wrap `BaseAgentExecutor` or other LangGraph runnables that conform to a particular input/output schema. Attempting to wrap generic LangChain runnables or custom graphs that do not meet these expectations might lead to runtime errors or unexpected behavior.
Install
- pip install langgraph-api
- pip install langchain-openai
Imports
- create_langgraph_api
from langgraph_api import create_langgraph_api
- LangGraphAPI
from langgraph_api import LangGraphAPI
Quickstart
import os
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_react_agent
from langchain_core.prompts import PromptTemplate
from langchain_core.tools import tool
from langgraph_api import create_langgraph_api
from fastapi import FastAPI
# Set API key for OpenAI, or any other LLM provider
os.environ["OPENAI_API_KEY"] = os.environ.get("OPENAI_API_KEY", "sk-mock-key-for-test")
# 1. Define a tool
@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers together."""
    return a * b

tools = [multiply]
# 2. Define an LLM
llm = ChatOpenAI(temperature=0)
# 3. Define a prompt template. A ReAct prompt must expose the {tools},
# {tool_names}, {input}, and {agent_scratchpad} variables, or
# create_react_agent will raise a validation error.
prompt = PromptTemplate.from_template("""
You are a helpful assistant.
Answer the following questions as best you can.
You have access to the following tools:
{tools}
The tool names are: {tool_names}

Question: {input}
{agent_scratchpad}
""")
# 4. Create a React agent
agent = create_react_agent(llm, tools, prompt)
# 5. Create an AgentExecutor (LangGraph runnable)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
# 6. Create the LangGraph API app
app = FastAPI(title="My Agent API")
langgraph_api_app = create_langgraph_api(app, agent_executor, path="/agent")
# To run this application:
#   1. Save the code as 'main.py' (or any other name)
#   2. Execute from your terminal: uvicorn main:app --port 8000 --reload
#   3. Access via http://localhost:8000/agent/invoke or /agent/stream_log
print("LangGraph API application created. Run with: uvicorn main:app --port 8000 --reload")
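Once the server is running, the `/agent/invoke` endpoint can be exercised from any HTTP client. A sketch using only the standard library; the `{"input": {"input": ...}}` payload shape mirrors `AgentExecutor`'s input schema and the LangServe-style invoke convention, which is an assumption here, as is the port:

```python
import json
import urllib.request

def build_invoke_payload(question: str) -> dict:
    """Wrap a question in the request body assumed for /agent/invoke.

    Adjust the shape if your agent's input schema differs.
    """
    return {"input": {"input": question}}

def invoke_agent(question: str, base_url: str = "http://localhost:8000") -> dict:
    """POST the question to the running API and return the decoded JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/agent/invoke",
        data=json.dumps(build_invoke_payload(question)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(invoke_agent("What is 6 times 7?"))
```

For streamed output, point the same request at `/agent/stream_log` and read the response incrementally instead of decoding it in one shot.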