LangGraph API

0.7.98 · active · verified Thu Apr 09

langgraph-api provides a convenient API layer for serving LangGraph agents as RESTful or RPC endpoints. It simplifies the deployment of complex, stateful LLM agents by integrating with FastAPI and Pydantic, making it easy to expose agent functionality over HTTP. It is currently at version 0.7.98 and is part of the rapidly evolving LangChain ecosystem, so expect frequent releases and occasional breaking changes.

Warnings

Install
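Assuming the package is published on PyPI under the same name, a typical install for the quickstart below also pulls in the server and the LLM provider client it uses:

```shell
pip install langgraph-api
# FastAPI server runtime and the OpenAI client used in the quickstart
pip install fastapi uvicorn langchain-openai
```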

Imports
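Based on the quickstart below, the package exposes a single entry point (import fragment, not standalone-runnable):

```python
from langgraph_api import create_langgraph_api
```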

Quickstart

Demonstrates how to create a simple `AgentExecutor` (a LangGraph-compatible runnable) and wrap it with `create_langgraph_api` to expose it as a FastAPI application. This makes the agent accessible via HTTP endpoints such as `/agent/invoke` and `/agent/stream_log`.

import os
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_react_agent
from langchain_core.prompts import PromptTemplate
from langchain_core.tools import tool
from langgraph_api import create_langgraph_api
from fastapi import FastAPI

# Set API key for OpenAI, or any other LLM provider
os.environ["OPENAI_API_KEY"] = os.environ.get("OPENAI_API_KEY", "sk-mock-key-for-test")

# 1. Define a tool
@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers together."""
    return a * b

tools = [multiply]

# 2. Define an LLM
llm = ChatOpenAI(temperature=0)

# 3. Define a prompt template. create_react_agent validates that the
# {tools}, {tool_names}, and {agent_scratchpad} variables are present.
prompt = PromptTemplate.from_template("""
You are a helpful assistant.
Answer the following questions as best you can.
You have access to the following tools:
{tools}
Use a tool by giving its name, one of [{tool_names}].
Question: {input}
{agent_scratchpad}
""")

# 4. Create a React agent
agent = create_react_agent(llm, tools, prompt)

# 5. Create an AgentExecutor (LangGraph runnable)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

# 6. Create the LangGraph API app
app = FastAPI(title="My Agent API")
langgraph_api_app = create_langgraph_api(app, agent_executor, path="/agent")

# To run this application:
# Save the code as 'main.py' (or any other name)
# Execute from your terminal: uvicorn main:app --port 8000 --reload
# Then access via: http://localhost:8000/agent/invoke or /agent/stream_log

print("LangGraph API application created.")
print("Run with: uvicorn main:app --port 8000 --reload")
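Once the server is running, the agent can be called over HTTP. A minimal stdlib client sketch follows; the `/agent/invoke` path comes from the quickstart above, while the LangServe-style payload shape (agent input wrapped under an `"input"` key) is an assumption, so adjust it to match your deployment:

```python
import json
import urllib.request

def build_invoke_request(base_url: str, question: str) -> urllib.request.Request:
    """Build a POST request for the agent's /agent/invoke endpoint."""
    # Payload shape is assumed to follow the LangServe-style convention
    # of wrapping the agent's input under an "input" key.
    payload = {"input": {"input": question}}
    return urllib.request.Request(
        f"{base_url}/agent/invoke",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_invoke_request("http://localhost:8000", "What is 6 times 7?")
print(req.full_url)       # http://localhost:8000/agent/invoke
print(req.get_method())   # POST
# Send it with: urllib.request.urlopen(req) once the server is up.
```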
