LangGraph Supervisor

0.0.31 · active · verified Mon Apr 13

LangGraph Supervisor is a Python library that simplifies building hierarchical multi-agent systems using LangGraph. It provides a central supervisor agent responsible for orchestrating specialized worker agents, managing communication flow, and delegating tasks. The library is currently in version 0.0.31 and receives frequent updates, indicating active development.

Install
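A minimal install sketch, assuming the package names as published on PyPI (`langgraph-supervisor` for this library, plus `langchain-openai` for the `ChatOpenAI` model used in the quickstart):

```shell
pip install langgraph-supervisor langchain-openai
```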

Imports
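The quickstart below relies on these imports: `create_supervisor` from this library, and `create_react_agent` from langgraph's prebuilt agents:

```python
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from langgraph_supervisor import create_supervisor
```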

Quickstart

This quickstart demonstrates how to create two specialized agents (a math expert and a research expert) and then orchestrate them using `create_supervisor`. The supervisor uses an LLM to decide which agent to hand off tasks to based on the user's input.

import os
from langchain_openai import ChatOpenAI
from langgraph_supervisor import create_supervisor
from langgraph.prebuilt import create_react_agent

# Requires the OPENAI_API_KEY environment variable to be set
assert os.environ.get("OPENAI_API_KEY"), "Set OPENAI_API_KEY before running"

# Initialize the LLM for agents and supervisor
model = ChatOpenAI(model="gpt-4o")

# Define simple tools
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

def web_search(query: str) -> str:
    """Search the web for information."""
    # Placeholder for actual web search functionality
    return f"Found results for '{query}': Example search data."

# Create specialized agents
math_agent = create_react_agent(
    model=model,
    tools=[add],
    name="math_expert",
)

research_agent = create_react_agent(
    model=model,
    tools=[web_search],
    name="research_expert",
)

# Create supervisor workflow
# The prompt parameter defines the supervisor's role and how to delegate.
workflow = create_supervisor(
    [research_agent, math_agent],
    model=model,
    prompt=(
        "You are a team supervisor managing a research expert and a math expert. "
        "For research tasks, use research_agent. "
        "For math tasks, use math_agent."
    ),
)

# To add memory for multi-turn conversations, pass a checkpointer to workflow.compile().
# For this quickstart, we compile without one and run a single turn.
app = workflow.compile()

# Example invocation
result = app.invoke({
    "messages": [
        {
            "role": "user",
            "content": "What is 10 + 5 and what's the capital of France?"
        }
    ]
})

print(result["messages"][-1].content)
