{"id":7355,"library":"langserve","title":"LangServe","description":"LangServe is a Python library that simplifies the deployment of LangChain runnables and agents as REST APIs. It builds on FastAPI to provide a robust server with a built-in playground UI for testing. The library is under active development; version 0.3.3 is the latest release, and it closely tracks its core dependencies, such as `langchain-core`.","status":"active","version":"0.3.3","language":"en","source_language":"en","source_url":"https://github.com/langchain-ai/langserve","tags":["langchain","llm","api","fastapi","deployment","ai-agents"],"install":[{"cmd":"pip install langserve","lang":"bash","label":"Basic Installation"},{"cmd":"pip install \"langserve[server]\" \"langserve[client]\" langchain-openai","lang":"bash","label":"Full Installation (Server, Client, OpenAI LLM)"}],"dependencies":[{"reason":"Essential for defining and composing runnables. LangServe 0.3.3 requires langchain-core >=1.0.0.","package":"langchain-core","optional":false},{"reason":"Used for data validation and schema generation. LangServe 0.3.0+ requires pydantic >=2.0.0.","package":"pydantic","optional":false},{"reason":"Required to run the API server. LangServe integrates with FastAPI to expose runnables as endpoints.","package":"fastapi","optional":true},{"reason":"ASGI server required to run the FastAPI application.","package":"uvicorn","optional":true},{"reason":"Commonly used LLM provider for examples and applications; often installed alongside LangServe.","package":"langchain-openai","optional":true}],"imports":[{"note":"The `add_routes` function is specific to `langserve` for deploying runnables, not a general LangChain callback.","wrong":"from langchain.callbacks.langserve import add_routes","symbol":"add_routes","correct":"from langserve import add_routes"},{"symbol":"FastAPI","correct":"from fastapi import FastAPI"},{"symbol":"ChatOpenAI","correct":"from langchain_openai import ChatOpenAI"},{"symbol":"ChatPromptTemplate","correct":"from langchain_core.prompts import ChatPromptTemplate"},{"note":"With the modularization of LangChain, core components like Runnable are now in `langchain_core`.","wrong":"from langchain.schema.runnable import Runnable","symbol":"Runnable","correct":"from langchain_core.runnables import Runnable"}],"quickstart":{"code":"import os\nfrom fastapi import FastAPI\nfrom langchain_core.prompts import ChatPromptTemplate\nfrom langchain_openai import ChatOpenAI\nfrom langserve import add_routes\nimport uvicorn\n\n# Set your OpenAI API key from environment variable for local testing.\n# In a production environment, use proper secrets management.\nopenai_api_key = os.environ.get('OPENAI_API_KEY', 'sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')\n\napp = FastAPI(\n    title=\"LangChain Server\",\n    version=\"1.0\",\n    description=\"A simple API server using LangChain's Runnable interfaces\",\n)\n\n# A simple runnable (e.g., just an LLM)\nadd_routes(\n    app,\n    ChatOpenAI(api_key=openai_api_key),\n    path=\"/openai\",\n)\n\n# A more complex runnable (e.g., prompt + LLM)\nprompt = ChatPromptTemplate.from_template(\"Tell me a joke about {topic}\")\nmodel = ChatOpenAI(api_key=openai_api_key)\nchain = prompt | model\n\nadd_routes(\n    app,\n    chain,\n    path=\"/joke\",\n)\n\nif __name__ == \"__main__\":\n    # To run this, save it as a Python file (e.g., server.py) and execute:\n    # uvicorn server:app --reload\n    uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n","lang":"python","description":"This quickstart sets up a basic LangServe API server with two endpoints: one for a raw OpenAI Chat model and another for a simple 'joke' chain (prompt + OpenAI Chat model). It uses FastAPI and uvicorn. Ensure `OPENAI_API_KEY` is set in your environment or replace the dummy key. You'll need `fastapi`, `uvicorn`, and `langchain-openai` installed."},"warnings":[{"fix":"Ensure all dependencies requiring Pydantic are compatible with V2. Upgrade your `pydantic` package to `pydantic>=2` and verify other `langchain` ecosystem packages are also updated to their latest versions. Use Pydantic's migration guide if updating your own models.","message":"LangServe v0.3.0 and later upgraded to Pydantic V2. This can cause `TypeError`, `AttributeError`, or `ValidationError` issues if other installed packages (or your own code) are still using Pydantic V1, or if you try to use V1 syntax.","severity":"breaking","affected_versions":">=0.3.0"},{"fix":"Upgrade both `langserve` and `langchain-core` to their latest compatible versions (`pip install --upgrade langserve langchain-core`). For `langserve==0.3.3`, ensure `langchain-core>=1.0.0`.","message":"LangServe v0.3.3 is explicitly compatible with `langchain-core` V1. Older `langserve` versions might not work correctly, leading to `AttributeError` or unexpected behavior when interacting with modern LangChain components.","severity":"breaking","affected_versions":"<0.3.3"},{"fix":"Install them manually using `pip install fastapi uvicorn`.","message":"While LangServe integrates with FastAPI, `fastapi` and `uvicorn` are not strict dependencies of the `langserve` package itself. Running a LangServe API server requires these to be installed separately.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Always use documented public APIs, typically imported from `langserve`, `langchain_core`, or `langchain_openai`. Consult migration guides for major version bumps in the LangChain ecosystem.","message":"Prior to v0.2.0, some internal module structures or import paths might have differed due to rapid LangChain ecosystem evolution. Directly importing internal components from older LangChain or LangServe versions might lead to `ModuleNotFoundError` or `ImportError` after upgrading.","severity":"deprecated","affected_versions":"<0.2.0"},{"fix":"Upgrade to LangServe >=0.3.0 for better Pydantic V2 support, or downgrade Pydantic to V1 (`pip install pydantic==1.*`) if OpenAPI docs are critical for older LangServe versions.","message":"LangServe (especially in versions <= 0.2) might not fully support OpenAPI docs generation when Pydantic V2 is used, though API endpoints and the playground typically function as expected.","severity":"gotcha","affected_versions":"<0.3.0"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Install them manually: `pip install fastapi uvicorn`","cause":"`fastapi` and `uvicorn` are peer dependencies for running a server, but not strict requirements of the `langserve` package itself.","error":"ModuleNotFoundError: No module named 'fastapi' OR ModuleNotFoundError: No module named 'uvicorn'"},{"fix":"Ensure all dependencies requiring Pydantic are compatible with V2. Upgrade your `pydantic` package to `pydantic>=2` and verify other `langchain` ecosystem packages are also updated to their latest versions.","cause":"This indicates a Pydantic version mismatch. LangServe v0.3.0+ requires Pydantic V2, but another dependency might be forcing Pydantic V1, or your code is using V1 syntax.","error":"AttributeError: module 'pydantic' has no attribute 'json_schema' OR TypeError: 'Type[Any]' object is not subscriptable"},{"fix":"Upgrade both `langserve` and `langchain-core` to their latest compatible versions (`pip install --upgrade langserve langchain-core`). For `langserve==0.3.3`, ensure `langchain-core>=1.0.0`.","cause":"Incompatibility between `langserve` and `langchain-core` versions. The `langchain-core` API underwent significant changes (e.g., introduction of `invoke`, `batch`, `stream`).","error":"AttributeError: 'BaseChatModel' object has no attribute 'invoke' OR TypeError: 'RunnableSequence' object is not callable"},{"fix":"Install the specific integration package you need: `pip install langchain-openai` for OpenAI models.","cause":"Language model integrations are often in separate packages (e.g., `langchain-openai`, `langchain-anthropic`) and are not part of `langserve` or `langchain-core`.","error":"ModuleNotFoundError: No module named 'langchain_openai'"}]}