FlowLLM: Simplifying LLM-based HTTP/MCP Service Development

0.2.0.10 · active · verified Thu Apr 16

FlowLLM is a Python library for building LLM-based HTTP and MCP (Model Context Protocol) services. It provides a structured way to define and manage LLM workflows through `Flow` and `Step` components, so developers can quickly build and deploy AI-powered APIs. The library is actively maintained, with frequent minor releases in its `0.2.x` series.

Common errors

Warnings

Install
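Assuming the package is published on PyPI under the name `flowllm` (inferred from the import path in the quickstart, not confirmed by this page), installation would typically be:

```shell
pip install flowllm
```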

Imports
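The quickstart below relies on the following imports; the `flowllm.models` module providing `ChatInput` and `ChatResponse` is taken from the example itself:

```python
import os

from flowllm import Flow, Step, run_flow_server
from flowllm.models import ChatInput, ChatResponse
```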

Quickstart

This quickstart defines a simple LLM chat flow using `Flow` and `Step` components, then runs it as an HTTP service using `run_flow_server`. Ensure `OPENAI_API_KEY` is set in your environment for LLM interactions.

import os
from flowllm import Flow, Step, run_flow_server
from flowllm.models import ChatInput, ChatResponse

# Define your LLM flow
class MyChatFlow(Flow):
    def __init__(self):
        super().__init__(
            name="my_chat_flow",
            version="1.0.0",
            description="A simple chat flow.",
            input_model=ChatInput,
            output_model=ChatResponse,
        )
        self.add_step(
            Step(
                name="chat_step",
                prompt="You are a helpful AI assistant. User message: {{input.message}}",
                output_key="response",
            )
        )

    def process(self, input_data: ChatInput, context: dict) -> ChatResponse:
        # "chat_step" stored its raw LLM completion in context under its output_key.
        response_text = context["response"].choices[0].message.content
        return ChatResponse(response=response_text)

# Initialize and run the server
if __name__ == "__main__":
    # FlowLLM reads the OpenAI API key from the environment; warn early if it is missing.
    if not os.environ.get("OPENAI_API_KEY"):
        print("Warning: OPENAI_API_KEY not set. LLM calls may fail.")

    flow = MyChatFlow()
    print("Starting FlowLLM server on http://0.0.0.0:8000")
    run_flow_server(flow, host="0.0.0.0", port=8000)
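The `{{input.message}}` placeholder in the Step prompt suggests double-brace templating resolved against the flow's input model. As a rough illustration of that substitution style (a simplified sketch, not FlowLLM's actual renderer), the lookup could work like this:

```python
import re

def render_prompt(template: str, variables: dict) -> str:
    """Replace {{dotted.path}} placeholders with values from a nested dict."""
    def lookup(match: re.Match) -> str:
        value = variables
        for part in match.group(1).split("."):
            value = value[part]  # walk one level per dotted segment
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

prompt = render_prompt(
    "You are a helpful AI assistant. User message: {{input.message}}",
    {"input": {"message": "Hello!"}},
)
# prompt == "You are a helpful AI assistant. User message: Hello!"
```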
