Magentic
0.41.1 · verified Tue May 12 · auth: no · python install: verified · quickstart: verified
Decorator-based library for seamlessly integrating LLMs as Python functions. Uses @prompt, @chatprompt, and @prompt_chain decorators to turn Python function signatures into LLM calls with typed structured output. Built on pydantic for output validation. Current version: 0.41.1 (Mar 2026). Still pre-1.0 — API may change. Default backend: OpenAI.
pip install magentic

Common errors
error openai.AuthenticationError: Incorrect API key provided
cause Magentic uses OpenAI as its default LLM provider. This error occurs when the OPENAI_API_KEY environment variable is missing, expired, or invalid.
fix Set your OpenAI API key as an environment variable before running your application: export OPENAI_API_KEY='sk-your-api-key' (or set OPENAI_API_KEY=sk-your-api-key on Windows). You can generate or verify your key on the OpenAI platform dashboard.

error ValueError: String was returned by model but not expected. You may need to update your prompt to encourage the model to return a specific type.
cause This error happens when magentic expects the LLM to return a specific Python type (often a Pydantic model), as declared by the function's return annotation, but the LLM returns a plain string that cannot be parsed into that type.
fix Adjust your @prompt or @chatprompt template to guide the LLM more explicitly toward the expected structured format: add instructions to the prompt, provide examples in a chatprompt, or set max_retries on the decorator so magentic re-prompts the LLM with the validation error as feedback.

error TypeError: Last message is not a function call.
cause This error occurs when you call chat.exec_function_call() on a Chat object, but the most recent message from the LLM in the conversation history is not a function call.
fix Review your prompt to confirm it encourages function calls when appropriate, or inspect the Chat object's message history before calling exec_function_call() to confirm the last message actually contains a FunctionCall.

error ModuleNotFoundError: No module named 'anthropic'
cause This error arises when you configure magentic to use the Anthropic backend (e.g., AnthropicChatModel) but the anthropic Python package is not installed in your environment.
fix Install the anthropic package with pip: pip install anthropic. Better, install magentic with the extra so compatible versions are pulled in together: pip install 'magentic[anthropic]'.

Warnings
gotcha The function body of @prompt-decorated functions is NEVER executed. Any real code in the body is dead code that will never run.
fix Always use ... (ellipsis) as the function body for @prompt, @chatprompt, and @prompt_chain decorated functions.
gotcha When functions= is passed to @prompt, the LLM may return a FunctionCall object instead of the final result. You must call the FunctionCall to execute it, or use @prompt_chain to auto-resolve.
fix result = describe_weather('Boston')() (note the extra () that executes the FunctionCall), or use @prompt_chain for automatic resolution.
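The double-call pattern is easier to remember if you picture the returned object as a frozen function invocation: the LLM has chosen the function and its arguments, but nothing runs until you call it. A stdlib-only sketch using functools.partial as a rough stand-in (not magentic's actual FunctionCall class):

```python
from functools import partial

def get_weather(city: str) -> str:
    return f'Sunny in {city}'

# Roughly what the decorator hands back when the LLM picks get_weather('Boston'):
# a callable holding the function plus its arguments, not yet executed.
fn_call = partial(get_weather, 'Boston')

print(fn_call)      # a callable object, not the weather
result = fn_call()  # the extra () executes the underlying function
print(result)       # Sunny in Boston
```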
gotcha The default model is gpt-4o-mini (OpenAI), which requires OPENAI_API_KEY. A missing key does not fail at decoration time; AuthenticationError is raised only when the function is first called.
fix Set the OPENAI_API_KEY env var, or pass model= explicitly: @prompt('...', model=OpenaiChatModel('gpt-4o'))
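Because the missing key only surfaces at call time, a fail-fast check at startup saves a confusing AuthenticationError later. A stdlib-only helper along these lines (the helper name and the 'sk-test' value are illustrative, not part of magentic):

```python
import os

def require_api_key(env=os.environ) -> str:
    """Raise early, with a clear message, if the key is absent or blank."""
    key = env.get('OPENAI_API_KEY', '').strip()
    if not key:
        raise RuntimeError(
            'OPENAI_API_KEY is not set; magentic would raise '
            'openai.AuthenticationError only at first call time.'
        )
    return key

# Demonstrated with a fake environment mapping:
print(require_api_key({'OPENAI_API_KEY': 'sk-test'}))  # sk-test
```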
gotcha Still pre-1.0 (0.41.x). API stability is not guaranteed; minor versions may introduce breaking changes.
fix Pin the version in production: pip install magentic==0.41.1
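In a project that installs from a requirements file, the pin belongs there rather than in the install command. A minimal fragment (the surrounding entries are placeholders, not requirements of magentic):

```text
# requirements.txt: pin pre-1.0 libraries exactly
magentic==0.41.1
```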
gotcha The Anthropic and LiteLLM backends require separate extras; pip install magentic alone includes only the OpenAI backend.
fix pip install 'magentic[anthropic]' or pip install 'magentic[litellm]'
breaking magentic requires Python >= 3.10; installation fails on older Python versions.
fix Upgrade Python to 3.10 or newer.
Install

pip install 'magentic[anthropic]'
pip install 'magentic[litellm]'

Install compatibility verified, last tested: 2026-05-12
python  os / libc      variant    status  wheel  install  disk
3.10    alpine (musl)  anthropic  -       -      2.16s    55.7M
3.10    alpine (musl)  litellm    -       -      2.39s    212.9M
3.10    alpine (musl)  magentic   -       -      2.21s    49.1M
3.10    slim (glibc)   anthropic  -       -      1.61s    55M
3.10    slim (glibc)   litellm    -       -      1.81s    196M
3.10    slim (glibc)   magentic   -       -      1.58s    48M
3.11    alpine (musl)  anthropic  -       -      2.83s    60.2M
3.11    alpine (musl)  litellm    -       -      3.10s    229.8M
3.11    alpine (musl)  magentic   -       -      2.90s    53.1M
3.11    slim (glibc)   anthropic  -       -      2.42s    59M
3.11    slim (glibc)   litellm    -       -      2.67s    213M
3.11    slim (glibc)   magentic   -       -      2.44s    52M
3.12    alpine (musl)  anthropic  -       -      2.63s    51.3M
3.12    alpine (musl)  litellm    -       -      2.85s    218.3M
3.12    alpine (musl)  magentic   -       -      2.67s    44.3M
3.12    slim (glibc)   anthropic  -       -      2.62s    50M
3.12    slim (glibc)   litellm    -       -      2.98s    201M
3.12    slim (glibc)   magentic   -       -      2.60s    44M
3.13    alpine (musl)  anthropic  -       -      2.49s    51.0M
3.13    alpine (musl)  litellm    -       -      2.79s    218.0M
3.13    alpine (musl)  magentic   -       -      2.52s    44.0M
3.13    slim (glibc)   anthropic  -       -      2.57s    50M
3.13    slim (glibc)   litellm    -       -      2.91s    201M
3.13    slim (glibc)   magentic   -       -      2.53s    43M
3.9     alpine (musl)  anthropic  -       -      -        -
3.9     alpine (musl)  litellm    -       -      -        -
3.9     alpine (musl)  magentic   -       -      -        -
3.9     slim (glibc)   anthropic  -       -      -        -
3.9     slim (glibc)   litellm    -       -      -        -
3.9     slim (glibc)   magentic   -       -      -        -
Imports
- prompt

wrong:

from magentic import prompt

@prompt('Create a superhero named {name}.')
def create_superhero(name: str) -> Superhero:
    return Superhero(name=name, power='Unknown')  # body is ignored

correct:

from magentic import prompt
from pydantic import BaseModel

class Superhero(BaseModel):
    name: str
    power: str

@prompt('Create a superhero named {name}.')
def create_superhero(name: str) -> Superhero:
    ...  # body is never executed; must be ellipsis

hero = create_superhero('Garden Man')
print(hero.name, hero.power)

- FunctionCall

wrong:

result = describe_weather('Boston')
print(result)  # prints a FunctionCall object, not the weather

correct:

from magentic import prompt, FunctionCall

def get_weather(city: str) -> str:
    return f'Sunny in {city}'

@prompt(
    'What is the weather in {city}?',
    functions=[get_weather],
)
def describe_weather(city: str) -> FunctionCall[str]:
    ...  # returns a FunctionCall object; must be called to execute

fn_call = describe_weather('Boston')
result = fn_call()  # actually calls get_weather
print(result)

- chatprompt

wrong:

from magentic import prompt

@prompt('System: You are helpful.\nUser: {question}')
def answer(question: str) -> str:
    ...

correct:

from magentic import chatprompt, SystemMessage, UserMessage

@chatprompt(
    SystemMessage('You are a helpful assistant.'),
    UserMessage('{question}'),
)
def answer(question: str) -> str:
    ...

print(answer('What is Python?'))
Quickstart verified, last tested: 2026-04-23
# pip install magentic
from magentic import prompt
from pydantic import BaseModel
class Superhero(BaseModel):
name: str
power: str
enemies: list[str]
@prompt('Create a superhero named {name}.')
def create_superhero(name: str) -> Superhero:
... # never executed
hero = create_superhero('Garden Man')
print(hero.name) # 'Garden Man'
print(hero.power) # 'Control over plants'
print(hero.enemies) # ['Pollution Man', ...]
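Under the hood, the return annotation drives parsing: magentic asks the model for output matching the schema and validates it with pydantic. The mechanism can be sketched with the stdlib alone, using a dataclass in place of the pydantic model and a hard-coded string in place of the LLM reply; parse_reply below is an illustration, not magentic's API:

```python
import json
from dataclasses import dataclass, fields

@dataclass
class Superhero:
    name: str
    power: str
    enemies: list

def parse_reply(raw: str) -> Superhero:
    """Validate the model's JSON reply against the declared fields."""
    data = json.loads(raw)
    expected = {f.name for f in fields(Superhero)}
    if set(data) != expected:
        # Roughly the situation behind magentic's 'String was returned
        # by model but not expected' family of errors: the reply does
        # not fit the declared return type.
        raise ValueError(f'reply keys {set(data)} != schema keys {expected}')
    return Superhero(**data)

raw = '{"name": "Garden Man", "power": "Control over plants", "enemies": ["Pollution Man"]}'
hero = parse_reply(raw)
print(hero.name)     # Garden Man
print(hero.enemies)  # ['Pollution Man']
```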