Mirascope
2.4.0 · verified Tue May 12 · auth: no · python install: verified · quickstart: verified
LLM toolkit for Python focused on type safety and developer experience. Self-described as 'the LLM anti-framework': minimal abstractions over LLM provider APIs. Current version: 2.4.0 (Mar 2026). Three major API surfaces exist: v0 (class-based, deprecated), v1 (provider-specific decorators from mirascope.core), and v2 (unified llm module). LLMs trained before 2025 tend to generate v1 patterns, which still work, but v2 is the current API.
pip install mirascope

Common errors
error llm.AuthenticationError: Invalid or missing API key ↓
cause The required API key for the specified LLM provider (e.g., OpenAI, Anthropic, Google) is either not set in the environment variables or is incorrect.
fix Set the appropriate environment variable for your provider (e.g. export OPENAI_API_KEY='your_key_here', export ANTHROPIC_API_KEY='your_key_here', or export GOOGLE_API_KEY='your_key_here'). Ensure the key is valid and has the necessary permissions.
error ModuleNotFoundError: No module named 'mirascope.core' ↓
cause This error often occurs when trying to use older v1 API patterns (e.g., `from mirascope.core import openai`) while the current recommended API surface is `mirascope.llm` (v2), or when the `mirascope.core` module is not correctly installed/available in the environment.
fix Update your imports to the v2 API: from mirascope import llm, then decorate calls with @llm.call('provider/model') strings (e.g. @llm.call('openai/gpt-4o-mini')). If using a specific provider's legacy decorator, ensure mirascope[provider] is installed and verify the correct import path for your Mirascope version.
error llm.ParseError: Failed to parse structured response ↓
cause Mirascope failed to parse the LLM's response into the expected structured format (e.g., a Pydantic `response_model` or JSON). This often happens when the LLM's output does not conform to the schema or expected JSON format.
fix Review the format= (v2; response_model or json_mode in v1) configuration to ensure it matches the expected output schema. Adjust your prompt to guide the LLM to produce output strictly adhering to the specified schema or JSON format. Inspect the raw LLM output via response.text() (v2) or response.content (v1).
error AttributeError: 'NoneType' object has no attribute 'content' ↓
cause This error indicates that an LLM call likely failed or returned an unexpected `None` value, and subsequent code attempted to access an attribute (like `content` or `text()`) on this `None` object. This can stem from underlying `llm.Error` exceptions (e.g., `ProviderError`, `ConnectionError`, `TimeoutError`) that were not caught or handled.
fix Implement robust error handling with try...except blocks that catch Mirascope's unified llm.Error exceptions (e.g. llm.ProviderError, llm.ConnectionError, llm.TimeoutError). Inspect the raw response object before accessing attributes, or add None checks to prevent attribute access on an invalid object.

Warnings
breaking v2 replaced provider-specific imports with unified 'from mirascope import llm'. Model strings now use 'provider/model' format (e.g. 'openai/gpt-4o-mini', 'anthropic/claude-sonnet-4-5'). ↓
fix Replace 'from mirascope.core import openai; @openai.call(model)' with 'from mirascope import llm; @llm.call("openai/model")'.
breaking Response access changed in v2. v1 used response.content, v2 uses response.text(). response_model= renamed to format=. .parse() method added for structured output. ↓
fix response.text() not response.content. format=Book not response_model=Book. response.parse() for structured output.
breaking v0 class-based approach (OpenAIChat, AnthropicChat classes) completely removed in v1+. All tutorials using class instantiation are broken. ↓
fix Use decorator-based approach. See migration guide at mirascope.com/docs/mirascope/getting-started/migration
gotcha v1 (mirascope.core provider-specific imports) still works in v2 but is not the current API. LLMs trained on 2024 data will generate v1 patterns — functional but not idiomatic. ↓
fix v1 patterns work but migrate to llm module for new code.
gotcha Provider extras must be installed separately. 'pip install mirascope' alone gives ImportError when using provider-specific features. ↓
fix pip install 'mirascope[openai]' or 'mirascope[anthropic]' etc.
gotcha Docstring-as-prompt pattern from v1 no longer default in v2. Function return value is now the prompt. Docstring-style still works via @prompt_template decorator. ↓
fix Return the prompt string from the function body, or use @prompt_template decorator for docstring-style prompts.
gotcha Mirascope requires provider API keys to be set as environment variables (e.g., OPENAI_API_KEY for OpenAI, ANTHROPIC_API_KEY for Anthropic). A missing or invalid key fails at runtime (see llm.AuthenticationError above). ↓
fix Ensure the relevant API key environment variable is set before running your application (e.g., export OPENAI_API_KEY='your_key_here').
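The key requirement above can be enforced at startup with a small stdlib-only helper. This is a hypothetical sketch, not part of the mirascope package; the env var names come from the fixes above.

```python
import os

# Maps v2 provider prefixes to the env vars named above.
# Hypothetical helper -- not part of the mirascope package.
PROVIDER_ENV_VARS = {
    'openai': 'OPENAI_API_KEY',
    'anthropic': 'ANTHROPIC_API_KEY',
    'google': 'GOOGLE_API_KEY',
}

def require_api_key(provider: str) -> str:
    """Return the API key for `provider`, or raise a clear error."""
    var = PROVIDER_ENV_VARS.get(provider)
    if var is None:
        raise ValueError(f'Unknown provider: {provider!r}')
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(
            f'{var} is not set; export it before calling {provider} models.')
    return key
```

Calling require_api_key('openai') once at startup surfaces a missing OPENAI_API_KEY immediately, rather than as an llm.AuthenticationError in the middle of a request.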
Install
pip install 'mirascope[openai]'
pip install 'mirascope[anthropic]'

Install compatibility verified · last tested: 2026-05-12
python os / libc variant status wheel install import disk
3.10 alpine (musl) anthropic - - 2.48s 43.0M
3.10 alpine (musl) openai - - 2.45s 50.6M
3.10 alpine (musl) mirascope - - 0.92s 36.5M
3.10 slim (glibc) anthropic - - 1.84s 42M
3.10 slim (glibc) openai - - 1.89s 50M
3.10 slim (glibc) mirascope - - 0.68s 36M
3.11 alpine (musl) anthropic - - 2.91s 46.9M
3.11 alpine (musl) openai - - 3.04s 54.9M
3.11 alpine (musl) mirascope - - 1.14s 39.9M
3.11 slim (glibc) anthropic - - 2.52s 46M
3.11 slim (glibc) openai - - 2.56s 54M
3.11 slim (glibc) mirascope - - 0.97s 39M
3.12 alpine (musl) anthropic - - 2.99s 38.2M
3.12 alpine (musl) openai - - 3.15s 46.0M
3.12 alpine (musl) mirascope - - 1.46s 31.4M
3.12 slim (glibc) anthropic - - 2.96s 37M
3.12 slim (glibc) openai - - 3.18s 45M
3.12 slim (glibc) mirascope - - 1.45s 31M
3.13 alpine (musl) anthropic - - 2.71s 37.9M
3.13 alpine (musl) openai - - 2.94s 45.7M
3.13 alpine (musl) mirascope - - 1.27s 31.1M
3.13 slim (glibc) anthropic - - 2.70s 37M
3.13 slim (glibc) openai - - 2.91s 45M
3.13 slim (glibc) mirascope - - 1.23s 30M
3.9 alpine (musl) anthropic - - - -
3.9 alpine (musl) openai - - - -
3.9 alpine (musl) mirascope - - - -
3.9 slim (glibc) anthropic - - - -
3.9 slim (glibc) openai - - - -
3.9 slim (glibc) mirascope - - - -
Imports
- llm.call (v2 — current)

wrong:

from mirascope.core import openai

@openai.call('gpt-4o-mini')
def recommend_book(genre: str):
    """Recommend a {genre} book."""

response = recommend_book('fantasy')
print(response.content)

correct:

from mirascope import llm

@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

response = recommend_book('fantasy')
print(response.text())

- structured output (v2)

wrong:

from mirascope.core import openai
from pydantic import BaseModel

class Book(BaseModel):
    title: str
    author: str

@openai.call('gpt-4o-mini', response_model=Book)
def recommend_book(genre: str):
    """Recommend a {genre} book."""

correct:

from pydantic import BaseModel
from mirascope import llm

class Book(BaseModel):
    title: str
    author: str

@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

book = recommend_book('fantasy').parse()
print(f'{book.title} by {book.author}')
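The AttributeError entry under Common errors comes down to guarding a possibly-None response before touching .text(). Here is a mirascope-free sketch of that guard using a stand-in response class; everything below is illustrative, not Mirascope API.

```python
from typing import Optional

class FakeResponse:
    # Stand-in for a v2 response object; real responses expose
    # .text() per the v2 API described above.
    def __init__(self, body: str):
        self._body = body

    def text(self) -> str:
        return self._body

def safe_text(response: Optional[FakeResponse], default: str = '') -> str:
    """Return response.text(), or `default` when the call produced None."""
    if response is None:
        return default
    return response.text()
```

The same shape applies to real calls: check for None (or catch llm.Error subclasses) before dereferencing the response.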
Quickstart verified · last tested: 2026-04-23
# pip install 'mirascope[openai]'
from mirascope import llm
from pydantic import BaseModel
# Simple call
@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

print(recommend_book('fantasy').text())

# Structured output
class Book(BaseModel):
    title: str
    author: str

@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_structured(genre: str):
    return f'Recommend a {genre} book.'

book = recommend_structured('fantasy').parse()
print(f'{book.title} by {book.author}')
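The 'provider/model' strings used by @llm.call throughout this card split at the first slash. A small hypothetical validator (stdlib-only, not part of Mirascope) can catch malformed strings before they reach a decorator:

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a v2 'provider/model' string into (provider, model)."""
    provider, sep, name = model.partition('/')
    if not sep or not provider or not name:
        raise ValueError(f"Expected 'provider/model', got {model!r}")
    return provider, name
```

For example, split_model_string('openai/gpt-4o-mini') yields ('openai', 'gpt-4o-mini'), while a bare model name like 'gpt-4o' raises ValueError.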