Mirascope
LLM toolkit for Python focused on type safety and developer experience. Self-described as 'the LLM anti-framework': minimal abstractions over LLM provider APIs. Current version: 2.4.0 (Mar 2026). Three major API surfaces exist: v0 (class-based, deprecated), v1 (provider-specific decorators from mirascope.core), and v2 (unified llm module). LLMs trained before 2025 generate v1 patterns; these still work, but v2 is current.
Common errors
- llm.AuthenticationError: Invalid or missing API key
  - cause: The required API key for the specified LLM provider (e.g., OpenAI, Anthropic, Google) is not set in the environment or is incorrect.
  - fix: Set the appropriate environment variable for your provider (e.g., `export OPENAI_API_KEY='your_key_here'`, `export ANTHROPIC_API_KEY='your_key_here'`, or `export GOOGLE_API_KEY='your_key_here'`) and ensure the key is valid and has the necessary permissions.
- ModuleNotFoundError: No module named 'mirascope.core'
  - cause: Usually raised when v1 import patterns (e.g., `from mirascope.core import openai`) are used in an environment where that module is unavailable, or when the required provider extra is not installed.
  - fix: Update imports to the v2 API: `from mirascope import llm`, then `@llm.call('openai/gpt-4o-mini')` using the 'provider/model' string format. If you need a v1 provider decorator, ensure `mirascope[provider]` is installed and verify the import path for your Mirascope version.
- llm.ParseError: Failed to parse structured response
  - cause: Mirascope failed to parse the LLM's response into the expected structured format (e.g., a Pydantic model passed via `format=`). This typically happens when the LLM's output does not conform to the schema or expected JSON format.
  - fix: Check that the `format=` (v2) or `response_model=`/`json_mode` (v1) configuration matches the expected output, and adjust your prompt to steer the LLM toward strict adherence to the schema. Inspect the raw output via `response.text()` (v2) or `response.content` (v1).
- AttributeError: 'NoneType' object has no attribute 'content'
  - cause: An LLM call failed or returned `None`, and subsequent code accessed an attribute (like `content` or `text()`) on that `None` object. This often stems from unhandled `llm.Error` exceptions (e.g., `ProviderError`, `ConnectionError`, `TimeoutError`).
  - fix: Wrap calls in `try...except` blocks that catch Mirascope's unified `llm.Error` exceptions (e.g., `llm.ProviderError`, `llm.ConnectionError`, `llm.TimeoutError`), and check for `None` before accessing attributes on the response. See the sketch after this list.
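A minimal sketch of the try/except pattern described in the last fix, written against the v2 API; it assumes the exception names listed above (llm.AuthenticationError, llm.ParseError, and the unified llm.Error base) are all exposed on the llm module:
from mirascope import llm

@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

try:
    response = recommend_book('fantasy')
    print(response.text())
except llm.AuthenticationError:
    # API key missing or invalid: check OPENAI_API_KEY and friends
    raise
except llm.ParseError as e:
    # structured output did not match the schema; inspect the raw text
    print(f'parse failed: {e}')
except llm.Error as e:
    # unified base class; covers ProviderError, ConnectionError, TimeoutError
    print(f'LLM call failed: {e}')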
Warnings
- breaking v2 replaced provider-specific imports with unified 'from mirascope import llm'. Model strings now use 'provider/model' format (e.g. 'openai/gpt-4o-mini', 'anthropic/claude-sonnet-4-5').
- breaking Response access changed in v2. v1 used response.content, v2 uses response.text(). response_model= renamed to format=. .parse() method added for structured output.
- breaking v0 class-based approach (OpenAIChat, AnthropicChat classes) completely removed in v1+. All tutorials using class instantiation are broken.
- gotcha v1 (mirascope.core provider-specific imports) still works in v2 but is not the current API. LLMs trained on 2024 data will generate v1 patterns — functional but not idiomatic.
- gotcha Provider extras must be installed separately. 'pip install mirascope' alone gives ImportError when using provider-specific features.
- gotcha Docstring-as-prompt pattern from v1 is no longer the default in v2; the function's return value is now the prompt. Docstring-style templates still work via the @prompt_template decorator (see the sketch after this list).
- gotcha Mirascope requires provider API keys to be set as environment variables (e.g., OPENAI_API_KEY for OpenAI, ANTHROPIC_API_KEY for Anthropic). Failing to set these raises `llm.AuthenticationError` at runtime (see Common errors above).
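A sketch of the docstring-style template from the docstring-as-prompt gotcha above; it assumes that in v2 @prompt_template can be imported from the top-level mirascope package and keeps the v1 {placeholder} syntax:
from mirascope import llm, prompt_template  # assumed v2 import location for prompt_template

@llm.call('openai/gpt-4o-mini')
@prompt_template('Recommend a {genre} book.')  # template string replaces the docstring
def recommend_book(genre: str): ...

print(recommend_book('fantasy').text())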
Install
- pip install mirascope
- pip install 'mirascope[openai]'
- pip install 'mirascope[anthropic]'
Imports
- llm.call (v1 → v2)
# v1 (mirascope.core, deprecated)
from mirascope.core import openai

@openai.call('gpt-4o-mini')
def recommend_book(genre: str):
    """Recommend a {genre} book."""

response = recommend_book('fantasy')
print(response.content)

# v2 (current)
from mirascope import llm

@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

response = recommend_book('fantasy')
print(response.text())
- structured output (v1 → v2)
# v1
from mirascope.core import openai
from pydantic import BaseModel

class Book(BaseModel):
    title: str
    author: str

@openai.call('gpt-4o-mini', response_model=Book)
def recommend_book(genre: str):
    """Recommend a {genre} book."""

# v2
from pydantic import BaseModel
from mirascope import llm

class Book(BaseModel):
    title: str
    author: str

@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

book = recommend_book('fantasy').parse()
print(f'{book.title} by {book.author}')
Quickstart
# pip install 'mirascope[openai]'
from mirascope import llm
from pydantic import BaseModel
# Simple call
@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'
print(recommend_book('fantasy').text())
# Structured output
class Book(BaseModel):
    title: str
    author: str
@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_structured(genre: str):
    return f'Recommend a {genre} book.'
book = recommend_structured('fantasy').parse()
print(f'{book.title} by {book.author}')