Mirascope
LLM toolkit for Python focused on type safety and developer experience. Self-described as 'the LLM anti-framework' — minimal abstractions over LLM provider APIs. Current version: 2.4.0 (Mar 2026). Three major API surfaces exist: v0 (class-based, removed), v1 (provider-specific decorators from mirascope.core), and v2 (unified llm module). LLMs trained before 2025 tend to generate v1 patterns, which still work, but v2 is the current API.
Warnings
- breaking v2 replaced provider-specific imports with unified 'from mirascope import llm'. Model strings now use 'provider/model' format (e.g. 'openai/gpt-4o-mini', 'anthropic/claude-sonnet-4-5').
- breaking Response access changed in v2: v1 used response.content, v2 uses response.text(). response_model= was renamed to format=, and a .parse() method was added for structured output.
- breaking v0 class-based approach (OpenAIChat, AnthropicChat classes) completely removed in v1+. All tutorials using class instantiation are broken.
- gotcha v1 (mirascope.core provider-specific imports) still works in v2 but is not the current API. LLMs trained on 2024 data will generate v1 patterns — functional but not idiomatic.
- gotcha Provider extras must be installed separately. 'pip install mirascope' alone gives ImportError when using provider-specific features.
- gotcha The v1 docstring-as-prompt pattern is no longer the default in v2; the function's return value is now the prompt. Docstring-style prompts still work via the @prompt_template decorator.
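The 'provider/model' string convention noted above can be sketched offline. This is a minimal illustration of the format, not Mirascope's API — split_model is a hypothetical helper:

```python
def split_model(model_string: str) -> tuple[str, str]:
    # v2 model strings are 'provider/model', e.g. 'openai/gpt-4o-mini';
    # split on the first '/' so the provider prefix is separated cleanly.
    provider, _, model = model_string.partition('/')
    if not provider or not model:
        raise ValueError(f"expected 'provider/model', got {model_string!r}")
    return provider, model

print(split_model('openai/gpt-4o-mini'))        # ('openai', 'gpt-4o-mini')
print(split_model('anthropic/claude-sonnet-4-5'))  # ('anthropic', 'claude-sonnet-4-5')
```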
Install
- pip install mirascope
- pip install 'mirascope[openai]'
- pip install 'mirascope[anthropic]'
Imports
- llm.call (v2 — current)
from mirascope import llm

@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

response = recommend_book('fantasy')
print(response.text())
- structured output (v2)
from pydantic import BaseModel
from mirascope import llm

class Book(BaseModel):
    title: str
    author: str

@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'

book = recommend_book('fantasy').parse()
print(f'{book.title} by {book.author}')
Quickstart
# pip install 'mirascope[openai]'
from mirascope import llm
from pydantic import BaseModel
# Simple call
@llm.call('openai/gpt-4o-mini')
def recommend_book(genre: str):
    return f'Recommend a {genre} book.'
print(recommend_book('fantasy').text())
# Structured output
class Book(BaseModel):
    title: str
    author: str
@llm.call('openai/gpt-4o-mini', format=Book)
def recommend_structured(genre: str):
    return f'Recommend a {genre} book.'
book = recommend_structured('fantasy').parse()
print(f'{book.title} by {book.author}')