Guidance
0.3.1 · verified Tue May 12 · auth: no · python install: draft · quickstart: stale
Microsoft-backed constrained generation framework for LLMs. Programs interleave control flow with generation via lm += gen(...) syntax. Supports regex, CFGs, JSON schema, and select() constraints using a Rust-based llguidance engine. Backends: Transformers, llama.cpp, OpenAI, Azure AI. Model objects are immutable — each += produces a copy.
pip install guidance

Common errors
error ModuleNotFoundError: No module named 'guidance' ↓
cause The guidance package is not installed in the Python environment your code runs in.
fix
Install it with pip: pip install guidance
error AttributeError: module 'guidance' has no attribute 'llms' ↓
cause The model-access API changed: the `guidance.llms` namespace was removed in favor of `guidance.models`.
fix
Update your code to use guidance.models, e.g. from guidance.models import OpenAI; lm = OpenAI('gpt-4o').
error guidance is not callable ↓
cause The `guidance` module is being called directly as a function, but it is used through its specific functions, model constructors (e.g., `guidance.models.OpenAI`), or as a decorator (`@guidance`).
fix
Use the correct guidance API: define a program with the @guidance decorator, or initialize a model object from guidance.models.
error AssertionError: The passed tokenizer does have a byte_decoder property and using a standard gpt2 byte_decoder fails! ↓
cause guidance.models.Transformers() was given a Hugging Face tokenizer without the expected `byte_decoder` property, or guidance cannot process the tokenizer for constrained generation.
fix
Use a Hugging Face model known to be compatible with guidance (the official documentation lists recommended models), or update your transformers and guidance versions.

Warnings
breaking guidance.llms namespace removed. All pre-0.1.x code using guidance.llms.OpenAI() or guidance.llm.OpenAI() raises AttributeError. ↓
fix Use from guidance.models import OpenAI (or Transformers, LlamaCpp, AzureOpenAI).
breaking bench module removed in 0.2.x. from guidance import bench raises ImportError. ↓
fix JSON schema benchmarking moved to external repo guidance-ai/guidance-bench.
breaking @guidance decorator requires explicit stateless=True for pure grammar composition functions that do not call gen() internally. ↓
fix Annotate stateless functions as @guidance(stateless=True). Pre-0.2 cookbook examples omit this and produce incorrect behavior.
breaking llama-cpp-python version must match guidance pin (currently >=0.3.12). Mismatched versions cause AttributeError or silent failures on LlamaCpp model init. ↓
fix pip install 'llama-cpp-python>=0.3.12' explicitly. Check guidance release notes for updated pin on each upgrade.
gotcha Model objects are immutable. lm += ... does not mutate in place — it returns a new copy. Not storing the result silently discards generated output. ↓
fix Always capture the result: lm = lm + ... or lm += ... (both rebind the name; neither mutates in place).
gotcha pip install guidance does not install any inference backend. Importing Transformers or LlamaCpp without the backend raises ImportError at runtime. ↓
fix Install the required backend separately: pip install transformers or pip install llama-cpp-python.
gotcha OpenAI and Azure backends do not support token-level constrained decoding (regex/CFG). Constraints fall back to prompt-based steering only. Full constrained generation requires a local backend. ↓
fix Use Transformers or LlamaCpp backend for hard constraints. OpenAI backend supports JSON schema via OpenAI structured outputs API only.
breaking Installing llama-cpp-python requires system-level C/C++ compilers (like gcc or clang) and CMake to be available in the environment for building its native components. Missing these tools will result in a 'Failed building wheel' error during installation. ↓
fix Ensure system-level C/C++ compilers (e.g., `apt-get install build-essential cmake` on Debian/Ubuntu, `yum install gcc gcc-c++ make cmake` on RHEL/CentOS, or Xcode Command Line Tools on macOS) are installed before attempting to `pip install llama-cpp-python`.
breaking guidance or its Rust-based dependencies (e.g., llguidance) may fail to build on Alpine Linux (musl libc) environments due to missing system libraries or toolchain incompatibilities, leading to errors like 'libgcc_s.so.1 not found' or 'symbol not found' during Rust compilation. ↓
fix Use a glibc-based Python image (e.g., Debian/Ubuntu), or for Alpine, install 'glibc-compat', 'gcc', 'g++', and 'rustup' and ensure the Rust toolchain is configured to build for musl targets.
Install
pip install 'guidance[transformers]'
pip install 'guidance[llamacpp]'

Install compatibility draft last tested: 2026-05-12
python os / libc variant status wheel install import disk
3.9 alpine (musl) guidance - - - -
3.9 alpine (musl) llamacpp - - - -
3.9 alpine (musl) transformers - - - -
3.9 slim (glibc) guidance - - 2.54s 161M
3.9 slim (glibc) llamacpp - - - -
3.9 slim (glibc) transformers - - 5.35s 286M
3.10 alpine (musl) guidance - - - -
3.10 alpine (musl) llamacpp - - - -
3.10 alpine (musl) transformers - - - -
3.10 slim (glibc) guidance - - 2.21s 152M
3.10 slim (glibc) llamacpp - - - -
3.10 slim (glibc) transformers - - 4.45s 276M
3.11 alpine (musl) guidance - - - -
3.11 alpine (musl) llamacpp - - - -
3.11 alpine (musl) transformers - - - -
3.11 slim (glibc) guidance - - 3.13s 162M
3.11 slim (glibc) llamacpp - - - -
3.11 slim (glibc) transformers - - 6.55s 307M
3.12 alpine (musl) guidance - - - -
3.12 alpine (musl) llamacpp - - - -
3.12 alpine (musl) transformers - - - -
3.12 slim (glibc) guidance - - 3.94s 149M
3.12 slim (glibc) llamacpp - - - -
3.12 slim (glibc) transformers - - 7.51s 291M
3.13 alpine (musl) guidance - - - -
3.13 alpine (musl) llamacpp - - - -
3.13 alpine (musl) transformers - - - -
3.13 slim (glibc) guidance - - 3.73s 148M
3.13 slim (glibc) llamacpp - - - -
3.13 slim (glibc) transformers - - 7.10s 290M
Imports
- models.OpenAI
  wrong   guidance.llms.OpenAI('gpt-4o')
  correct from guidance.models import OpenAI
- Transformers
  wrong   from guidance import models; models.Transformers()
  correct from guidance.models import Transformers
Quickstart stale last tested: 2026-05-11

from guidance import system, user, assistant, gen
from guidance.models import Transformers

lm = Transformers('microsoft/Phi-4-mini-instruct')

with system():
    lm += 'You are a helpful assistant'
with user():
    lm += 'What is the capital of France?'
with assistant():
    lm += gen(name='answer', max_tokens=20)

print(lm['answer'])