{"id":4205,"library":"pyro-ppl","title":"Pyro: A Python library for probabilistic modeling and inference","description":"Pyro is a flexible, scalable deep probabilistic programming library built on PyTorch. It enables expressive deep probabilistic modeling, unifying modern deep learning and Bayesian inference. Maintained by community contributors, including a team at the Broad Institute, Pyro is under active development with frequent releases.","status":"active","version":"1.9.1","language":"en","source_language":"en","source_url":"https://github.com/pyro-ppl/pyro","tags":["probabilistic programming","bayesian inference","deep learning","pytorch","machine learning","statistical modeling"],"install":[{"cmd":"pip install torch\npip install pyro-ppl","lang":"bash","label":"Install with pip"}],"dependencies":[{"reason":"Pyro is built on PyTorch and requires it as a backend for tensor operations and automatic differentiation. As of Pyro 1.9.0, PyTorch 1.x is no longer supported, requiring PyTorch 2.x or later.","package":"torch"}],"imports":[{"symbol":"pyro","correct":"import pyro"},{"symbol":"pyro.distributions","correct":"import pyro.distributions as dist"},{"symbol":"pyro.infer","correct":"from pyro.infer import SVI, Trace_ELBO, MCMC, NUTS"},{"symbol":"pyro.optim","correct":"from pyro.optim import Adam"}],"quickstart":{"code":"import torch\nimport pyro\nimport pyro.distributions as dist\nfrom pyro.infer import SVI, Trace_ELBO\nfrom pyro.optim import Adam\n\n# Set random seeds for reproducibility (optional)\npyro.set_rng_seed(1)\n\n# 1. Define a probabilistic model\ndef model(data):\n    # Global parameter: probability of success 'theta' for a Bernoulli distribution\n    theta = pyro.sample(\"theta\", dist.Beta(1.0, 1.0)) # Prior for theta\n    # Observe data using pyro.plate for vectorized computation\n    with pyro.plate(\"data_loop\", len(data)):\n        pyro.sample(\"obs\", dist.Bernoulli(theta), obs=data)\n\n# 2. 
Define a guide (variational distribution)\ndef guide(data):\n    # Learnable parameters for the Beta distribution approximating theta\n    alpha_q = pyro.param(\"alpha_q\", torch.tensor(1.0), constraint=dist.constraints.positive)\n    beta_q = pyro.param(\"beta_q\", torch.tensor(1.0), constraint=dist.constraints.positive)\n    pyro.sample(\"theta\", dist.Beta(alpha_q, beta_q))\n\n# 3. Generate synthetic data (e.g., 8 heads, 2 tails)\ndata = torch.tensor([1.0]*8 + [0.0]*2)\n\n# 4. Set up an optimizer and SVI\npyro.clear_param_store()  # Reset parameters left over from any previous run\noptimizer = Adam({\"lr\": 0.01})\nsvi = SVI(model, guide, optimizer, loss=Trace_ELBO())\n\n# 5. Run inference\nn_steps = 1000\nfor step in range(n_steps):\n    loss = svi.step(data)\n    if step % 100 == 0:\n        print(f\"Step {step}: Loss = {loss:.4f}\")\n\n# 6. Extract learned parameters\nalpha_q_learned = pyro.param(\"alpha_q\").item()\nbeta_q_learned = pyro.param(\"beta_q\").item()\nprint(f\"\\nLearned parameters for theta (Beta distribution): alpha_q={alpha_q_learned:.2f}, beta_q={beta_q_learned:.2f}\")\n\n# Example: sample from the inferred posterior. The guide itself returns None,\n# so sample directly from the learned Beta distribution instead.\nposterior_theta_samples = dist.Beta(alpha_q_learned, beta_q_learned).sample((1000,))\nprint(f\"\\nMean of posterior theta samples: {posterior_theta_samples.mean():.2f}\")","lang":"python","description":"This quickstart demonstrates a simple Bayesian coin-tossing model using Stochastic Variational Inference (SVI). It defines a probabilistic `model`, a variational `guide`, uses synthetic data, performs inference, and extracts the learned posterior parameters for the coin's bias."},"warnings":[{"fix":"Upgrade your PyTorch installation to version 2.x or later, and ensure your Python environment is 3.8 or newer. E.g., `pip install 'torch>=2.0.0' && pip install pyro-ppl`.","message":"Pyro 1.9.0 dropped support for PyTorch 1.x and Python 3.7. 
Users on older PyTorch or Python versions must upgrade to PyTorch 2.x and Python 3.8+ to use Pyro 1.9.0 and newer.","severity":"breaking","affected_versions":">=1.9.0"},{"fix":"Upgrade your Python environment to version 3.7 or newer. Python 3.8+ is recommended for recent Pyro versions.","message":"Pyro 1.8.1 dropped support for Python 3.6. Users on Python 3.6 must upgrade their Python environment to 3.7 or newer to use Pyro 1.8.1 and subsequent versions.","severity":"breaking","affected_versions":">=1.8.1"},{"fix":"Refer to the official Pyro documentation or GitHub release notes for the exact PyTorch version compatibility when encountering issues or upgrading Pyro. It is generally safe to use the latest stable PyTorch 2.x with current Pyro versions.","message":"Pyro's compatibility with PyTorch versions can be nuanced and has changed across minor releases. For example, 1.8.5 narrowly required `torch>=2.0`, while 1.8.6 re-enabled support for `torch>=1.11` before 1.9.0 definitively dropped PyTorch 1.x. Always check release notes for specific PyTorch version requirements.","severity":"gotcha","affected_versions":"*"},{"fix":"Replace `for i in range(N): pyro.sample(f'x_{i}', ...)` with `with pyro.plate('name', N): pyro.sample('x', ...)`.","message":"When defining models with conditionally independent random variables, avoid explicit Python loops and instead use `pyro.plate` for efficient, vectorized computation, especially with large datasets. Loops can be significantly slower and prevent Pyro's internal optimizations.","severity":"gotcha","affected_versions":"*"},{"fix":"Always specify a sufficient number of `warmup_steps` when initializing `pyro.infer.MCMC` (e.g., `mcmc = MCMC(kernel, num_samples=1000, warmup_steps=1000)`). Monitor convergence metrics like R-hat to assess if the warmup was adequate.","message":"Markov Chain Monte Carlo (MCMC) algorithms like NUTS (No-U-Turn Sampler) require a 'warm-up' phase. 
During warmup, the sampler adapts settings such as its step size and mass matrix. Neglecting or misconfiguring `warmup_steps` can lead to unstable chains and biased posterior samples. The warmup samples are discarded and not used for inference.","severity":"gotcha","affected_versions":"*"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}