{"id":5138,"library":"botorch","title":"BoTorch: Bayesian Optimization in PyTorch","description":"BoTorch (pronounced \"bow-torch\") is a library for Bayesian Optimization research built on top of PyTorch, leveraging auto-differentiation, GPU support, and a dynamic computation graph. It provides a modular and extensible interface for composing Bayesian Optimization primitives like models, acquisition functions, and optimizers. Currently at version 0.17.2, it is under active development with frequent maintenance and feature releases.","status":"active","version":"0.17.2","language":"en","source_language":"en","source_url":"https://github.com/meta-pytorch/botorch","tags":["Bayesian Optimization","PyTorch","Machine Learning","Optimization","GPs"],"install":[{"cmd":"pip install botorch","lang":"bash","label":"Install latest stable release"}],"dependencies":[{"reason":"Requires Python >=3.11 for BoTorch v0.17.0+.","package":"python","optional":false},{"reason":"BoTorch is built on PyTorch. Requires PyTorch >=2.2 for BoTorch v0.17.0+.","package":"torch","optional":false},{"reason":"Provides state-of-the-art probabilistic models; requires GPyTorch >=1.15.2 for BoTorch v0.17.1+.","package":"gpytorch","optional":false},{"reason":"Required by GPyTorch. Requires linear_operator >=0.6.1 for BoTorch v0.17.1+.","package":"linear_operator","optional":false},{"reason":"Required for some advanced probabilistic modeling features (e.g., fully Bayesian models). 
Requires pyro-ppl >=1.8.4.","package":"pyro-ppl","optional":false},{"reason":"Used for numerical optimization routines when fitting models and optimizing acquisition functions.","package":"scipy","optional":false},{"reason":"Provides the multiple-dispatch mechanism used internally by BoTorch.","package":"multipledispatch","optional":false},{"reason":"Often used as a higher-level platform for experiment management with a simplified Bayesian Optimization interface, but not a direct dependency of the BoTorch library.","package":"Ax","optional":true}],"imports":[{"symbol":"SingleTaskGP","correct":"from botorch.models import SingleTaskGP"},{"note":"Prefer LogExpectedImprovement (or qLogExpectedImprovement for batch candidates) over ExpectedImprovement for numerical stability and improved optimization performance, as the classic (q)EI variants have known numerical issues.","wrong":"from botorch.acquisition import ExpectedImprovement","symbol":"LogExpectedImprovement","correct":"from botorch.acquisition import LogExpectedImprovement"},{"symbol":"fit_gpytorch_mll","correct":"from botorch.fit import fit_gpytorch_mll"},{"symbol":"ExactMarginalLogLikelihood","correct":"from gpytorch.mlls import ExactMarginalLogLikelihood"},{"symbol":"optimize_acqf","correct":"from botorch.optim import optimize_acqf"}],"quickstart":{"code":"import torch\nfrom botorch.models import SingleTaskGP\nfrom botorch.acquisition import LogExpectedImprovement\nfrom botorch.fit import fit_gpytorch_mll\nfrom gpytorch.mlls import ExactMarginalLogLikelihood\nfrom botorch.optim import optimize_acqf\nfrom botorch.models.transforms import Normalize, Standardize\n\n# 1. Define objective function (e.g., a simple 2D function)\ndef objective_function(x):\n    return 1 - (x - 0.5).norm(dim=-1, keepdim=True)\n\n# 2. Generate initial training data on the [0, 2]^2 domain\ntrain_X = torch.rand(10, 2, dtype=torch.double) * 2\ntrain_Y = objective_function(train_X)\ntrain_Y += 0.1 * torch.randn_like(train_Y)  # Add observation noise\n\n# 3. 
Fit a Gaussian Process model\ngp = SingleTaskGP(\n    train_X=train_X,\n    train_Y=train_Y,\n    input_transform=Normalize(d=2),\n    outcome_transform=Standardize(m=1),\n)\nmll = ExactMarginalLogLikelihood(gp.likelihood, gp)\nfit_gpytorch_mll(mll)\n\n# 4. Construct an acquisition function\n# Use LogExpectedImprovement for better numerical stability\nlog_ei = LogExpectedImprovement(model=gp, best_f=train_Y.max())\n\n# 5. Optimize the acquisition function over the training domain to get the next candidate\nbounds = torch.stack([torch.zeros(2), torch.ones(2) * 2]).to(torch.double)\ncandidate, acq_value = optimize_acqf(\n    acq_function=log_ei,\n    bounds=bounds,\n    q=1,\n    num_restarts=5,\n    raw_samples=20,\n)\n\nprint(f\"Next candidate: {candidate}\")\nprint(f\"Acquisition function value at candidate: {acq_value}\")\n\n# (Optional) Evaluate the new candidate and update the model in a loop\n# new_X = candidate\n# new_Y = objective_function(new_X)\n# train_X = torch.cat([train_X, new_X])\n# train_Y = torch.cat([train_Y, new_Y])\n# ... refit model ...\n","lang":"python","description":"This quickstart demonstrates a basic Bayesian Optimization loop with BoTorch: initializing training data, fitting a Gaussian Process model, constructing an acquisition function (LogExpectedImprovement for numerical stability), and optimizing it to propose the next best candidate."},"warnings":[{"fix":"Upgrade Python to 3.11+ and PyTorch to 2.2+. Check the BoTorch `CHANGELOG.md` or official documentation for precise version requirements with each new minor release.","message":"BoTorch v0.17.0+ requires Python >=3.11 and PyTorch >=2.2. Ensure your environment meets these minimum versions before upgrading.","severity":"breaking","affected_versions":">=0.17.0"},{"fix":"Update GPyTorch to >=1.15.2 and `linear_operator` to >=0.6.1.","message":"BoTorch v0.17.1+ requires GPyTorch >=1.15.2 and `linear_operator >=0.6.1`. 
Older versions of these dependencies will cause compatibility issues.","severity":"breaking","affected_versions":">=0.17.1"},{"fix":"Replace `qExpectedImprovement` with `qLogExpectedImprovement` in your code. They share the same API.","message":"The `qExpectedImprovement` acquisition function has known numerical issues; replace it with `qLogExpectedImprovement` for improved stability and optimization performance.","severity":"deprecated","affected_versions":">=0.16.1 (deprecation warning issued since 0.16.1)"},{"fix":"Consult the `CHANGELOG.md` for specific replacements or alternative approaches. Plan for refactoring code that uses these removed components.","message":"Several APIs were removed in BoTorch v0.17. These include `get_fitted_map_saas_ensemble`, `qMultiObjectiveMaxValueEntropy`, `FullyBayesianPosterior`, the `task_feature` parameter from `SingleTaskGP.construct_inputs`, and the `fixed_features` argument from `optimize_acqf_homotopy`.","severity":"breaking","affected_versions":">=0.17.0"},{"fix":"Consider starting with Ax (Meta's Adaptive Experimentation platform) if you need a simpler, more managed BO workflow. Custom BoTorch models and acquisition functions can still be plugged into Ax if needed.","message":"BoTorch is a low-level API for Bayesian Optimization research. For general-purpose Bayesian Optimization and experiment management, users who are not actively doing BO research are encouraged to use Ax, which provides a user-friendly interface on top of BoTorch.","severity":"gotcha","affected_versions":"all"},{"fix":"Review existing models for potential changes in optimization behavior. If consistent results with older versions are critical, carefully check model initialization or pin to an earlier BoTorch version.","message":"The default hyperparameter priors for most models were updated in v0.12.0 to use dimension-scaled log-normal priors. 
This significantly improves robustness to dimensionality but might alter results for models fitted with older versions.","severity":"breaking","affected_versions":">=0.12.0"}],"env_vars":null,"last_verified":"2026-04-13T00:00:00.000Z","next_check":"2026-07-12T00:00:00.000Z"}