{"id":4558,"library":"gpytorch","title":"GPyTorch","description":"GPyTorch is a Gaussian Process (GP) library built on PyTorch, designed for scalable, flexible, and modular GP models. It leverages PyTorch's capabilities for GPU acceleration and automatic differentiation, making it suitable for modern machine learning workflows. GPyTorch frequently releases maintenance updates and new features, with major versions aligning with PyTorch releases.","status":"active","version":"1.15.2","language":"en","source_language":"en","source_url":"https://github.com/cornellius-gp/gpytorch/","tags":["gaussian processes","pytorch","machine learning","deep learning","probabilistic programming"],"install":[{"cmd":"pip install gpytorch","lang":"bash","label":"PyPI"},{"cmd":"conda install gpytorch -c gpytorch","lang":"bash","label":"Conda"}],"dependencies":[{"reason":"Core deep learning framework dependency, GPyTorch is built on it.","package":"torch","optional":false},{"reason":"Provides abstract base classes for linear operators used in GPyTorch's scalable inference methods.","package":"linear_operator","optional":false}],"imports":[{"symbol":"gpytorch","correct":"import gpytorch"},{"symbol":"ExactGP","correct":"from gpytorch.models import ExactGP"},{"symbol":"GaussianLikelihood","correct":"from gpytorch.likelihoods import GaussianLikelihood"},{"symbol":"ConstantMean","correct":"from gpytorch.means import ConstantMean"},{"symbol":"ScaleKernel","correct":"from gpytorch.kernels import ScaleKernel"},{"symbol":"RBFKernel","correct":"from gpytorch.kernels import RBFKernel"},{"note":"gpytorch.random_variables was deprecated and replaced by gpytorch.distributions in early versions.","wrong":"from gpytorch.random_variables import GaussianRandomVariable","symbol":"MultivariateNormal","correct":"from gpytorch.distributions import MultivariateNormal"}],"quickstart":{"code":"import math\nimport torch\nimport gpytorch\nfrom gpytorch.models import ExactGP\nfrom gpytorch.likelihoods import 
GaussianLikelihood\nfrom gpytorch.means import ConstantMean\nfrom gpytorch.kernels import ScaleKernel, RBFKernel\nfrom gpytorch.distributions import MultivariateNormal\nfrom torch.optim import Adam\n\n# 1. Set up training data\ntrain_x = torch.linspace(0, 1, 100)\ntrain_y = torch.sin(train_x * (2 * math.pi)) + torch.randn(train_x.size()) * math.sqrt(0.04)\n\n# 2. Define the GP model\nclass ExactGPModel(ExactGP):\n    def __init__(self, train_x, train_y, likelihood):\n        super(ExactGPModel, self).__init__(train_x, train_y, likelihood)\n        self.mean_module = ConstantMean()\n        self.covar_module = ScaleKernel(RBFKernel())\n\n    def forward(self, x):\n        mean_x = self.mean_module(x)\n        covar_x = self.covar_module(x)\n        return MultivariateNormal(mean_x, covar_x)\n\n# Initialize likelihood and model\nlikelihood = GaussianLikelihood()\nmodel = ExactGPModel(train_x, train_y, likelihood)\n\n# 3. Train the model\n# Put model and likelihood in training mode\nmodel.train()\nlikelihood.train()\n\n# Use the Adam optimizer\noptimizer = Adam(model.parameters(), lr=0.1)\n\n# \"Loss\" for GPs - the marginal log likelihood\nmll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)\n\nfor i in range(50): # typically 50 training iterations\n    optimizer.zero_grad()\n    output = model(train_x)\n    loss = -mll(output, train_y)\n    loss.backward()\n    optimizer.step()\n\n# 4. Make predictions\nmodel.eval()\nlikelihood.eval()\n\nwith torch.no_grad(), gpytorch.settings.fast_pred_var():\n    test_x = torch.linspace(0, 1, 51)\n    observed_pred = likelihood(model(test_x))\n    mean = observed_pred.mean\n    lower, upper = observed_pred.confidence_region()","lang":"python","description":"This quickstart demonstrates a simple exact Gaussian Process regression. 
It defines a GP model and a Gaussian likelihood, trains the model by maximizing the exact marginal log likelihood, and then makes predictions with confidence intervals."},"warnings":[{"fix":"Ensure you are running Python >= 3.10 and PyTorch >= 2.0 before installing GPyTorch >= 1.14. You can check PyTorch compatibility at https://pytorch.org/get-started/locally/","message":"GPyTorch versions 1.14 and later require Python >= 3.10 and PyTorch >= 2.0. Attempting to install or run them with older versions will lead to incompatibility issues.","severity":"breaking","affected_versions":">=1.14"},{"fix":"If you are on v1.14.1, upgrade to v1.14.2 or newer to avoid this breaking change and benefit from the fix.","message":"A temporary breaking change was introduced in v1.14.1 related to the `LinearKernel`'s `ard_num_dims` property; it was quickly reverted in v1.14.2.","severity":"breaking","affected_versions":"1.14.1"},{"fix":"Use classes from `gpytorch.distributions`, such as `gpytorch.distributions.MultivariateNormal` or `gpytorch.distributions.MultitaskMultivariateNormal`.","message":"The `gpytorch.random_variables` module and its classes (e.g., `GaussianRandomVariable`, `MultitaskGaussianRandomVariable`) were deprecated and replaced by `gpytorch.distributions`.","severity":"deprecated","affected_versions":"<0.1 (Alpha/Beta versions)"},{"fix":"Review your type-checking setup if you were explicitly using `jaxtyping` with GPyTorch. `jaxtyping` itself now supports PyTorch without a JAX dependency, so direct usage is still possible if desired.","message":"The `jaxtyping` dependency was removed in v1.15.2. 
Users relying on `jaxtyping` for static type checking or runtime validation with GPyTorch might notice changes in type-hint behavior or need to update their type-checking configurations.","severity":"gotcha","affected_versions":">=1.15.2"},{"fix":"Upgrade to v1.15.2 or later to ensure that `gpytorch.settings.debug.on()` functions as expected.","message":"A bug affecting `gpytorch.settings.debug.on()` was fixed in v1.15.2; its behavior may have been unreliable or incorrect in prior versions.","severity":"gotcha","affected_versions":"<1.15.2"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}