{"id":8722,"library":"pytorch-optimizer","title":"PyTorch Optimizer Collection","description":"pytorch-optimizer is a production-focused optimization toolkit for PyTorch, offering a broad collection of modern optimizers beyond those shipped in `torch.optim`. It provides a consistent API for fast experimentation with recent training methods without extensive boilerplate. The library is currently at version 0.3.0 and has seen periodic updates, with the latest release in October 2021.","status":"active","version":"0.3.0","language":"en","source_language":"en","source_url":"https://github.com/jettify/pytorch-optimizer","tags":["pytorch","optimizer","deep-learning","machine-learning","ai","research"],"install":[{"cmd":"pip install pytorch-optimizer","lang":"bash","label":"Install stable version"}],"dependencies":[{"reason":"Core dependency for PyTorch optimizers.","package":"torch","optional":false},{"reason":"Required Python version.","package":"python","optional":false}],"imports":[{"note":"The PyPI package name is `pytorch-optimizer`, but the import module is `torch_optimizer`.","wrong":"from pytorch_optimizer import AdamP","symbol":"AdamP","correct":"from torch_optimizer import AdamP"},{"note":"The PyPI package name is `pytorch-optimizer`, but the import module is `torch_optimizer`.","wrong":"from pytorch_optimizer import load_optimizer","symbol":"load_optimizer","correct":"from torch_optimizer import load_optimizer"},{"note":"The PyPI package name is `pytorch-optimizer`, but the import module is `torch_optimizer`.","wrong":"from pytorch_optimizer import create_optimizer","symbol":"create_optimizer","correct":"from torch_optimizer import create_optimizer"}],"quickstart":{"code":"import torch\nimport torch.nn as nn\nfrom torch_optimizer import AdamP\n\n# Define a simple model\nclass SimpleModel(nn.Module):\n    def __init__(self):\n        super().__init__()\n        self.linear = nn.Linear(10, 1)\n\n    def forward(self, x):\n        return 
self.linear(x)\n\nmodel = SimpleModel()\n\n# Initialize AdamP optimizer with model parameters\n# Replace with other optimizers like A2GradExp, AdaBelief, etc.\noptimizer = AdamP(model.parameters(), lr=1e-3, betas=(0.9, 0.999), weight_decay=1e-2)\n\n# Example of a training step\ninputs = torch.randn(32, 10) # Batch of 32, 10 features\ntargets = torch.randn(32, 1) # Corresponding targets\n\noptimizer.zero_grad()\noutputs = model(inputs)\nloss = torch.nn.functional.mse_loss(outputs, targets)\nloss.backward()\noptimizer.step()\n\nprint(f\"Initial loss: {loss.item():.4f}\")","lang":"python","description":"This quickstart demonstrates how to initialize a model and apply an optimizer from the `torch_optimizer` library, using AdamP as an example. It includes a basic forward and backward pass to illustrate its use in a typical PyTorch training loop. You can directly import specific optimizers or use `load_optimizer` and `create_optimizer` for dynamic loading."},"warnings":[{"fix":"Upgrade to `pytorch-optimizer>=0.3.0` to regain RAdam, or switch to `torch.optim.RAdam` on PyTorch versions that include it.","message":"The RAdam optimizer was temporarily removed in version `0.2.0` because it had been merged into PyTorch core, which caused an `ImportError` for existing users. It was subsequently re-added in version `0.3.0` ('Revert for Drop RAdam'). Users on `0.2.0` will not have RAdam available directly from `torch_optimizer`.","severity":"breaking","affected_versions":"0.2.0"},{"fix":"Always use `import torch_optimizer` or `from torch_optimizer import ...` for importing classes and functions.","message":"The PyPI package is named `pytorch-optimizer`, but the correct Python module to import is `torch_optimizer`. Attempting to `import pytorch_optimizer` will result in a `ModuleNotFoundError`.","severity":"gotcha","affected_versions":"All versions"},{"message":"The library author advises against selecting optimizers solely based on visualizations. 
Different optimizers have unique properties and may require specific learning rate schedules or tuning. It is recommended to start with built-in PyTorch optimizers like SGD or Adam to establish a baseline before experimenting with `pytorch-optimizer` variants.","severity":"gotcha"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Upgrade the package with `pip install \"pytorch-optimizer>=0.3.0\"` (quoted so the shell does not interpret `>=`) or downgrade with `pip install pytorch-optimizer==0.1.0`. Alternatively, use `torch.optim.RAdam` if your PyTorch version includes it.","cause":"Attempting to import RAdam while `pytorch-optimizer==0.2.0` is installed, the version in which RAdam was temporarily removed.","error":"ImportError: cannot import name 'RAdam' from 'torch_optimizer'"},{"fix":"Change the import statement to `import torch_optimizer as optim` or `from torch_optimizer import ...`.","cause":"Incorrect module name used in the import statement. The PyPI package name (`pytorch-optimizer`) differs from the Python import module name (`torch_optimizer`).","error":"ModuleNotFoundError: No module named 'pytorch_optimizer'"},{"fix":"Ensure that `model.parameters()` is passed as the first argument when instantiating any optimizer: `optimizer = AdamP(model.parameters(), lr=0.001)`.","cause":"When initializing an optimizer directly (e.g., `AdamP(...)`), the iterable of parameters to optimize (typically `model.parameters()`) was not passed.","error":"TypeError: __init__() missing 1 required positional argument: 'params'"}]}