{"id":5975,"library":"jaxopt","title":"JAXopt","description":"JAXopt is a Python library providing hardware-accelerated, batchable, and differentiable optimizers built on JAX. It offers a wide range of solvers for convex and non-convex optimization problems in machine learning and scientific computing, including gradient descent, L-BFGS, and quadratic programming. The current release is 0.8.5.","status":"active","version":"0.8.5","language":"en","source_language":"en","source_url":"https://github.com/google/jaxopt","tags":["optimization","jax","machine-learning","deep-learning","numerical-methods","differentiable-programming"],"install":[{"cmd":"pip install jaxopt","lang":"bash","label":"Install JAXopt"}],"dependencies":[{"reason":"Core dependency for numerical computation and automatic differentiation.","package":"jax","optional":false}],"imports":[{"symbol":"GradientDescent","correct":"from jaxopt import GradientDescent"},{"symbol":"LBFGS","correct":"from jaxopt import LBFGS"},{"symbol":"OSQP","correct":"from jaxopt import OSQP"}],"quickstart":{"code":"import jax\nimport jax.numpy as jnp\nfrom jaxopt import GradientDescent\n\n# Least-squares loss for a linear model (quadratic in the parameters)\ndef quadratic_loss(params, data):\n    X, y = data\n    return jnp.mean((jnp.dot(X, params['weights']) + params['bias'] - y)**2)\n\n# Generate dummy data, splitting the PRNG key so features and noise are independent\nkey = jax.random.PRNGKey(0)\nx_key, noise_key = jax.random.split(key)\nnum_samples = 100\nnum_features = 2\ntrue_weights = jnp.array([1.0, 2.0])\ntrue_bias = 3.0\nX = jax.random.normal(x_key, (num_samples, num_features))\ny = jnp.dot(X, true_weights) + true_bias + 0.1 * jax.random.normal(noise_key, (num_samples,))\ndata = (X, y)\n\n# Initialize parameters as a PyTree (dict)\ninit_params = {'weights': jnp.zeros(num_features), 'bias': 0.0}\n\n# Instantiate the solver\ngd = GradientDescent(fun=quadratic_loss, maxiter=1000, tol=1e-3)\n\n# Run the optimization; extra keyword arguments are forwarded to the objective\nsol = gd.run(init_params, data=data)\nprint(f\"Optimal parameters: {sol.params}\")\nprint(f\"True weights: {true_weights}, True bias: {true_bias}\")","lang":"python","description":"This example uses `GradientDescent` from JAXopt to minimize a least-squares loss. It generates dummy data, initializes model parameters as a PyTree, and runs the solver to recover the true parameters, covering the basic workflow: define an objective, choose a solver, and execute it."},"warnings":[{"fix":"Upgrade your Python environment to version 3.10 or later.","message":"Support for Python 3.8 was removed in v0.8.4 and support for Python 3.9 in v0.8.5. Users on these Python versions must upgrade to Python 3.10 or newer.","severity":"breaking","affected_versions":">=0.8.4"},{"fix":"If your code or custom JAXopt extensions use `jax.pure_callback` under `vmap`, pass the `vmap_method` argument explicitly (e.g., `vmap_method='sequential'`) to avoid future breaking changes.","message":"JAXopt's use of `jax.pure_callback` has been migrated to JAX's `vmap_method` API, which changes how callbacks behave under `vmap` when `vmap_method` is not specified. The implicit fallback to `vmap_method='sequential'` is deprecated, and future versions will raise `NotImplementedError` if `vmap_method` is omitted.","severity":"gotcha","affected_versions":">=0.8.5"},{"fix":"Use JAXopt version 0.8.4 or later if you encounter issues with PyTree-structured parameters in constrained optimization problems.","message":"Early versions of JAXopt mishandled JAX PyTrees in certain solvers (e.g., `prox` operators, `BoxOSQP`), leading to incorrect behavior or errors when parameters were structured as PyTrees. These issues are fixed in newer releases.","severity":"gotcha","affected_versions":"<0.8.4"},{"fix":"Update any code or examples referencing the Boston housing dataset to use an alternative, such as scikit-learn's California housing dataset or another suitable benchmark.","message":"The Boston housing dataset, previously used in some examples, was removed due to ethical concerns; older tutorials or user code that referenced it directly will no longer work.","severity":"deprecated","affected_versions":">=0.8.3"}],"env_vars":null,"last_verified":"2026-04-14T00:00:00.000Z","next_check":"2026-07-13T00:00:00.000Z"}