{"id":1803,"library":"autograd","title":"Autograd","description":"Autograd is a Python library that efficiently computes derivatives of native Python and NumPy code using automatic differentiation. The current version, 1.8.0, supports Python 3.9-3.13; its release cadence focuses on compatibility updates for new Python and NumPy versions, along with bug fixes and minor features.","status":"active","version":"1.8.0","language":"en","source_language":"en","source_url":"https://github.com/HIPS/autograd","tags":["automatic differentiation","autodiff","gradients","numpy","machine learning"],"install":[{"cmd":"pip install autograd","lang":"bash","label":"Base installation"},{"cmd":"pip install \"autograd[scipy]\"","lang":"bash","label":"With SciPy features"}],"dependencies":[{"reason":"Core functionality relies on NumPy for array operations.","package":"numpy","optional":false},{"reason":"Required for certain advanced features.","package":"scipy","optional":true},{"reason":"Requires Python 3.9 or newer.","package":"python","optional":false}],"imports":[{"note":"Use autograd's wrapped NumPy to enable automatic differentiation.","symbol":"numpy","correct":"import autograd.numpy as np"},{"note":"The primary function for obtaining the gradient of a scalar-valued function.","symbol":"grad","correct":"from autograd import grad"},{"note":"For functions applied elementwise over array inputs; returns the array of elementwise derivatives.","symbol":"elementwise_grad","correct":"from autograd import elementwise_grad as egrad"}],"quickstart":{"code":"import autograd.numpy as np\nfrom autograd import grad\n\ndef tanh(x):\n    return (1.0 - np.exp(-2.0 * x)) / (1.0 + np.exp(-2.0 * x))\n\ngrad_tanh = grad(tanh)\nx_val = np.float64(1.0)\n\nprint(f\"tanh({x_val}) = {tanh(x_val)}\")\nprint(f\"Gradient of tanh at {x_val} = {grad_tanh(x_val)}\")\n","lang":"python","description":"This example defines a simple tanh function using `autograd.numpy`, then uses `autograd.grad` to obtain its derivative function. The gradient is then evaluated at a specific point."},"warnings":[{"fix":"Migrate your project to Python 3.9+.","message":"Autograd v1.7.0 (and subsequent versions) dropped support for Python 2.x; Python 3.9 or newer is now required.","severity":"breaking","affected_versions":">=1.7.0"},{"fix":"Ensure your environment uses a NumPy version compatible with Autograd's constraints (currently <3.0). Check the release notes for specific version requirements.","message":"While Autograd v1.7.0 added compatibility with NumPy 2.0.0, v1.8.0 explicitly pins NumPy to <3 to ensure stability; future NumPy 3.x versions might introduce breaking changes.","severity":"breaking","affected_versions":">=1.8.0"},{"fix":"Rewrite code to use non-in-place operations and function-style calls for NumPy array manipulations.","message":"Autograd does not properly handle in-place array operations (e.g., `A[0,0] = x`, `a += b`) or method-style dot products (`A.dot(B)`). Always use non-in-place assignments (`a = a + b`) and function-style operations (`np.dot(A, B)`).","severity":"gotcha","affected_versions":"All"},{"fix":"Ensure all array creations and operations explicitly use `autograd.numpy` functions and array types.","message":"Avoid implicitly casting Python lists to NumPy arrays inside functions being differentiated (e.g., `np.sum([x, y])`). Explicitly convert lists to NumPy arrays: `np.sum(np.array([x, y]))`.","severity":"gotcha","affected_versions":"All"},{"fix":"Ensure the target function for `grad` returns a scalar, or use `elementwise_grad` for vectorized outputs.","message":"The `grad` function expects the function being differentiated to return a scalar value. To differentiate a function that is applied elementwise to an array and obtain the array of elementwise derivatives, use `autograd.elementwise_grad`.","severity":"gotcha","affected_versions":"All"},{"fix":"Vectorize operations as much as possible using `autograd.numpy` primitives to minimize Python-level computation. Re-evaluate the algorithm's design if performance remains an issue.","message":"Performance can be significantly degraded by extensive pure-Python logic (e.g., complex loops and conditionals) inside the function being differentiated, because Python overheads are amplified during automatic differentiation.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-04-09T00:00:00.000Z","next_check":"2026-07-08T00:00:00.000Z"}