Autograd Gamma

0.5.0 · active · verified Sat Apr 11

autograd-gamma provides autograd-compatible implementations of the gamma and beta families of special functions, enabling automatic differentiation through them. The current version is 0.5.0; releases are infrequent but the project is active, shipping updates as new functions and fixes land.

Warnings

Install

pip install autograd-gamma

Imports

import autograd.numpy as np
from autograd import grad
from autograd_gamma import gammainc

Quickstart

This example demonstrates how to compute gradients of `autograd-gamma` functions within the `autograd` framework. It uses `autograd.numpy` for array operations, which `autograd` requires in order to trace computations for differentiation.

import autograd.numpy as np
from autograd import grad
from autograd_gamma import gammainc

# Define a function using autograd-gamma's gammainc
def calculate_value(x):
    # gammainc(a, x) is the regularized lower incomplete gamma function:
    # 'a' (first argument) is the shape parameter, 'x' is the upper limit of integration
    return gammainc(2.0, x)  # example: shape parameter fixed at a=2.0

# Compute the gradient of the function with respect to x
gradient_function = grad(calculate_value)

# Evaluate at a test point (a plain Python float or a 0-d autograd.numpy array both work)
x_val = np.array(0.5)
value = calculate_value(x_val)
gradient = gradient_function(x_val)

print(f"Function value gammainc(2.0, {x_val}) = {value}")
print(f"Gradient d/dx gammainc(2.0, x) at x={x_val} = {gradient}")
