paramz

version: 0.9.6 · verified: Mon Apr 27 · auth: no · language: python · status: maintenance

paramz is a parameterization framework for Python that provides mutable, constrained parameters with gradient-based optimization (e.g., SciPy's L-BFGS-B). The current version is 0.9.6; releases are infrequent, with no updates since 2020.

pip install paramz
error AttributeError: 'Model' object has no attribute '_gradient_array_'
cause The optimizer expects a _gradient_array_ attribute, which Model.__init__() sets up; it is missing if super().__init__() was never called or if parameter names are inconsistent.
fix
Call super().__init__() in your model's __init__ and link all parameters before calling optimize().
error Param object has no attribute 'values'
cause Param was not properly initialized or the object was overwritten.
fix
Check that Param() is called with a name and an initial value, e.g., Param('name', np.array([1.0])), and that the attribute has not been overwritten with a plain array.
deprecated paramz is in maintenance mode (it is also the parameterization backend of GPy); for actively maintained alternatives, consider GPyTorch's constrained-parameter handling or Pyro's parameterization utilities.
fix Migrate to GPyTorch or Pyro.
gotcha Param objects have their values stored as a NumPy array; setting values directly via assignment (e.g., param.values = new_array) may not trigger constraints properly. Use param[:] = new_array instead.
fix Always assign using slice syntax: param[:] = new_values.
gotcha When using gradient-based optimization, the model must implement objective_function() (the scalar value to minimize) and supply gradients, typically by writing into each parameter's .gradient array in parameters_changed(); otherwise the optimizer may fail silently or make no progress.
fix Define objective_function() returning the value to minimize and set each parameter's .gradient in parameters_changed().

Create a model with a parameter and display its values.

from paramz import Param, Model
import numpy as np

class MyModel(Model):
    def __init__(self):
        super().__init__(name='my_model')
        self.param = Param('x', np.array([1.0, 2.0]))
        # Register the parameter with the model hierarchy; without this
        # it will not show up in print(m) or be seen by the optimizer.
        self.link_parameter(self.param)

m = MyModel()
print(m)
print(m.param.values)