Nevergrad

1.0.12 · active · verified Wed Apr 15

Nevergrad is a Python 3.6+ library for gradient-free optimization. Developed by Facebook AI Research, it provides a rich collection of optimization algorithms (evolutionary, bandit-based, Bayesian, etc.) and robust tools for parameter and hyperparameter tuning. It can optimize functions over continuous, discrete, or mixed variable types, even in noisy settings. The library is actively developed, with regular releases.

Warnings

Install

pip install nevergrad

Imports

import nevergrad as ng

Quickstart

This quickstart demonstrates how to define a function with mixed continuous, discrete, and categorical parameters using `ng.p.Instrumentation` and then optimize it with `ng.optimizers.NGOpt`. The `minimize` method returns a recommendation whose `kwargs` hold the best parameter set found within the specified budget.

import nevergrad as ng

def objective_function(learning_rate: float, batch_size: int, architecture: str) -> float:
    # Simulate a training process; optimal for lr=0.2, bs=4, arch='conv'
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == 'conv' else 10)

# Define the parameter space using Instrumentation
parametrization = ng.p.Instrumentation(
    # Log-distributed scalar for learning_rate
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # Integer scalar for batch_size
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # Categorical choice for architecture
    architecture=ng.p.Choice(["conv", "fc"]),
)

# Choose an optimizer (NGOpt is a recommended adaptive optimizer)
optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)

# Minimize the objective function
recommendation = optimizer.minimize(objective_function)

print(f"Optimal hyperparameters: {recommendation.kwargs}")
print(f"Best objective value: {objective_function(**recommendation.kwargs)}")
