NLopt Python Bindings

2.10.0 · active · verified Sat Apr 11

NLopt is a free/open-source library providing a common interface to a variety of nonlinear optimization algorithms, covering both global and local methods for constrained and unconstrained problems. The `nlopt` Python package offers bindings to this library, enabling Python users to leverage its extensive suite of optimization routines. The current version is 2.10.0, and the project actively maintains and releases new versions, often aligning with updates to the underlying C library.

Warnings

Install
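The Python bindings are published under the package name `nlopt` on both PyPI and conda-forge; either of the following should work, depending on your environment (binary wheel availability varies by platform):

```shell
# From PyPI
pip install nlopt

# Or from conda-forge
conda install -c conda-forge nlopt
```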

Imports

Quickstart

This quickstart demonstrates how to set up and run a constrained minimization problem using NLopt. It defines an objective function and nonlinear inequality constraints, both capable of providing gradients. It then initializes an `nlopt.opt` object with a gradient-based algorithm (`LD_MMA`), sets the objective, bounds, and constraints, and performs the optimization. An example using a derivative-free algorithm (`LN_COBYLA`) is also included to show the common pattern for algorithms that do not utilize gradients.

import nlopt
import numpy as np

# Objective function to minimize: f(x) = sqrt(x[1])
# subject to x[1] >= (a*x[0] + b)**3 and x[1] >= 0
# for a1=2, b1=0, a2=-1, b2=1

def myfunc(x, grad):
    if grad.size > 0:
        grad[0] = 0.0
        grad[1] = 0.5 / np.sqrt(x[1])
    return np.sqrt(x[1])

def myconstraint(x, grad, a, b):
    if grad.size > 0:
        grad[0] = 3 * a * (a * x[0] + b)**2
        grad[1] = -1.0
    return (a * x[0] + b)**3 - x[1]

# Problem dimension
n = 2

# Create an optimizer object
opt = nlopt.opt(nlopt.LD_MMA, n)

# Set minimization objective
opt.set_min_objective(myfunc)

# Set bounds
opt.set_lower_bounds([-float('inf'), 0.0]) # x[1] >= 0

# Add nonlinear inequality constraints (h(x) <= 0)
# Constraint 1: x[1] >= (2*x[0])**3  =>  (2*x[0])**3 - x[1] <= 0
opt.add_inequality_constraint(lambda x, grad: myconstraint(x, grad, 2.0, 0.0), 1e-8)
# Constraint 2: x[1] >= (-1*x[0] + 1)**3 => (-1*x[0] + 1)**3 - x[1] <= 0
opt.add_inequality_constraint(lambda x, grad: myconstraint(x, grad, -1.0, 1.0), 1e-8)

# Set stopping criteria
opt.set_xtol_rel(1e-4)
opt.set_maxeval(1000)

# Initial guess
x0 = np.array([1.234, 5.678])

try:
    x_opt = opt.optimize(x0)
    minf = opt.last_optimum_value()
    result_code = opt.last_optimize_result()
    print(f"Optimized result: x = {x_opt}, f(x) = {minf}, return code: {result_code}")
except RuntimeError as e:  # generic NLopt failures raise the standard RuntimeError
    print(f"NLopt failed: {e}")

# Example of using a derivative-free algorithm
opt_df = nlopt.opt(nlopt.LN_COBYLA, n)
opt_df.set_min_objective(myfunc) # Note: grad argument will be empty for derivative-free
opt_df.set_lower_bounds([-float('inf'), 0.0])
opt_df.add_inequality_constraint(lambda x, grad: myconstraint(x, grad, 2.0, 0.0), 1e-8)
opt_df.add_inequality_constraint(lambda x, grad: myconstraint(x, grad, -1.0, 1.0), 1e-8)
opt_df.set_xtol_rel(1e-4)

try:
    x_opt_df = opt_df.optimize(x0)
    minf_df = opt_df.last_optimum_value()
    print(f"Derivative-free result: x = {x_opt_df}, f(x) = {minf_df}")
except RuntimeError as e:  # generic NLopt failures raise the standard RuntimeError
    print(f"NLopt (derivative-free) failed: {e}")
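For this particular tutorial problem the constrained optimum is known in closed form: both inequality constraints are active, so 2*x0 = -x0 + 1 gives x* = (1/3, 8/27) and f* = sqrt(8/27) ≈ 0.544. A small standalone check (independent of NLopt) against which the optimizer's output can be compared:

```python
import math

# Analytic optimum: both constraints active => 2*x0 = -x0 + 1 => x0 = 1/3
x_star = (1.0 / 3.0, 8.0 / 27.0)
f_star = math.sqrt(8.0 / 27.0)

# Both constraint residuals h(x) = (a*x0 + b)**3 - x1 should be ~0 at x*
c1 = (2.0 * x_star[0] + 0.0) ** 3 - x_star[1]
c2 = (-1.0 * x_star[0] + 1.0) ** 3 - x_star[1]
print(f"x* = {x_star}, f* = {f_star:.6f}, residuals: {c1:.2e}, {c2:.2e}")
```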
