{"id":4894,"library":"bayesian-optimization","title":"Bayesian Optimization","description":"The bayesian-optimization library provides a pure-Python implementation of the Bayesian Optimization (BO) algorithm, designed for constrained global optimization of expensive black-box functions. It uses Gaussian process surrogates and Bayesian inference to find the maximum of an unknown function in as few evaluations as possible. The current version is 3.2.1; the project is actively maintained with a regular release cadence and requires Python >=3.9.","status":"active","version":"3.2.1","language":"en","source_language":"en","source_url":"https://github.com/bayesian-optimization/BayesianOptimization","tags":["optimization","bayesian","machine-learning","gaussian-processes","hyperparameter-tuning","black-box-optimization"],"install":[{"cmd":"pip install bayesian-optimization","lang":"bash","label":"Install stable version"}],"dependencies":[],"imports":[{"symbol":"BayesianOptimization","correct":"from bayes_opt import BayesianOptimization"}],"quickstart":{"code":"from bayes_opt import BayesianOptimization\n\ndef black_box_function(x, y):\n    \"\"\"Function with unknown internals we wish to maximize.\n\n    This serves only as an example; for all intents and purposes, think of\n    the internals of this function, i.e. the process that generates its\n    output values, as unknown.\n    \"\"\"\n    # Example: a simple 2D quadratic, maximized at (x=0, y=1) within the bounds below\n    return -x ** 2 - (y - 1) ** 2 + 1\n\n# Bounded region of parameter space\npbounds = {'x': (0, 2), 'y': (0, 3)}\n\noptimizer = BayesianOptimization(\n    f=black_box_function,\n    pbounds=pbounds,\n    random_state=1,\n)\n\n# Evaluate 2 random initial points, then run 5 Bayesian Optimization steps\noptimizer.maximize(\n    init_points=2,\n    n_iter=5,\n)\n\nprint(f\"Best parameters found: {optimizer.max['params']}\")\nprint(f\"Maximum value found: {optimizer.max['target']}\")","lang":"python","description":"This example demonstrates how to set up and run a basic Bayesian Optimization to maximize a simple two-dimensional black-box function. It defines the objective function, specifies the search-space bounds, initializes the optimizer, and runs the maximization for a set number of iterations."},"warnings":[{"fix":"Modify your objective function to return the negative of the value you wish to minimize. For example, `def objective_to_minimize(x): return -my_function(x)`.","message":"The `BayesianOptimization` class finds the *maximum* of the objective function. If your goal is to *minimize* a function `f(x)`, define your objective to return `-f(x)`.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure that every parameter of your objective function has a corresponding `(lower_bound, upper_bound)` tuple in the `pbounds` dictionary passed to `BayesianOptimization`.","message":"This is a constrained optimization technique: you must provide explicit lower and upper bounds for every parameter in the search space (`pbounds`). Omitting bounds for any parameter will result in an error or incorrect behavior.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Consider the computational cost of your objective function. If evaluations are very fast, simpler optimization strategies may be faster overall; Bayesian Optimization shines when minimizing the number of evaluations is critical.","message":"Bayesian Optimization is most effective for computationally *expensive* black-box functions where each evaluation takes significant time (minutes or hours). For very cheap, quickly computable functions it may be less efficient than simpler methods such as grid search or random search, because of the overhead of refitting the surrogate model at each step.","severity":"gotcha","affected_versions":"All versions"},{"fix":"For higher-dimensional problems, consider dimensionality reduction, feature selection, or specialized Bayesian Optimization variants designed for high-dimensional spaces (e.g., those using random embeddings or sparse structures).","message":"Standard Bayesian Optimization, particularly with Gaussian process surrogates, struggles in high-dimensional search spaces. Performance tends to degrade significantly beyond roughly 20 dimensions because the curse of dimensionality undermines the surrogate model.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}