Blackbox Optimization
The blackboxopt Python package contains blackbox optimization algorithms with a common interface, along with useful helpers like parallel optimization loops, and analysis and visualization tools.
Key Features
Common Interface
The blackboxopt base classes, along with the EvaluationSpecification and Evaluation data classes, specify a unified interface for different blackbox optimization method implementations.
In addition to these interfaces, a standard pytest-compatible test suite is available to ensure functional compatibility of an optimizer implementation with the blackboxopt framework.
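To illustrate the interface contract, here is a deliberately simplified sketch of the two data classes (not the package's actual definitions, which carry more fields): an evaluation specification holds a configuration, and attaching objective values to it yields an evaluation.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class EvaluationSpecification:
    # Simplified stand-in for blackboxopt's EvaluationSpecification.
    configuration: dict
    settings: dict = field(default_factory=dict)

    def create_evaluation(
        self, objectives: dict, user_info: Optional[dict] = None
    ) -> "Evaluation":
        # Attach results to the specification that produced them.
        return Evaluation(
            configuration=self.configuration,
            settings=self.settings,
            objectives=objectives,
            user_info=user_info or {},
        )


@dataclass
class Evaluation(EvaluationSpecification):
    # An evaluation is a specification plus reported results.
    objectives: dict = field(default_factory=dict)
    user_info: dict = field(default_factory=dict)


spec = EvaluationSpecification(configuration={"p1": 0.5})
evaluation = spec.create_evaluation(objectives={"loss": 0.25})
```

The key design point this mirrors is that results always travel together with the configuration and settings that produced them, so an optimizer can consume reported evaluations without extra bookkeeping.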
Optimizers
Aside from random search and a Sobol-sequence-based space-filling method, the main optimizers in this package are Hyperband, BOHB, and a BoTorch-based Bayesian optimization base implementation. BOHB is provided as a cleaner replacement for the former implementation in HpBandSter.
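For intuition on the multi-fidelity methods: Hyperband allocates evaluation budgets via successive halving across several brackets. The following sketch reproduces only the standard bracket bookkeeping from the Hyperband paper, not this package's implementation.

```python
import math


def hyperband_brackets(max_budget: float, eta: int = 3):
    """Return (n_configs, initial_budget) per Hyperband bracket.

    Each bracket starts many configurations on a small budget and
    repeatedly keeps the best 1/eta of them on an eta-times larger
    budget, trading off exploration against exploitation.
    """
    s_max = int(math.log(max_budget) / math.log(eta))
    total_budget = (s_max + 1) * max_budget
    brackets = []
    for s in range(s_max, -1, -1):
        n = math.ceil(total_budget / max_budget * eta**s / (s + 1))
        r = max_budget / eta**s
        brackets.append((n, r))
    return brackets


# With max_budget=81 and eta=3 this yields the classic schedule:
# [(81, 1.0), (34, 3.0), (15, 9.0), (8, 27.0), (5, 81.0)]
print(hyperband_brackets(81, 3))
```

BOHB keeps this budget schedule but replaces the random sampling of configurations with a model-based proposal.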
Optimization Loops
As part of the blackboxopt.optimization_loops module, compatible optimization loop implementations are available both for local, serial execution and for distributed optimization via dask.distributed.
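Conceptually, a serial loop just alternates between asking the optimizer for an evaluation specification and reporting the result back. The sketch below uses hypothetical stand-in names (ToyRandomSearch, run_serial_loop) to show that pattern; it is not the module's actual API.

```python
import random


class ToyRandomSearch:
    """Hypothetical stand-in optimizer following a generate/report pattern."""

    def __init__(self, bounds):
        self.bounds = bounds
        self.reported = []

    def generate_evaluation_specification(self) -> dict:
        # Propose a new configuration by sampling uniformly at random.
        lo, hi = self.bounds
        return {"configuration": {"p1": random.uniform(lo, hi)}}

    def report(self, evaluation: dict) -> None:
        # Record the finished evaluation for later use.
        self.reported.append(evaluation)


def run_serial_loop(optimizer, evaluation_function, max_evaluations: int):
    # Minimal serial loop: ask for a specification, evaluate it,
    # report the result back to the optimizer.
    evaluations = []
    for _ in range(max_evaluations):
        spec = optimizer.generate_evaluation_specification()
        evaluation = evaluation_function(spec)
        optimizer.report(evaluation)
        evaluations.append(evaluation)
    return evaluations


opt = ToyRandomSearch(bounds=(-1.0, 1.0))
results = run_serial_loop(
    opt,
    lambda s: {**s, "objectives": {"loss": s["configuration"]["p1"] ** 2}},
    max_evaluations=10,
)
```

The dask.distributed variant follows the same ask/evaluate/report cycle, but dispatches each evaluation to a worker and reports results back as they complete.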
Visualizations
Interactive visualizations, like objective value over time or duration for single-objective optimization, as well as an objectives pair plot with a highlighted Pareto front for multi-objective optimization, are available as part of the blackboxopt.visualizations module.
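For reference, the Pareto front highlighted in the pair plot consists of the non-dominated evaluations. A minimal filter for two objectives that are both minimized (an illustrative helper, not part of blackboxopt's API) looks like this:

```python
def pareto_front(points):
    """Return indices of non-dominated points for two minimized objectives.

    A point is dominated if another point is at least as good in both
    objectives and strictly better in at least one.
    """
    front = []
    for i, (x1, y1) in enumerate(points):
        dominated = any(
            (x2 <= x1 and y2 <= y1) and (x2 < x1 or y2 < y1)
            for j, (x2, y2) in enumerate(points)
            if j != i
        )
        if not dominated:
            front.append(i)
    return front


# (1, 1) dominates every other point here, so only index 2 survives:
# pareto_front([(1, 3), (2, 2), (1, 1), (3, 1)]) -> [2]
```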
Getting Started
The following example outlines how a quadratic function can be optimized with random search in a distributed manner.
# Copyright (c) 2020 - for information on the respective copyright owner
# see the NOTICE file and/or the repository https://github.com/boschresearch/blackboxopt
#
# SPDX-License-Identifier: Apache-2.0
import parameterspace as ps
try:
    import dask.distributed as dd
except ImportError:
    raise ImportError(
        "Unable to import Dask Distributed specific dependencies. "
        + "Make sure to install blackboxopt[dask]"
    )

from blackboxopt import Evaluation, EvaluationSpecification, Objective
from blackboxopt.optimization_loops.dask_distributed import (
    run_optimization_loop,
)
from blackboxopt.optimizers.random_search import RandomSearch


def evaluation_function(eval_spec: EvaluationSpecification) -> Evaluation:
    return eval_spec.create_evaluation(
        objectives={"loss": eval_spec.configuration["p1"] ** 2},
        user_info={"weather": "sunny"},
    )


if __name__ == "__main__":
    space = ps.ParameterSpace()
    space.add(ps.ContinuousParameter("p1", (-1.0, 1.0)))

    optimizer = RandomSearch(
        space,
        [Objective("loss", greater_is_better=False)],
        max_steps=1000,
    )

    evaluations = run_optimization_loop(
        optimizer, evaluation_function, dd.Client(), max_evaluations=100
    )

    n_successes = len([e for e in evaluations if not e.all_objectives_none])
    print(f"Successfully evaluated {n_successes}/{len(evaluations)}")
License
blackboxopt is open-sourced under the Apache-2.0 license. See the LICENSE file for details.
For a list of other open source components included in blackboxopt, see the file 3rd-party-licenses.txt.