mlrMBO: Bayesian Optimization and Model-Based Optimization of Expensive Black-Box Functions
Model-based optimization with mlr.
Package website: mlrmbo.mlr-org.com
Installation
We recommend installing the official release version:
install.packages("mlrMBO")
For experimental use, you can install the latest development version:
remotes::install_github("mlr-org/mlrMBO")
Introduction
mlrMBO is a highly configurable R toolbox for model-based / Bayesian optimization of black-box functions.
Features:
- EGO-type algorithms (Kriging with expected improvement) on purely numerical search spaces (see Jones et al., 1998); a minimal run is sketched after this list
- Mixed search spaces with numerical, integer, categorical and subordinate parameters
- Arbitrary parameter transformations, allowing optimization on, e.g., a log scale
- Optimization of noisy objective functions
- Multi-criteria optimization with approximated Pareto fronts
- Parallelization through multi-point batch proposals (see the multi-point sketch after this list)
- Parallelization on many parallel back-ends and clusters through batchtools and parallelMap
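A minimal end-to-end run on a purely numerical search space, as referenced in the feature list above. This is a hedged sketch, not the package's canonical example: the objective function, bounds, and budget are illustrative choices.
library(mlrMBO)
library(smoof)
library(ParamHelpers)
# Toy objective: any smoof function works; this one is made up for illustration.
obj.fun = makeSingleObjectiveFunction(
  name = "toy",
  fn = function(x) sin(x) + 0.1 * x^2,
  par.set = makeNumericParamSet("x", len = 1, lower = -5, upper = 5)
)
# Initial Latin hypercube design plus an EGO-style control: Kriging surrogate
# (chosen automatically for numeric spaces) with expected improvement.
des = generateDesign(n = 8, par.set = getParamSet(obj.fun))
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10)
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI())
res = mbo(obj.fun, design = des, control = ctrl)
res$x  # best parameter setting found
res$y  # corresponding objective value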
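The multi-point and parallelization features combine naturally: propose several points per iteration and evaluate them in parallel through a parallelMap back-end. Another hedged sketch, assuming the obj.fun objective defined in the sketch above; the constant-liar ("cl") method, batch size, and multicore back-end are illustrative choices.
library(parallelMap)
ctrl = makeMBOControl(propose.points = 2)  # batch size per iteration
ctrl = setMBOControlTermination(ctrl, iters = 5)
ctrl = setMBOControlMultiPoint(ctrl, method = "cl", cl.lie = min)  # constant liar
parallelStartMulticore(cpus = 2)  # forking back-end (Linux/macOS)
res = mbo(obj.fun, control = ctrl)
parallelStop()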
For the surrogate, mlrMBO allows any regression learner from mlr (swapping the surrogate is sketched after this list), including:
- Kriging, aka Gaussian processes (e.g., DiceKriging)
- Random forests (e.g., randomForest)
- and many more…
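Any mlr regression learner that supports uncertainty estimation (predict.type = "se") can serve as the surrogate. A hedged sketch of passing a random forest instead of the default Kriging model; obj.fun and ctrl are assumed to come from the earlier sketch.
library(mlr)
# Random forest surrogate with standard-error prediction enabled.
surrogate = makeLearner("regr.randomForest", predict.type = "se")
res = mbo(obj.fun, learner = surrogate, control = ctrl)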
Various infill criteria (aka acquisition functions) are available (selecting one is sketched after the list):
- Expected improvement (EI)
- Upper/Lower confidence bound (LCB, aka. statistical lower or upper bound)
- Augmented expected improvement (AEI)
- Expected quantile improvement (EQI)
- API for custom infill criteria
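The infill criterion is configured on the control object. A hedged sketch of switching between expected improvement and the confidence bound; the lambda value is an illustrative choice that trades off exploration against exploitation.
ctrl = makeMBOControl()
# Expected improvement:
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI())
# Or, for more exploration, the confidence bound with factor lambda:
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritCB(cb.lambda = 2))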
Objective functions are created with package smoof, which also offers many test functions for example runs or benchmarks.
Parameter spaces and initial designs are created with package ParamHelpers.
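To illustrate the mixed search spaces and parameter transformations from the feature list, here is a hedged sketch of a ParamHelpers parameter set with a numeric parameter searched on a log scale and a categorical parameter; all names and values are made up for illustration.
library(smoof)
library(ParamHelpers)
ps = makeParamSet(
  makeNumericParam("cost", lower = -5, upper = 5, trafo = function(x) 2^x),  # searched on a log2 scale
  makeDiscreteParam("kernel", values = c("rbf", "poly"))
)
obj.mixed = makeSingleObjectiveFunction(
  name = "mixed-toy",
  fn = function(x) x$cost^2 + (x$kernel == "poly"),  # x arrives as a named list
  par.set = ps,
  has.simple.signature = FALSE  # needed so fn receives a list, not a vector
)
des = generateDesign(n = 10, par.set = ps)
For a mixed space like this, mlrMBO cannot use Kriging and by default falls back to a random forest surrogate.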
How to Cite
Please cite our arXiv paper (preprint). You can get citation info via citation("mlrMBO") or copy the following BibTeX entry:
@article{mlrMBO,
  title         = {{mlrMBO}: A Modular Framework for Model-Based Optimization of Expensive Black-Box Functions},
  author        = {Bischl, Bernd and Richter, Jakob and Bossek, Jakob and Horn, Daniel and Thomas, Janek and Lang, Michel},
  year          = {2017},
  eprint        = {1703.03373},
  archivePrefix = {arXiv},
  primaryClass  = {stat},
  url           = {https://arxiv.org/abs/1703.03373},
}
Some parts of the package were developed in the course of other publications. If you use these parts, please cite the relevant work:
- Multi-point proposals, including the new multi-objective infill criteria: MOI-MBO: Multiobjective Infill for Parallel Model-Based Optimization
- Multi-objective optimization: Model-Based Multi-objective Optimization: Taxonomy, Multi-Point Proposal, Toolbox and Benchmark
- Multi-objective optimization with categorical variables using the random forest as a surrogate: Multi-objective parameter configuration of machine learning algorithms using model-based optimization.