Description

Search Spaces for 'mlr3'.

mlr3tuningspaces

Package website: release | dev

mlr3tuningspaces is a collection of search spaces for hyperparameter optimization in the mlr3 ecosystem. It features ready-to-use search spaces for many popular machine learning algorithms. The search spaces are from scientific articles and work for a wide range of data sets. Currently, we offer tuning spaces from three publications.

Publication                            Learner   n Hyperparameters
Bischl et al. (2021)                   glmnet    2
                                       kknn      3
                                       ranger    4
                                       rpart     3
                                       svm       4
                                       xgboost   8
Kuehn et al. (2018)                    glmnet    2
                                       kknn      1
                                       ranger    8
                                       rpart     4
                                       svm       5
                                       xgboost   13
Binder, Pfisterer, and Bischl (2020)   glmnet    2
                                       kknn      1
                                       ranger    6
                                       rpart     4
                                       svm       4
                                       xgboost   10
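
All of these spaces are registered in the mlr_tuning_spaces dictionary, which is covered in more detail below. As a quick orientation, a sketch that lists the registered keys, assuming the dictionary provides the usual mlr3misc $keys() method:

library(mlr3tuningspaces)

# list the keys of all registered tuning spaces,
# e.g. "classif.rpart.rbv2" as used in the example below
mlr_tuning_spaces$keys()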

Resources

There are several sections about hyperparameter optimization in the mlr3book.

  • Get started with the book section on mlr3tuningspaces.
  • Learn about search spaces.

The gallery features a collection of case studies and demos about optimization.

  • Tune a classification tree with the default tuning space from Bischl et al. (2021).

Installation

Install the latest release from CRAN:

install.packages("mlr3tuningspaces")

Install the development version from GitHub:

remotes::install_github("mlr-org/mlr3tuningspaces")

Example

Quick Tuning

A learner passed to the lts() function is augmented with the default tuning space from Bischl et al. (2021).

library(mlr3tuningspaces)

learner = lts(lrn("classif.rpart"))

# tune learner on pima data set
instance = tune(
  tnr("random_search"),
  task = tsk("pima"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result
##    minsplit minbucket        cp learner_param_vals  x_domain classif.ce
## 1: 1.966882  3.038246 -4.376785          <list[4]> <list[3]>  0.2265625
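
The tuned configuration can then be written back into a fresh learner for final training. A minimal sketch using instance$result_learner_param_vals from mlr3tuning, which holds the best configuration on the original (back-transformed) scale:

# train a final model on the full task with the best configuration
final_learner = lrn("classif.rpart")
final_learner$param_set$values = instance$result_learner_param_vals
final_learner$train(tsk("pima"))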

Tuning Search Spaces

The mlr_tuning_spaces dictionary contains all tuning spaces.

library("data.table")

# print keys and tuning spaces
as.data.table(mlr_tuning_spaces)
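
The returned table can be subset like any data.table, for example to list only the spaces of a single learner; the learner column name is an assumption based on the printed dictionary:

# show only the tuning spaces registered for the rpart classifier
as.data.table(mlr_tuning_spaces)[learner == "classif.rpart"]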

Passing a key to the lts() function returns the corresponding TuningSpace.

tuning_space = lts("classif.rpart.rbv2")
tuning_space
## <TuningSpace:classif.rpart.rbv2>: Classification Rpart with RandomBot
##           id lower upper levels logscale
## 1:        cp 1e-04     1            TRUE
## 2:  maxdepth 1e+00    30           FALSE
## 3: minbucket 1e+00   100           FALSE
## 4:  minsplit 1e+00   100           FALSE
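
Parameters with logscale = TRUE are tuned on a log-transformed scale and exponentiated before they are passed to the learner. This is why cp appears as a negative value in instance$result above; back-transforming the value from the example output:

# cp was tuned on the log scale, so exp() recovers the actual value
exp(-4.376785)  # ~0.0126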

Get the learner with the tuning space already applied:

tuning_space$get_learner()
## <LearnerClassifRpart:classif.rpart>: Classification Tree
## * Model: -
## * Parameters: xval=0, cp=<RangeTuneToken>, maxdepth=<RangeTuneToken>,
##   minbucket=<RangeTuneToken>, minsplit=<RangeTuneToken>
## * Packages: mlr3, rpart
## * Predict Types:  [response], prob
## * Feature Types: logical, integer, numeric, factor, ordered
## * Properties: importance, missings, multiclass, selected_features, twoclass, weights
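
A tuning space can also be attached to a manually constructed learner. A sketch, assuming the TuningSpace exposes its tune tokens in the $values field (note that assigning to $values replaces previously set parameters such as xval):

# attach the tuning space's tune tokens to a fresh learner
learner = lrn("classif.rpart")
learner$param_set$values = tuning_space$values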

Adding New Tuning Spaces

We welcome new collections of tuning spaces from peer-reviewed articles. You can suggest new tuning spaces in an issue or contribute a new collection yourself in a pull request. Take a look at an already implemented collection, e.g. our default tuning spaces from Bischl et al. (2021). A TuningSpace is added to the mlr_tuning_spaces dictionary with the add_tuning_space() function. Create a tuning space for each variant of the learner, e.g. for LearnerClassifRpart and LearnerRegrRpart.

vals = list(
  minsplit  = to_tune(2, 64, logscale = TRUE),
  cp        = to_tune(1e-04, 1e-1, logscale = TRUE)
)

add_tuning_space(
  id = "classif.rpart.example",
  values = vals,
  tags = c("default", "classification"),
  learner = "classif.rpart",
  label = "Classification Tree Example"
)
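
After registration, the new space is retrievable through the dictionary like the shipped ones:

# retrieve the newly registered tuning space by its key
lts("classif.rpart.example")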

Choose a name that is related to the publication and adjust the documentation.

The reference is added to the bibentries.R file:

bischl_2021 = bibentry("misc",
  key           = "bischl_2021",
  title         = "Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges",
  author        = "Bernd Bischl and Martin Binder and Michel Lang and Tobias Pielok and Jakob Richter and Stefan Coors and Janek Thomas and Theresa Ullmann and Marc Becker and Anne-Laure Boulesteix and Difan Deng and Marius Lindauer",
  year          = "2021",
  eprint        = "2107.05847",
  archivePrefix = "arXiv",
  primaryClass  = "stat.ML",
  url           = "https://arxiv.org/abs/2107.05847"
)

We are happy to help you with the pull request if you have any questions.

References

Binder, Martin, Florian Pfisterer, and Bernd Bischl. 2020. “Collecting Empirical Data about Hyperparameters for Data Driven AutoML.” https://www.automl.org/wp-content/uploads/2020/07/AutoML_2020_paper_63.pdf.

Bischl, Bernd, Martin Binder, Michel Lang, Tobias Pielok, Jakob Richter, Stefan Coors, Janek Thomas, et al. 2021. “Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges.” https://arxiv.org/abs/2107.05847.

Kuehn, Daniel, Philipp Probst, Janek Thomas, and Bernd Bischl. 2018. “Automatic Exploration of Machine Learning Experiments on OpenML.” https://arxiv.org/abs/1806.10961.

Metadata

Version

0.5.1

License

Unknown

Platforms (75)

  • aarch64-darwin
  • aarch64-genode
  • aarch64-linux
  • aarch64-netbsd
  • aarch64-none
  • aarch64_be-none
  • arm-none
  • armv5tel-linux
  • armv6l-linux
  • armv6l-netbsd
  • armv6l-none
  • armv7a-darwin
  • armv7a-linux
  • armv7a-netbsd
  • armv7l-linux
  • armv7l-netbsd
  • avr-none
  • i686-cygwin
  • i686-darwin
  • i686-freebsd
  • i686-genode
  • i686-linux
  • i686-netbsd
  • i686-none
  • i686-openbsd
  • i686-windows
  • javascript-ghcjs
  • loongarch64-linux
  • m68k-linux
  • m68k-netbsd
  • m68k-none
  • microblaze-linux
  • microblaze-none
  • microblazeel-linux
  • microblazeel-none
  • mips-linux
  • mips-none
  • mips64-linux
  • mips64-none
  • mips64el-linux
  • mipsel-linux
  • mipsel-netbsd
  • mmix-mmixware
  • msp430-none
  • or1k-none
  • powerpc-netbsd
  • powerpc-none
  • powerpc64-linux
  • powerpc64le-linux
  • powerpcle-none
  • riscv32-linux
  • riscv32-netbsd
  • riscv32-none
  • riscv64-linux
  • riscv64-netbsd
  • riscv64-none
  • rx-none
  • s390-linux
  • s390-none
  • s390x-linux
  • s390x-none
  • vc4-none
  • wasm32-wasi
  • wasm64-wasi
  • x86_64-cygwin
  • x86_64-darwin
  • x86_64-freebsd
  • x86_64-genode
  • x86_64-linux
  • x86_64-netbsd
  • x86_64-none
  • x86_64-openbsd
  • x86_64-redox
  • x86_64-solaris
  • x86_64-windows