h2otools: Machine Learning Model Evaluation for 'h2o' Package

Description

Several functions are provided that simplify using the 'h2o' package. Currently, a function for extracting AutoML model parameters is provided, alongside a function for computing F-Measure statistics at any given threshold. For more information about the 'h2o' package, see <https://h2o.ai/>.
README.md

h2otools: Machine Learning Model Evaluation for 'h2o' Package
Model evaluation
There are plenty of procedures for evaluating machine learning models, many of which are not implemented in the h2o platform. This repository provides additional model-evaluation functions that are missing from h2o.
The bootperformance function evaluates the model on n bootstrapped samples drawn from the testing dataset, instead of evaluating it on the testing dataset only once. This makes it possible to estimate a confidence interval for the model's performance (see the sketch below).
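To illustrate the underlying idea (this is not the package's actual bootperformance() interface, which may differ), one could resample the test frame and score the model repeatedly with plain h2o calls:

```r
# Minimal sketch of bootstrapped performance evaluation, assuming
# 'model' is a fitted H2O binomial model and 'test' is an H2OFrame.
library(h2o)
h2o.init()

n    <- 100                # number of bootstrap samples
aucs <- numeric(n)
for (i in seq_len(n)) {
  rows    <- sample(nrow(test), replace = TRUE)        # bootstrap row indices
  perf    <- h2o.performance(model, newdata = test[rows, ])
  aucs[i] <- h2o.auc(perf)                             # collect AUC per sample
}
quantile(aucs, c(0.025, 0.975))  # 95% percentile confidence interval
```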
These functions are briefly described below:
| Function | Description |
|---|---|
| automlModelParam | Extracts model parameters from an AutoML grid |
| bootperformance | Bootstrap performance evaluation |
| checkFrame | Checks data.frame format, which is useful before uploading it to the H2O cloud |
| Fmeasure | Evaluates F3, F4, F5, or any other beta value; h2o only provides F0.5, F1, and F2 (see the sketch after this table) |
| getPerfMatrix | Retrieves the performance matrix for all thresholds |
| kappa | Calculates kappa for all thresholds |
| performance | Provides performance measures (AUC, AUCPR, MCC, Kappa, etc.) using objects from the h2o package |
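For reference, the F-beta score that Fmeasure generalizes weights recall beta times as much as precision. A small, self-contained sketch (the precision and recall values here are made up for illustration):

```r
# General F-beta score: beta > 1 favors recall, beta < 1 favors precision.
fbeta <- function(precision, recall, beta = 1) {
  (1 + beta^2) * precision * recall / (beta^2 * precision + recall)
}

fbeta(precision = 0.80, recall = 0.60, beta = 1)  # F1 ~ 0.686
fbeta(precision = 0.80, recall = 0.60, beta = 3)  # F3 ~ 0.615, recall-heavy
```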
Installation
You can install the latest stable version from CRAN:

```r
install.packages("h2otools")
```
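After installation, a typical session would load the package alongside h2o and start a local instance (a minimal assumed setup):

```r
library(h2o)
library(h2otools)

h2o.init()  # start or connect to a local H2O cluster
```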