hgboost’s documentation!
The Hyperoptimized Gradient Boosting library (hgboost) is a Python package for hyperparameter optimization of XGBoost, LightBoost, and CatBoost models. hgboost carefully splits the dataset into a train, a test, and an independent validation set. Within the train-test set, an inner loop optimizes the hyperparameters using Bayesian optimization (based on Hyperopt), while an outer loop tests how well the best-performing models generalize using external k-fold cross-validation. This approach selects the most robust model with the highest performance.
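As a minimal sketch of how these pieces map onto the API (the constructor parameters test_size, val_size, cv, and max_eval follow the documented interface, but verify names and defaults against the API reference; the dataset and values here are only illustrative):

    # Minimal sketch, assuming the documented hgboost API.
    from hgboost import hgboost
    from sklearn.datasets import load_breast_cancer

    # A small binary-classification dataset for illustration.
    X, y = load_breast_cancer(return_X_y=True, as_frame=True)

    # test_size/val_size control the train-test and independent validation
    # splits; cv is the outer k-fold loop; max_eval bounds the Bayesian
    # (Hyperopt) search in the inner loop.
    hgb = hgboost(max_eval=100, cv=5, test_size=0.2, val_size=0.2, random_state=42)

    # Optimize an XGBoost classifier; pos_label marks the positive class.
    results = hgb.xgboost(X, y, pos_label=1)
    print(results['params'])  # best hyperparameters found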
hgboost is fun because:
It contains the most popular gradient boosted decision trees: XGBoost, LightBoost, and CatBoost.
It performs Bayesian hyperparameter optimization.
It automates splitting the dataset into train, test, and independent validation sets.
It uses a nested scheme with an inner loop for hyperparameter optimization and an outer loop with k-fold cross-validation to determine the best model.
It handles both classification and regression tasks (see the regression sketch after this list).
It supports multi-class problems and ensembles of boosted decision tree models.
It takes care of unbalanced datasets.
It creates explainable results for the hyperparameter search-space and model performance (see the plots in the sketch after this list).
It is open-source.
It is documented with many examples.
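For regression tasks the same nested scheme applies. The sketch below uses the regression method and plot names as documented (xgboost_reg, plot_params, plot_validation); the eval_metric value and dataset are assumptions for illustration:

    # Sketch of regression and explainability, assuming the documented
    # hgboost methods; check the API reference for exact signatures.
    from hgboost import hgboost
    from sklearn.datasets import load_diabetes

    X, y = load_diabetes(return_X_y=True, as_frame=True)

    hgb = hgboost(max_eval=100, cv=5, random_state=42)
    results = hgb.xgboost_reg(X, y, eval_metric='rmse')

    # Explainable output: the explored hyperparameter search-space and
    # the performance of the best models on the validation set.
    hgb.plot_params()
    hgb.plot_validation()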
Note
Your ❤️ is important to keep maintaining this package. You can support it in various ways; have a look at the sponsor page. Report bugs, issues, and feature requests on the GitHub page.
pip install hgboost