Olivier Grisel - Histogram-based Gradient Boosting in scikit-learn 0.21
"Histogram-based Gradient Boosting in scikit-learn 0.21
EuroPython 2019 - Talk - 2019-07-10 - MongoDB [PyData track]
Basel, CH
By Olivier Grisel
scikit-learn 0.21 was recently released, and this presentation will give an overview of its main new features and present the new implementation of Gradient Boosted Trees.
Gradient Boosted Trees (also known as Gradient Boosting Machines) are very competitive supervised machine learning models, especially on tabular data.
Scikit-learn has offered a traditional implementation of this family of methods for many years. However, its computational performance was no longer competitive: it was dramatically outperformed by specialized state-of-the-art libraries such as XGBoost and LightGBM. The new implementation in version 0.21 uses histograms of binned features to evaluate the tree node split candidates. This implementation can efficiently leverage multi-core CPUs and is competitive with XGBoost and LightGBM.
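To illustrate the histogram trick described above, here is a toy sketch of split finding on one binned feature. This is an illustrative example only, not scikit-learn's actual code: the bin count, the quantile binning strategy, and the XGBoost-style gain formula with a `reg_lambda` regularizer are assumptions chosen to keep the sketch short.

```python
# Toy sketch of histogram-based split finding: bin a feature into small
# integer codes, accumulate gradient/hessian sums per bin, then scan bin
# boundaries for the best split instead of sorting raw feature values.
import numpy as np

def bin_feature(x, n_bins=255):
    """Map raw feature values to integer bin indices via quantiles."""
    quantiles = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.searchsorted(quantiles, x).astype(np.uint8)

def best_split(binned, gradients, hessians, reg_lambda=1.0):
    """Build per-bin gradient/hessian histograms, then scan candidate
    splits (bins <= b go left) for the highest gain."""
    n_bins = int(binned.max()) + 1
    grad_hist = np.bincount(binned, weights=gradients, minlength=n_bins)
    hess_hist = np.bincount(binned, weights=hessians, minlength=n_bins)

    g_total, h_total = gradients.sum(), hessians.sum()
    g_left = h_left = 0.0
    best_gain, best_bin = -np.inf, None
    for b in range(n_bins - 1):
        g_left += grad_hist[b]
        h_left += hess_hist[b]
        g_right, h_right = g_total - g_left, h_total - h_left
        # Standard second-order gain (XGBoost-style) with L2 regularization.
        gain = (g_left**2 / (h_left + reg_lambda)
                + g_right**2 / (h_right + reg_lambda)
                - g_total**2 / (h_total + reg_lambda))
        if gain > best_gain:
            best_gain, best_bin = gain, b
    return best_bin, best_gain

# Synthetic data with a sharp step at x = 0.5, so a good split exists.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = (x > 0.5).astype(float)
gradients = y - 0.5                  # toy gradients for a constant prediction
hessians = np.full_like(x, 0.25)

binned = bin_feature(x)
split_bin, gain = best_split(binned, gradients, hessians)
```

Because each feature is reduced to at most 255 integer codes, the inner loop visits bins rather than sorted samples, which is what makes the histogram approach cache-friendly and easy to parallelize across features.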
We will also introduce pygbm, a Numba-based implementation of gradient boosted trees that served as a prototype for the scikit-learn implementation, and compare the Numba vs. Cython developer experience.
License: This video is licensed under the CC BY-NC-SA 3.0 license: https://creativecommons.org/licenses/by-nc-sa/3.0/
Please see our speaker release agreement for details: https://ep2019.europython.eu/events/speaker-release-agreement/
Video: Olivier Grisel - Histogram-based Gradient Boosting in scikit-learn 0.21, from the EuroPython Conference channel.