Boosting - EXPLAINED!
REFERENCES
[1] A Short Introduction to Boosting (Freund & Schapire, 1999): https://cseweb.ucsd.edu/~yfreund/papers/IntroToBoosting.pdf
[2] A Theory of the Learnable (Valiant, 1984): http://web.mit.edu/6.435/www/Valiant84.pdf (This paper introduced the PAC learning model)
[3] PAC Learning Model: https://www.youtube.com/watch?v=mztE3UHA8DU
[4] Cryptographic Limitations on Learning Boolean Formulae & Finite Automata (Kearns & Valiant, 1988): https://www.cis.upenn.edu/~mkearns/papers/crypto.pdf (This paper defined weak learnability)
[5] The strength of weak learnability (Schapire, 1990): http://rob.schapire.net/papers/strengthofweak.pdf
[6] A gentle intro to weak learners: https://www.cs.ox.ac.uk/people/varun.kanade/teaching/AML-HT2017/lectures/lecture04.pdf
[7] Boosting a weak learning algorithm by majority (Freund, 1995): https://pdfs.semanticscholar.org/d620/946e24eee13bc3bdd5ceb0f90a3dc4bc4a54.pdf
[8] Adaptive Boosting (AdaBoost), introduced in Section 4 of "A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting" (Freund & Schapire, 1995): http://rob.schapire.net/papers/FreundSc95.pdf
[9] AdaBoost & overfitting discussion: https://stats.stackexchange.com/questions/20622/is-adaboost-less-or-more-prone-to-overfitting
[10] Gradient Boosting, "Greedy Function Approximation: A Gradient Boosting Machine" (Friedman, 2001): https://statweb.stanford.edu/~jhf/ftp/trebst.pdf
[11] Why boosting keeps improving generalization even after training error reaches zero (the margins explanation): https://www.cc.gatech.edu/~isbell/tutorials/boostingmargins.pdf
[12] Difference between AdaBoost & gradient boosting: https://www.quora.com/What-is-the-difference-between-gradient-boosting-and-adaboost
[13] AdaBoost vs. gradient boosting: https://subscription.packtpub.com/book/big_data_and_business_intelligence/9781788295758/4/ch04lvl1sec34/comparison-between-adaboosting-versus-gradient-boosting
[14] XGBoost: A Scalable Tree Boosting System (Chen & Guestrin, 2016): https://arxiv.org/abs/1603.02754
[15] Compressed Sparse Column (CSC) format, used to store data in XGBoost (a short illustration follows this list): https://software.intel.com/en-us/mkl-developer-reference-c-sparse-blas-csc-matrix-storage-format
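Since reference [15] is just a storage-format spec, here is a minimal illustration of CSC in Python. Using scipy's csc_matrix here is my own choice for clarity; XGBoost's internal CSC handling is its own C++ implementation and differs in detail:

import numpy as np
from scipy.sparse import csc_matrix

# A small, mostly-zero matrix.
dense = np.array([[1, 0, 0],
                  [0, 0, 2],
                  [0, 3, 0]])

m = csc_matrix(dense)

# CSC stores three arrays: the non-zero values in column order, the row
# index of each value, and pointers marking where each column begins.
print(m.data)     # [1 3 2]
print(m.indices)  # [0 2 1]   (row index of each stored value)
print(m.indptr)   # [0 1 2 3] (column j spans data[indptr[j]:indptr[j+1]])

Storing by column makes scanning one feature at a time cheap, which is how the column-block structure described in [14] speeds up split finding.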
CODE
[1] Starter code with built-in libraries: https://repl.it/@PulkitSharma1/Boosting-algorithms (independent minimal sketches of the AdaBoost and gradient boosting loops follow below)
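The repl.it link above is the original starter code. Separately from it, here is a minimal from-scratch sketch of the AdaBoost loop described in [7] and [8]. The synthetic dataset, the 50-round cap, and the decision-stump weak learner are illustrative assumptions, not anything prescribed by the video:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
y = 2 * y - 1                          # relabel classes as -1 / +1

n = len(y)
w = np.full(n, 1.0 / n)                # start with uniform sample weights
stumps, alphas = [], []

for t in range(50):
    stump = DecisionTreeClassifier(max_depth=1)   # a weak learner
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()                      # weighted training error
    if err <= 1e-12 or err >= 0.5:                # perfect, or no better than chance
        break
    alpha = 0.5 * np.log((1 - err) / err)         # this learner's vote weight
    w *= np.exp(-alpha * y * pred)                # up-weight misclassified points
    w /= w.sum()                                  # renormalize to a distribution
    stumps.append(stump)
    alphas.append(alpha)

# Final prediction: sign of the weighted majority vote.
F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", (np.sign(F) == y).mean())

For contrast (the point of [12] and [13]), a matching sketch of gradient boosting for squared error, per [10]: instead of re-weighting samples, each new tree is fit to the residuals of the current ensemble, i.e. the negative gradient of the loss. Again, the depth, learning rate, and round count are arbitrary choices:

import numpy as np
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, noise=10.0, random_state=0)

lr = 0.1                               # shrinkage (learning rate)
F = np.full(len(y), y.mean())          # start from a constant prediction
trees = []

for t in range(100):
    residual = y - F                   # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
    F += lr * tree.predict(X)          # nudge the ensemble toward the target
    trees.append(tree)

print("final training MSE:", np.mean((y - F) ** 2))

The contrast is the takeaway from [12]: AdaBoost adapts sample weights and weak-learner votes, while gradient boosting performs gradient descent in function space.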
IMAGE RESOURCES
[1] ConvNet: https://missinglink.ai/guides/convolutional-neural-networks/convolutional-neural-network-tutorial-basic-advanced/
HIPPY COWBOY MUSIC
[1] Cowboy Sting by Kevin MacLeod is licensed under a Creative Commons Attribution license (https://creativecommons.org/licenses/by/4.0/)
Source: http://incompetech.com/music/royalty-free/index.html?isrc=USUAN1400015
Artist: http://incompetech.com/
Video "Boosting - EXPLAINED!" from the CodeEmporium channel