End-to-End: Automated Hyperparameter Tuning For Deep Neural Networks
In this video, I show you how to do #HyperparameterOptimization for a #NeuralNetwork automatically using Optuna. This is an end-to-end video in which I select a problem, design a neural network in #PyTorch, and then find the optimal number of layers, dropout, learning rate, and other parameters using Optuna.
The dataset used in this video can be found here: https://www.kaggle.com/c/lish-moa
Please subscribe and like the video to help keep me motivated to make awesome videos like this one. :)
00:00 Introduction
01:56 Dataset class
06:17 Start with train.py
08:19 Cross-validation folds
13:38 Reading the data
24:10 Engine
29:48 Model
35:10 Add model and engine to training
43:05 Optuna
49:02 Start tuning with Optuna
52:50 Training, suggestions and outro
To buy my book, Approaching (Almost) Any Machine Learning problem, please visit: https://bit.ly/buyaaml
Follow me on:
Twitter: https://twitter.com/abhi1thakur
LinkedIn: https://www.linkedin.com/in/abhi1thakur/
Kaggle: https://kaggle.com/abhishek
Instagram: https://instagram.com/abhi4ml
Video "End-to-End: Automated Hyperparameter Tuning For Deep Neural Networks" from the channel Abhishek Thakur