Lesson 3 - Deep Learning for Coders (2020)
NB: We recommend watching these videos through https://course.fast.ai rather than directly on YouTube, to get access to the searchable transcript, interactive notebooks, setup guides, questionnaires, and so forth.
Today we finish creating and deploying our own app. We discuss data augmentation, and look at the most important types of augmentation used in modern computer vision models. We also see how fastai helps you process your images to get them ready for your model.
We look at building GUIs, both for interactive apps inside notebooks, and also for standalone web applications. We discuss how to deploy web applications that incorporate deep learning models. In doing so, we look at the pros and cons of different approaches, such as server-based and edge-device deployment.
Finally, for productionization, we look at what can go wrong, how to avoid problems, and how to keep your data product working effectively in practice.
Then we skip over to chapter 4 of the book, and learn about the underlying math and code of Stochastic Gradient Descent, which lies at the heart of neural network training.
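The chapter 4 material walks through the math and code of gradient descent step by step. As a rough preview, here is a minimal sketch of the idea in plain Python (an assumption: the lesson itself uses PyTorch's autograd, whereas here the gradients of a mean-squared-error loss are written out by hand so the example needs no libraries):

```python
# Toy data generated by y = 2x + 1; gradient descent should recover w≈2, b≈1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

w, b, lr = 0.0, 0.0, 0.05          # initial parameters and learning rate

for step in range(2000):
    # forward pass: predictions and errors under the current parameters
    preds = [w * x + b for x in xs]
    errs = [p - y for p, y in zip(preds, ys)]
    # gradients of the MSE loss with respect to w and b, derived analytically
    grad_w = 2 * sum(e * x for e, x in zip(errs, xs)) / len(xs)
    grad_b = 2 * sum(errs) / len(xs)
    # the gradient descent update: step against the gradient
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))    # approaches 2.0 and 1.0
```

The loop is the whole algorithm: predict, measure the loss, compute gradients, step the parameters, repeat.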
0:00 - Recap of Lesson 2 + What's next
1:08 - Resizing Images with DataBlock
8:46 - Data Augmentation and item_tfms vs batch_tfms
12:28 - Training your model, and using it to clean your data
18:07 - Turning your model into an online application
36:12 - Deploying to a mobile phone
38:13 - How to avoid disaster
50:59 - Unforeseen consequences and feedback loops
57:20 - End of Chapter 2 Recap + Blogging
1:04:09 - Starting MNIST from scratch
1:06:58 - untar_data and path explained
1:10:57 - Exploring the MNIST data
1:12:05 - NumPy Array vs PyTorch Tensor
1:16:00 - Creating a simple baseline model
1:28:38 - Working with arrays and tensors
1:30:50 - Computing metrics with Broadcasting
1:39:46 - Stochastic Gradient Descent (SGD)
1:54:40 - End-to-end Gradient Descent example
2:01:56 - MNIST loss function
2:04:40 - Lesson 3 review
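The "Computing metrics with Broadcasting" segment compares every validation image against an "ideal" digit in one vectorised expression. A hedged sketch of that broadcasting idea, using random arrays as stand-ins for the MNIST data (the lesson works with PyTorch tensors; NumPy follows the same broadcasting rules):

```python
import numpy as np

rng = np.random.default_rng(0)
ideal_3 = rng.random((28, 28))       # stand-in for the mean of all the 3s
batch = rng.random((100, 28, 28))    # stand-in for a batch of validation images

# (100, 28, 28) - (28, 28): the single ideal image is broadcast across the
# batch dimension, so no Python loop over the 100 images is needed.
dist = np.abs(batch - ideal_3).mean(axis=(1, 2))   # one L1 distance per image

print(dist.shape)                    # (100,)
```

Averaging over axes 1 and 2 collapses each 28x28 difference image to a single distance, leaving one number per image in the batch.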
Video: Lesson 3 - Deep Learning for Coders (2020), from the Jeremy Howard channel