
PyTorch Tutorial 03 - Gradient Calculation With Autograd

New Tutorial series about Deep Learning with PyTorch!
⭐ Check out Tabnine, the FREE AI-powered code completion tool I use to help me code faster: https://www.tabnine.com/?utm_source=youtube.com&utm_campaign=PythonEngineer *

In this part, we learn how to calculate gradients using the autograd package in PyTorch.
This tutorial covers the following topics (short code sketches for each follow the list):

- requires_grad attribute for Tensors
- Computational graph
- Backpropagation (brief explanation)
- How to stop autograd from tracking history
- How to zero (empty) gradients
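
To give a quick taste of the first three topics, here is a minimal sketch (the shapes and values are picked here just for illustration, they are not from the video): requires_grad=True tells autograd to record every operation on the tensor in a computational graph, and backward() then backpropagates through that graph and writes the result into .grad.

```python
import torch

# requires_grad=True makes autograd track operations on this tensor
x = torch.randn(3, requires_grad=True)

# each operation adds a node to the computational graph (see grad_fn)
y = x + 2               # y.grad_fn is an AddBackward node
z = (y * y * 2).mean()  # z is a scalar

# backpropagation: compute dz/dx and accumulate it into x.grad
z.backward()
print(x.grad)           # gradient tensor with the same shape as x
```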

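For the fourth topic, PyTorch has three standard ways to stop autograd from tracking history; the tensor below is only a placeholder to demonstrate them.

```python
import torch

x = torch.randn(3, requires_grad=True)

# option 1: flip the flag in place
x.requires_grad_(False)
x.requires_grad_(True)   # turn it back on for the next examples

# option 2: get a new tensor detached from the graph (shares the same data)
y = x.detach()
print(y.requires_grad)   # False

# option 3: temporarily disable tracking, e.g. during evaluation
with torch.no_grad():
    z = x + 2
print(z.requires_grad)   # False
```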
Part 03: Gradient Calculation With Autograd
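
And for the last topic in the list above: backward() accumulates gradients into .grad, so they must be reset on every training iteration, either directly with .grad.zero_() or through an optimizer's zero_grad(). A minimal sketch, with a made-up "loss" just for illustration:

```python
import torch

weights = torch.ones(4, requires_grad=True)

for epoch in range(3):
    # dummy computation so there is something to backpropagate
    model_output = (weights * 3).sum()
    model_output.backward()

    print(weights.grad)   # without zeroing, these values would keep growing

    # reset the accumulated gradients before the next iteration
    weights.grad.zero_()

# when using an optimizer, the equivalent call is:
optimizer = torch.optim.SGD([weights], lr=0.01)
optimizer.step()
optimizer.zero_grad()
```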

📚 Get my FREE NumPy Handbook:
https://www.python-engineer.com/numpybook

📓 Notebooks available on Patreon:
https://www.patreon.com/patrickloeber

⭐ Join Our Discord: https://discord.gg/FHMg9tKFSN

If you enjoyed this video, please subscribe to the channel!

Official website:
https://pytorch.org/

Part 01:
https://youtu.be/EMXfZB8FVUA

You can find me here:
Website: https://www.python-engineer.com
Twitter: https://twitter.com/python_engineer
GitHub: https://github.com/python-engineer

#Python #DeepLearning #Pytorch

----------------------------------------------------------------------------------------------------------
* This is a sponsored link. Clicking it does not cost you anything extra; instead, you support me and my project. Thank you so much for the support! 🙏

Video information
Published: December 25, 2019, 14:44:56
Duration: 00:15:54