Lecture 2.3: Backpropagation, a tensor view (DLVU)
lecturer: Peter Bloem
course website: https://dlvu.github.io
In this video, we work out the backpropagation algorithm in a vectorized version: that is, purely in terms of basic linear algebra operations like matrix and vector multiplication. This helps us to express neural networks in a clean notation, and to accelerate their computation.
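
To make the idea concrete, here is a minimal sketch (not taken from the lecture) of one vectorized backward pass for a single linear layer with a squared-error loss, written in NumPy. The names W, b, x, and t are illustrative assumptions, not the lecture's notation.

import numpy as np

rng = np.random.default_rng(0)

# Dimensions: input size 3, output size 2.
x = rng.standard_normal(3)        # input vector (hypothetical example data)
t = rng.standard_normal(2)        # target vector
W = rng.standard_normal((2, 3))   # weight matrix
b = rng.standard_normal(2)        # bias vector

# Forward pass, purely in matrix/vector operations.
y = W @ x + b                     # model output
loss = 0.5 * np.sum((y - t) ** 2) # squared-error loss

# Backward pass: the same chain rule as in the scalar view, but each
# gradient is now a whole vector or matrix, computed in one operation.
dy = y - t                        # dloss/dy, a vector
dW = np.outer(dy, x)              # dloss/dW, a matrix: dy x^T
db = dy                           # dloss/db
dx = W.T @ dy                     # dloss/dx, passed on to the previous layer

print(loss, dW.shape, dx.shape)

Instead of one chain-rule computation per scalar weight, the whole gradient for W is produced by a single outer product, which is what makes the vectorized view fast on modern hardware.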