Week 13 – Lecture: Graph Convolutional Networks (GCNs)
Course website: http://bit.ly/pDL-home
Playlist: http://bit.ly/pDL-YouTube
Speaker: Xavier Bresson
Week 13: http://bit.ly/pDL-en-13
0:00:00 – Week 13 – Lecture
LECTURE Part A: http://bit.ly/pDL-en-13-1
In this section, we discuss the architecture and the convolution operation of traditional convolutional neural networks, then extend both to the graph domain. We examine the characteristics of graphs and define graph convolution. Finally, we introduce spectral graph convolutional neural networks and discuss how to perform spectral convolution.
0:00:50 – Architecture of Traditional ConvNets
0:13:11 – Convolution of Traditional ConvNets
0:25:29 – Spectral Convolution
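The spectral convolution discussed above replaces the usual Fourier basis with the eigenvectors of the graph Laplacian: transform the node signal into the spectral domain, multiply eigenvalue-wise by a filter, and transform back. A minimal NumPy sketch on a toy 4-node path graph (the graph, the signal, and the low-pass filter `g` are illustrative choices, not from the lecture):

```python
import numpy as np

# Toy undirected graph: a 4-node path, given by its adjacency matrix A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Combinatorial graph Laplacian L = D - A.
D = np.diag(A.sum(axis=1))
L = D - A

# Eigendecomposition L = U diag(lam) U^T: U is the graph Fourier basis.
lam, U = np.linalg.eigh(L)

# A signal on the nodes and a spectral filter g(lam) (here an example
# low-pass filter; any function of the eigenvalues would do).
x = np.array([1.0, 0.0, 0.0, 0.0])
g = np.exp(-lam)

# Spectral convolution: Fourier transform (U^T x), filter eigenvalue-wise,
# then inverse transform back to the node domain.
y = U @ (g * (U.T @ x))
```

Filtering eigenvalue-wise in the spectral domain is equivalent to applying the matrix function `U diag(g) U^T` directly to the signal, which is why the filter is fully characterised by its values on the Laplacian spectrum.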
LECTURE Part B: http://bit.ly/pDL-en-13-2
This section covers the full spectrum of Graph Convolutional Networks (GCNs), starting with the implementation of spectral convolution through Spectral Networks. It then shows how the other definition of convolution, template matching, applies to graphs, leading to Spatial Networks. Architectures based on the two approaches are detailed, along with their pros and cons, experiments, benchmarks, and applications.
0:44:30 – Spectral GCNs
1:06:04 – Template Matching, Isotropic GCNs and Benchmarking GNNs
1:33:06 – Anisotropic GCNs and Conclusion
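An isotropic spatial GCN layer of the kind covered in Part B weights every neighbour the same way, by degree only, before the learned linear map and nonlinearity. A minimal sketch in the style of the Kipf & Welling GCN layer, ReLU(D̂^{-1/2} (A + I) D̂^{-1/2} H W), on the same kind of toy graph (features and weights are random placeholders, not trained values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: a 4-node path, adjacency matrix A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

A_hat = A + np.eye(4)                     # add self-loops
d = A_hat.sum(axis=1)                     # degrees of A_hat
D_inv_sqrt = np.diag(d ** -0.5)
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalisation

H = rng.standard_normal((4, 3))           # node features: 4 nodes, 3 dims
W = rng.standard_normal((3, 2))           # weight matrix (illustrative)

# Isotropic aggregation: each node averages its (degree-normalised)
# neighbourhood, independent of neighbour features or edge direction.
H_next = np.maximum(A_norm @ H @ W, 0.0)  # ReLU(A_norm H W)
```

Anisotropic variants such as attention-based GCNs replace the fixed degree weights in `A_norm` with learned, feature-dependent edge weights, which is the distinction the final segment of the lecture draws.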
Video "Week 13 – Lecture: Graph Convolutional Networks (GCNs)" from the Alfredo Canziani channel