
GraphSAGE to GraphBERT - Theory of Graph Neural Networks

Starting with the differentiable aggregator functions of GraphSAGE, the talk moves to permutation invariance on graphs and a mathematical presentation of convolutional, attentional, and message-passing neural networks.
It culminates in Transformers applied to graphs, using Laplacian eigenvectors as positional encodings.
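The permutation-invariant aggregation mentioned above can be sketched as a minimal GraphSAGE-style layer with a mean aggregator (a hypothetical NumPy sketch for illustration, not the authors' reference implementation; the function and parameter names are made up here):

```python
import numpy as np

def graphsage_mean_layer(H, adj, W_self, W_neigh):
    """One GraphSAGE-style layer with a mean aggregator.

    H       : (N, d_in) node feature matrix
    adj     : list of neighbor-index lists, one per node
    W_self  : (d_in, d_out) weight applied to the node's own features
    W_neigh : (d_in, d_out) weight applied to the aggregated neighbors
    """
    out = np.zeros((H.shape[0], W_self.shape[1]))
    for v, neigh in enumerate(adj):
        # The mean over neighbors is permutation invariant:
        # reordering `neigh` cannot change the result.
        agg = H[neigh].mean(axis=0) if neigh else np.zeros(H.shape[1])
        out[v] = H[v] @ W_self + agg @ W_neigh
    return np.maximum(out, 0.0)  # ReLU nonlinearity
```

Because the aggregator is a mean, shuffling a node's neighbor list leaves the output unchanged, which is exactly the invariance property discussed in the talk.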

Presentation slides by Petar Velickovic (see link below), the author and presenter of the seminar "Theory of GNNs", 17 February 2021. All rights to these slides belong to him.
Inductive Representation Learning on Large Graphs
https://arxiv.org/pdf/1706.02216v4.pdf

Slides on the Theory of Graph Neural Networks are available at:
https://petar-v.com/talks/GNN-Wednesday.pdf
*Thanks to Petar Velickovic for making these informative slides (PDF) publicly available.*

A Generalization of Transformer Networks to Graphs
https://arxiv.org/pdf/2012.09699.pdf

#GraphSAGE
#GraphBERT
#GraphNN

00:00 GraphSAGE
07:30 Aggregator Functions
08:30 Permutation Equivariance
12:20 GNN Aggregator
16:12 GNN Meta-structure
19:55 Transformers are GNN
21:15 Graph Laplacian
22:00 Graph Transformer
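The Graph Laplacian and Graph Transformer segments above (21:15 and 22:00) rely on Laplacian eigenvectors as positional encodings, as in the Dwivedi & Bresson paper linked earlier. A minimal sketch (assumptions: symmetric normalized Laplacian, trivial constant eigenvector dropped; real implementations also handle eigenvector sign ambiguity):

```python
import numpy as np

def laplacian_pos_enc(A, k):
    """Return the k lowest-frequency Laplacian eigenvectors as
    node positional encodings.

    A : (N, N) symmetric adjacency matrix
    k : number of eigenvectors to keep per node
    """
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    # Skip the trivial eigenvector for eigenvalue 0, keep the next k
    return eigvecs[:, 1:k + 1]
```

These k-dimensional vectors are concatenated or added to the node features before the Transformer layers, playing the role that sinusoidal positional encodings play in the original Transformer.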

Video "GraphSAGE to GraphBERT - Theory of Graph Neural Networks" from the channel code_your_own_AI
Video information
Published: 13 December 2021, 18:15:00
Duration: 00:24:28