[S+SSPR 2020] Graph Transformer: Learning Better Representations for Graph Neural Network

Boyuan Wang, Lixin Cui, Lu Bai, Edwin R. Hancock

Abstract.
Graph classification is a significant task for many real-world applications. Recently, Graph Neural Networks (GNNs) have achieved excellent performance on many graph classification tasks. However, most state-of-the-art GNNs suffer from the over-smoothing problem and cannot effectively learn latent relations between distant vertices. To overcome this problem, we develop a novel Graph Transformer (GT) unit to capture such latent relations directly. In addition, we propose a mixed network that combines different methods of graph learning. We show that the proposed GT unit can both learn distant latent connections well and form better representations for graphs. Moreover, the proposed Graph Transformer with Mixed Network (GTMN) can learn both local and global information simultaneously. Experiments on standard graph classification benchmarks demonstrate that our proposed approach performs better than other competing methods.
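The abstract does not give implementation details of the GT unit. Purely as an illustration of the idea it describes — letting every vertex attend to every other vertex so that long-range latent relations are not lost to over-smoothing, unlike a GCN layer that aggregates only over adjacent nodes — here is a minimal sketch of a transformer-style attention pass over node features. All function and parameter names, and the choice of adding the adjacency matrix as an attention bias, are assumptions for illustration, not the paper's method.

```python
import numpy as np

def graph_transformer_unit(X, A, Wq, Wk, Wv):
    """Hypothetical transformer-style unit over graph node features.

    X  : (n, d) node feature matrix
    A  : (n, n) adjacency matrix, used here only as an additive
         attention bias toward known edges (an assumption)
    Wq, Wk, Wv : (d, d) projection matrices

    Every vertex attends to every other vertex, so relations
    between distant vertices can be captured in a single pass,
    in contrast to neighborhood-restricted aggregation.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[1]
    # Scaled dot-product attention scores, biased by adjacency.
    scores = Q @ K.T / np.sqrt(d) + A
    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    # Each output row is an attention-weighted mix of all vertices.
    return weights @ V
```

A full model in the spirit of the abstract would interleave such units with conventional local aggregation (the "mixed network"), so that local and global information are learned simultaneously.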

Video: [S+SSPR 2020] Graph Transformer: Learning Better Representations for Graph Neural Network, from the channel David Wang
Video information
Published: January 14, 2021, 21:17:05
Duration: 00:07:34