From Equivariance to Naturality (Dr. Taco Cohen)
LOGML Summer School 2022
Talk Title: From Equivariance to Naturality
Abstract: In this talk I will explain how groups, representations, and equivariant maps, the fundamental concepts of geometric deep learning, are special cases of the concepts of category, functor, and natural transformation. Like equivariant maps, natural transformations capture the idea that the way we process an input should be essentially independent of which of several equivalent (isomorphic) encodings of the input we choose. Being more general, the categorical concepts open up new possibilities for "structure preserving machine learning" beyond what is currently considered in geometric DL. We will discuss examples such as natural graph networks and natural causal models.
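A one-line illustration of the correspondence the abstract alludes to, in standard category-theory notation (a sketch for orientation, not taken from the talk itself): a group G can be viewed as a category with a single object whose morphisms are the group elements; a functor from this category to vector spaces is then precisely a linear representation, and a natural transformation between two such functors is precisely an equivariant map, because the naturality square commutes:

```latex
% Naturality square for f : (V, \rho) -> (W, \rho') at a group element g:
%
%          f
%     V ------> W
%     |         |
%     | \rho(g) | \rho'(g)
%     v         v
%     V ------> W
%          f
%
% Commutativity of the square says
\[
  f \circ \rho(g) \;=\; \rho'(g) \circ f
  \qquad \text{for all } g \in G,
\]
% which is exactly the usual equivariance condition on f.
```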
Speaker Bio: Taco Cohen is a machine learning research scientist at Qualcomm AI Research in Amsterdam. He obtained his PhD at the University of Amsterdam, supervised by Prof. Max Welling. He was a co-founder of Scyfer, a company focused on active deep learning, acquired by Qualcomm in 2017. He holds a BSc in theoretical computer science from Utrecht University and an MSc in artificial intelligence from the University of Amsterdam (both cum laude). His research is focused on understanding and improving deep representation learning, in particular learning of equivariant and disentangled representations, data-efficient deep learning, learning on non-Euclidean domains, and applications of group representation theory and non-commutative harmonic analysis, as well as deep learning-based source compression. He has done internships at Google DeepMind (working with Geoff Hinton) and OpenAI. He received the 2014 University of Amsterdam thesis prize, a Google PhD Fellowship, the ICLR 2018 Best Paper Award for “Spherical CNNs”, and was named one of the 35 Innovators Under 35 in Europe by MIT Technology Review in 2018.