Computation and Learning with Assemblies of Neurons
Prof. Santosh Vempala, Georgia Tech
Abstract: Despite great advances in ML, and in our understanding of the brain at the level of neurons, synapses, and neural circuits, we still have no satisfactory explanation for the brain's performance in perception, cognition, language, memory, behavior; as Nobel laureate Richard Axel put it, "we have no logic for translating neural activity into thought and action". The Assembly Calculus (AC) is a framework to fill this gap, a computational model whose basic data type is the assembly, a large subset of neurons whose simultaneous excitation is tantamount to the subject's thinking of an object, idea, episode, or word. The AC provides a repertoire of operations ("project", "reciprocal-project", "associate", "pattern-complete", etc.) whose implementation relies only on Hebbian plasticity and inhibition, and encompasses a complete computational system, thereby enabling complex function. Very recently, it has been shown, rigorously and in simulation, that the AC can learn to classify samples from well-separated classes. For basic concept classes in high dimension, an assembly can be formed and recalled for each class, and these assemblies are distinguishable as long as the input classes are sufficiently separated. Viewed as a learning algorithm, this mechanism is entirely online, generalizes from very few samples, and requires only mild supervision --- all attributes expected of a brain-like mechanism. The talk will highlight several fascinating questions that arise, from the convergence of assemblies to their unexpected generalization abilities.
This is joint work with Christos Papadimitriou, Max Dabagia, Mirabel Reid and Dan Mitropolsky.
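To give a concrete sense of the "project" operation described above, here is a minimal simulation sketch of assembly formation in the spirit of the AC: a fixed stimulus repeatedly fires into a target area of n neurons with random sparse connectivity, inhibition keeps only the k most strongly driven neurons active (the "cap"), and Hebbian plasticity multiplicatively strengthens synapses between co-firing neurons. All parameter values (n, k, p, beta) and the implementation details are illustrative assumptions, not the specific model or code from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the talk):
# n neurons per area, cap size k, connection probability p,
# Hebbian plasticity rate beta.
n, k, p, beta = 1000, 50, 0.05, 0.1

# Random sparse synapses: stimulus -> area, and recurrent area -> area.
W_stim = (rng.random((n, n)) < p).astype(float)
W_rec = (rng.random((n, n)) < p).astype(float)

# The stimulus is a fixed set of k firing neurons.
stimulus = np.zeros(n)
stimulus[rng.choice(n, size=k, replace=False)] = 1.0

prev_winners = np.zeros(n)
overlap = 0
for t in range(20):
    # Total synaptic input to each target neuron; inhibition is modeled
    # by keeping only the top-k ("winner-take-all" cap).
    inputs = W_stim @ stimulus + W_rec @ prev_winners
    winners = np.zeros(n)
    winners[np.argsort(inputs)[-k:]] = 1.0
    # Hebbian update: multiply the weight of every synapse whose
    # pre- and post-synaptic neurons both fired by (1 + beta).
    W_stim *= 1 + beta * np.outer(winners, stimulus)
    W_rec *= 1 + beta * np.outer(winners, prev_winners)
    overlap = int(winners @ prev_winners)  # stability of the winner set
    prev_winners = winners

# With enough rounds the winner set stabilizes into an assembly:
# a stable subset of k neurons representing the stimulus.
assembly = np.flatnonzero(prev_winners)
```

Tracking `overlap` across rounds is one simple way to watch the convergence phenomenon the talk raises as an open question: it starts small and grows toward k as the assembly stabilizes.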
Video "Computation and Learning with Assemblies of Neurons" from the MITCBMM channel.