A bio-inspired bistable recurrent cell allows for long-lasting memory (Paper Explained)
Even though LSTMs and GRUs solve the vanishing and exploding gradient problems, they have trouble learning to remember things over very long time spans. Inspired by bistability, a property of biological neurons, this paper constructs a recurrent cell with an inherent memory property, requiring only minimal modification to existing architectures.
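The core idea of bistability can be seen in a one-neuron toy model: iterating the scalar update h ← tanh(a·h) has a single stable fixed point at 0 when a ≤ 1, but two stable nonzero fixed points of opposite sign when a > 1, so the neuron "remembers" the sign of its initial state indefinitely. A minimal sketch (the function name `settle` is illustrative, not from the paper):

```python
import math

def settle(a, h0, steps=200):
    """Iterate the scalar update h <- tanh(a * h) until it settles."""
    h = h0
    for _ in range(steps):
        h = math.tanh(a * h)
    return h

# Monostable regime (a <= 1): every starting state collapses to 0.
print(round(settle(0.5, 0.9), 3))  # -> 0.0

# Bistable regime (a > 1): the sign of the start is preserved forever,
# settling to one of two symmetric nonzero fixed points.
print(settle(1.5, 0.9), settle(1.5, -0.9))
```

This is the cellular memory mechanism the paper builds into a recurrent cell: the gain on a neuron's self-connection decides whether it behaves like an ordinary saturating unit or like a one-bit latch.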
OUTLINE:
0:00 - Intro & Overview
1:10 - Recurrent Neural Networks
6:00 - Gated Recurrent Unit
14:40 - Neuronal Bistability
22:50 - Bistable Recurrent Cell
31:00 - Neuromodulation
32:50 - Copy First Benchmark
37:35 - Denoising Benchmark
48:00 - Conclusion & Comments
Paper: https://arxiv.org/abs/2006.05252
Code: https://github.com/nvecoven/BRC
Abstract:
Recurrent neural networks (RNNs) provide state-of-the-art performance in a wide variety of tasks that require memory. This performance can often be achieved thanks to gated recurrent cells such as gated recurrent units (GRU) and long short-term memory (LSTM). Standard gated cells share a layer internal state to store information at the network level, and long-term memory is shaped by network-wide recurrent connection weights. Biological neurons, on the other hand, are capable of holding information at the cellular level for an arbitrarily long amount of time through a process called bistability. Through bistability, cells can stabilize to different stable states depending on their own past state and inputs, which permits the durable storing of past information in the neuron state. In this work, we take inspiration from biological neuron bistability to embed RNNs with long-lasting memory at the cellular level. This leads to the introduction of a new bistable biologically-inspired recurrent cell that is shown to strongly improve RNN performance on time series that require very long memory, despite using only cellular connections (all recurrent connections are from neurons to themselves, i.e. a neuron's state is not influenced by the state of other neurons). Furthermore, equipping this cell with recurrent neuromodulation permits linking it to standard GRU cells, taking a step towards the biological plausibility of GRU.
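The "only cellular connections" constraint in the abstract means the recurrence is diagonal: each neuron's gates and candidate see only its own previous state, not the full hidden vector. A minimal NumPy sketch of the bistable recurrent cell (BRC) update in this spirit, with a_t = 1 + tanh(·) kept in (0, 2) so a neuron becomes bistable when a > 1 (weight shapes and initialization here are illustrative assumptions, not the paper's training setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BRCCell:
    """Bistable recurrent cell sketch: recurrent weights wa, wc are
    per-neuron scalars (diagonal recurrence), so each neuron's state
    depends only on its own past state and the current input."""
    def __init__(self, n_in, n_hidden):
        s = 1.0 / np.sqrt(n_in)
        self.Ua = rng.uniform(-s, s, (n_hidden, n_in))  # input weights, a-gate
        self.Uc = rng.uniform(-s, s, (n_hidden, n_in))  # input weights, c-gate
        self.U  = rng.uniform(-s, s, (n_hidden, n_in))  # input weights, candidate
        self.wa = rng.uniform(-s, s, n_hidden)          # diagonal recurrence, a-gate
        self.wc = rng.uniform(-s, s, n_hidden)          # diagonal recurrence, c-gate

    def step(self, x, h):
        a = 1.0 + np.tanh(self.Ua @ x + self.wa * h)  # in (0, 2); a > 1 => bistable
        c = sigmoid(self.Uc @ x + self.wc * h)        # update gate, in (0, 1)
        # Convex mix of old state and a candidate whose self-feedback gain is a.
        return c * h + (1.0 - c) * np.tanh(self.U @ x + a * h)

cell = BRCCell(n_in=3, n_hidden=5)
h = np.zeros(5)
for t in range(10):
    h = cell.step(rng.normal(size=3), h)
print(h.shape)  # (5,)
```

Because the new state is a convex combination of the old state and a tanh candidate, the hidden state stays bounded in (-1, 1), while the learned gain a decides per neuron whether past information decays or is latched.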
Authors: Nicolas Vecoven, Damien Ernst, Guillaume Drion
Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://discord.gg/4H8xxDF
BitChute: https://www.bitchute.com/channel/yannic-kilcher
Minds: https://www.minds.com/ykilcher
Video "A bio-inspired bistable recurrent cell allows for long-lasting memory (Paper Explained)" from the Yannic Kilcher channel