Weaving together machine learning, theoretical physics, and neuroscience
Surya Ganguli (Stanford University)
https://ganguli-gang.stanford.edu/
Abstract: An exciting area of intellectual activity in this century may well revolve around a synthesis of machine learning, theoretical physics, and neuroscience. The unification of these fields will likely enable us to exploit the power of complex systems analysis, developed in theoretical physics and applied mathematics, to elucidate the design principles governing neural systems, both biological and artificial, and to deploy these principles to develop better algorithms in machine learning. We will give several vignettes in this direction, including: (1) determining the best optimization problem to solve in order to perform regression in high dimensions; (2) finding exact solutions to the dynamics of generalization error in deep linear networks; (3) developing interpretable machine learning to derive and understand state-of-the-art models of the retina; (4) analyzing and explaining the origins of hexagonal firing patterns in recurrent neural networks trained to path-integrate; (5) understanding the geometry and dynamics of high-dimensional optimization in the classical limit of dissipative many-body quantum optimizers.
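Vignette (2) concerns learning dynamics in deep linear networks, where the evolution of training and generalization error under gradient descent can be solved exactly. As a minimal numerical sketch (not the analytic derivation from the referenced papers), the snippet below trains a two-layer linear network from a small random initialization on a noisy linear teacher and tracks test error; the dimensions, learning rate, and noise level are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, d_out, n_train, n_test = 20, 20, 10, 100, 1000

# Ground-truth linear teacher plus label noise on the training set
W_true = rng.normal(size=(d_out, d_in)) / np.sqrt(d_in)
X_tr = rng.normal(size=(n_train, d_in))
Y_tr = X_tr @ W_true.T + 0.1 * rng.normal(size=(n_train, d_out))
X_te = rng.normal(size=(n_test, d_in))
Y_te = X_te @ W_true.T  # noiseless targets for measuring generalization

# Two-layer *linear* network y = W2 @ W1 @ x, small random init
W1 = 0.01 * rng.normal(size=(d_hid, d_in))
W2 = 0.01 * rng.normal(size=(d_out, d_hid))
lr, errs = 0.05, []
for step in range(2000):
    P = X_tr @ (W2 @ W1).T       # network predictions on training data
    G = (P - Y_tr) / n_train     # gradient of mean-squared error w.r.t. P
    gW2 = G.T @ (X_tr @ W1.T)    # chain rule through the product W2 @ W1
    gW1 = W2.T @ G.T @ X_tr
    W2 -= lr * gW2
    W1 -= lr * gW1
    if step % 100 == 0:
        errs.append(np.mean((X_te @ (W2 @ W1).T - Y_te) ** 2))
```

The recorded test-error trajectory shows the characteristic stage-like drops as successive singular modes of the input-output correlation are learned, which is the phenomenon the exact theory characterizes.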
References:
M. Advani and S. Ganguli, Statistical mechanics of optimal convex inference in high dimensions, Physical Review X 6, 031034, 2016.
M. Advani and S. Ganguli, An equivalence between high dimensional Bayes optimal inference and M-estimation, NeurIPS 2016.
A.K. Lampinen and S. Ganguli, An analytic theory of generalization dynamics and transfer learning in deep linear networks, ICLR 2019.
H. Tanaka, A. Nayebi, N. Maheswaranathan, L.M. McIntosh, S. Baccus, and S. Ganguli, From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction, NeurIPS 2019.
S. Deny, J. Lindsey, S. Ganguli, and S. Ocko, The emergence of multiple retinal cell types through efficient coding of natural movies, NeurIPS 2018.
B. Sorscher, G. Mel, S. Ganguli, and S. Ocko, A unified theory for the origin of grid cells through the lens of pattern formation, NeurIPS 2019.
Y. Bahri, J. Kadmon, J. Pennington, S. Schoenholz, J. Sohl-Dickstein, and S. Ganguli, Statistical mechanics of deep learning, Annual Review of Condensed Matter Physics, 2020.
Y. Yamamoto, T. Leleu, S. Ganguli, and H. Mabuchi, Coherent Ising Machines: quantum optics and neural network perspectives, Applied Physics Letters, 2020.
B.P. Marsh, Y. Guo, R.M. Kroeze, S. Gopalakrishnan, S. Ganguli, J. Keeling, and B.L. Lev, Enhancing associative memory recall and storage capacity using confocal cavity QED, https://arxiv.org/abs/2009.01227
AAAI 2021 Spring Symposium on Combining Artificial Intelligence and Machine Learning with Physics Sciences, March 22-24, 2021 (https://sites.google.com/view/aaai-mlps)
Papers: https://sites.google.com/view/aaai-mlps/proceedings
Slides: https://sites.google.com/view/aaai-mlps/program
Video: "Weaving together machine learning, theoretical physics, and neuroscience" by Surya Ganguli, from the channel MLPS - Combining AI and ML with Physics Sciences.
Video information
Published: April 13, 2021, 6:47:09
Duration: 01:05:26