EI Seminar - Greg Yang - Tuning GPT-3 on a Single GPU via Zero-Shot Hyperparameter Transfer
ABSTRACT: You can’t train GPT-3 on a single GPU, much less tune its hyperparameters (HPs)…or so it seems. I’m here to tell you this is not true: you *can* tune its HPs on a single GPU even if you can’t train it that way! In the first half of this talk, I’ll describe how, in the so-called maximal update parametrization (abbreviated µP), narrow and wide neural networks share the same set of optimal HPs. This lets us tune any large model by tuning a small version of it — we call this µTransfer. In particular, this allowed us to tune the 6.7-billion-parameter version of GPT-3 using only 7% of its pretraining compute budget, and, with some asterisks, we achieved performance comparable to the original GPT-3 model with twice the parameter count. In the second half of this talk, I’ll discuss the theoretical reason µP has this special property and the connection to the study of infinite-width neural networks and, more generally, the theory of Tensor Programs. The first half will target general practitioners or empirical researchers in machine learning, while the second half targets those who are more theoretically curious. This talk is based on http://arxiv.org/abs/2203.03466.
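To make the µTransfer idea concrete, here is a minimal sketch of the kind of width-dependent HP scaling it relies on. This is an illustrative simplification, not the paper's exact recipe: the function `mup_scaled_hparams` and its argument names are hypothetical, and the rules shown (under Adam, hidden-layer learning rates scale like 1/width and init standard deviations like 1/sqrt(width)) are only the rough shape of the µP scaling table, so that HPs tuned at a small base width can be reused at a larger width.

```python
def mup_scaled_hparams(base_lr, base_init_std, base_width, width):
    """Sketch of µTransfer-style HP scaling (assumed simplification).

    HPs are tuned once at `base_width`; to train a model of a larger
    `width`, the width-sensitive HPs are rescaled rather than re-tuned.
    """
    mult = width / base_width  # width multiplier relative to the tuned proxy model
    return {
        # Adam LR for matrix-like (hidden) params scales like 1/width
        "hidden_lr": base_lr / mult,
        # init std for hidden weights scales like 1/sqrt(width)
        "hidden_init_std": base_init_std / mult ** 0.5,
        # output-layer LR also shrinks with width in this sketch
        "output_lr": base_lr / mult,
    }

# Tune at width 64, then transfer to width 256 (4x wider):
hp = mup_scaled_hparams(base_lr=0.01, base_init_std=0.02, base_width=64, width=256)
```

The point of the talk is that, under µP, the *optimal* values of such HPs are stable across widths, so the cheap small-width sweep is all the tuning the large model needs.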
BIO: Greg Yang is a researcher at Microsoft Research in Redmond, Washington. He joined MSR after obtaining a Bachelor’s degree in Mathematics and a Master’s degree in Computer Science from Harvard University, advised respectively by ST Yau and Alexander Rush. He won the Hoopes Prize at Harvard for best undergraduate thesis, as well as an Honorable Mention for the AMS-MAA-SIAM Morgan Prize, the highest honor in the world for an undergraduate in mathematics. He gave an invited talk at the International Congress of Chinese Mathematicians 2019.
Video: EI Seminar - Greg Yang - Tuning GPT-3 on a Single GPU via Zero-Shot Hyperparameter Transfer, from the MIT Embodied Intelligence channel
Other videos from this channel
![MIT EI Seminar - Laura Schulz - Curiouser and curiouser: why we make problems for ourselves](https://i.ytimg.com/vi/1l0u5gctDP4/default.jpg)
![EI Seminar - Graham Neubig - Learning to Explain and Explaining to Learn](https://i.ytimg.com/vi/CtcP5bvODzY/default.jpg)
![EI Seminar - Martin Riedmiller - Learning Controllers - From Engineering to AGI](https://i.ytimg.com/vi/Pno8xsrgWA4/default.jpg)
![EI Seminar Livestream - Max Tegmark](https://i.ytimg.com/vi/aDaOuBP-jN4/default.jpg)
![EI Seminar - Recent papers in Embodied Intelligence](https://i.ytimg.com/vi/wcVejqmb1mQ/default.jpg)
![EI Seminar - Beomjoon Kim - Making Robots See and Manipulate](https://i.ytimg.com/vi/GZ-oiwOeRc8/default.jpg)
![EI Seminar - Marco Pavone - Building Trust in AI for Autonomous Vehicles](https://i.ytimg.com/vi/HjOt-4k6haI/default.jpg)
![EI Seminar - Jacob Andreas - Good Old-fashioned LLMs (or, Autoformalizing the World)](https://i.ytimg.com/vi/_TrKARhF5cI/default.jpg)
![EI Seminar - Maurice Fallon - Multi-Sensor Robot Navigation and Subterranean Exploration](https://i.ytimg.com/vi/4D4TbI1gGIg/default.jpg)
![EI Seminar - Chad Jenkins - Semantic Robot Programming... and Maybe Making the Worlda Better Place](https://i.ytimg.com/vi/UaTq6ojGuYo/default.jpg)
![EI Seminar - Joydeep Biswas](https://i.ytimg.com/vi/0vPNN0J8M44/default.jpg)
![MIT EI Seminar - Lerrel Pinto - Diverse data and efficient algorithms for robot learning](https://i.ytimg.com/vi/tRcwyC-ivMQ/default.jpg)
![EI Seminar - Yuan Gong - Audio Large Language Models: From Sound Perception to Understanding](https://i.ytimg.com/vi/uqsW2eK-Rms/default.jpg)
![Lawson Wong - High-Level Guidance for Generalizable Reinforcement Learning](https://i.ytimg.com/vi/8KGbtpkMBZc/default.jpg)
![EI Seminar - Monroe Kennedy - Collaborative Robotics: From Dexterity to Teammate Prediction](https://i.ytimg.com/vi/ii8ZNXaZ0hg/default.jpg)
![EI Seminar - Rob Fergus - Data Augmentation for Image-Based Reinforcement Learning](https://i.ytimg.com/vi/Ny2CpgPrtB8/default.jpg)
![EI Seminar - Jacob Steinhardt - Large Language Models as Statisticians](https://i.ytimg.com/vi/1m_fCzB__Oo/default.jpg)
![EI Seminar - Oriol Vinyals - The Deep Learning Toolbox: from AlphaFold to AlphaCode](https://i.ytimg.com/vi/dOlbnrsQy_I/default.jpg)
![Daniel Wolpert - Computational principles underlying the learning of sensorimotor repertoires](https://i.ytimg.com/vi/wp3c1E6oCTM/default.jpg)
![EI Seminar - Jeannette Bohg - Scaling Robot Learning for Long-Horizon Manipulation Tasks](https://i.ytimg.com/vi/Ca-CxLZ2mq8/default.jpg)