You Should Be Using Automatic Differentiation
WANT TO EXPERIENCE A TALK LIKE THIS LIVE?
Barcelona: https://www.datacouncil.ai/barcelona
New York City: https://www.datacouncil.ai/new-york-city
San Francisco: https://www.datacouncil.ai/san-francisco
Singapore: https://www.datacouncil.ai/singapore
Ryan Adams is a machine learning researcher at Twitter and a professor of computer science at Harvard. He co-founded Whetlab, a machine learning startup that was acquired by Twitter in 2015. He co-hosts the Talking Machines podcast.
A big part of machine learning is the optimization of continuous functions. Whether for deep neural networks, structured prediction, or variational inference, machine learners spend a lot of time taking gradients and verifying them. It turns out, however, that computers are good at doing this kind of calculus automatically, and automatic differentiation tools are becoming more mainstream and easier to use. In this talk, Adams gives an overview of automatic differentiation, with a particular focus on Autograd. He also presents several vignettes about using Autograd to learn hyperparameters in neural networks, perform variational inference, and design new organic molecules.
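To make the idea concrete, here is a minimal sketch of automatic differentiation using dual numbers (forward mode). This is an illustration of the underlying principle only, not Autograd's actual implementation or API (Autograd uses reverse mode); the `Dual` class and helper names are invented for this example.

```python
import math

class Dual:
    """A number that carries its value and its derivative together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def dsin(x):
    # Chain rule: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x):
    """Seed the input with derivative 1.0 and read off f'(x)."""
    return f(Dual(x, 1.0)).dot

# f(x) = x^2 + sin(x), so f'(x) = 2x + cos(x)
f = lambda x: x * x + dsin(x)
print(derivative(f, 1.5))  # matches 2*1.5 + cos(1.5)
```

Library-grade tools like Autograd apply the same rule-by-rule differentiation to whole NumPy programs, so you write only the forward computation and get gradients for free.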
FOLLOW DATA COUNCIL:
Twitter: https://twitter.com/DataCouncilAI
LinkedIn: https://www.linkedin.com/company/datacouncil-ai
Facebook: https://www.facebook.com/datacouncilai
Video "You Should Be Using Automatic Differentiation" from the Data Council channel
Other videos from this channel:
- The Simple Essence of Automatic Differentiation - Conal Elliott
- Jarrett Revels: Forward-Mode Automatic Differentiation in Julia
- Beyond Deep Learning - Differentiable Programming with Flux - Avik Sengupta | ODSC Europe 2019
- But what is a Neural Network? | Deep learning, chapter 1
- Deep Learning 2: Introduction to TensorFlow
- Machine Learning with TensorFlow (GDD Europe '17)
- What is Automatic Differentiation?
- Automatic Differentiation - A Revisionist History and the State of the Art - AD meets SDG and PLT
- JAX: Accelerated Machine Learning Research | SciPy 2020 | VanderPlas
- Automatic differentiation in Ruby
- TensorFlow and deep reinforcement learning, without a PhD (Google I/O '18)
- Keynote: Automatic Differentiation for Dummies
- AD-OCaml: Algorithmic Differentiation for OCaml
- Machine Learning & Artificial Intelligence: Crash Course Computer Science #34
- Talk: Colin Carroll - Getting started with automatic differentiation
- Derivative of a Matrix : Data Science Basics
- Autograd in Pytorch
- Swift for TensorFlow (Google I/O'19)
- Petros Koumoutsakos: "Machine Learning for Fluid Mechanics"
- Phillip Carter- The Many Paths Towards Functional Programming Language Adoption- λC20 Global Edition