Machine Learning with JAX - From Zero to Hero | Tutorial #1
❤️ Become The AI Epiphany Patreon ❤️
https://www.patreon.com/theaiepiphany
👨👩👧👦 Join our Discord community 👨👩👧👦
https://discord.gg/peBrCpheKE
With this video I'm kicking off a series of tutorials on JAX!
JAX is a powerful and increasingly popular ML library built by the Google Research team. The two most popular deep learning frameworks built on top of JAX are Haiku (DeepMind) and Flax (Google Research).
In this video I cover the basics as well as the nitty-gritty details of jit, grad, vmap, and various other idiosyncrasies of JAX.
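As a taste of the three transforms the video centers on, here's a minimal sketch (my own toy example, not code from the video) showing grad, jit, and vmap composing on one function:

```python
import jax
import jax.numpy as jnp

# A scalar-valued toy loss: sum of squares.
def loss(w):
    return jnp.sum(w ** 2)

grad_loss = jax.grad(loss)       # grad: returns dloss/dw = 2w
fast_grad = jax.jit(grad_loss)   # jit: compiles the function with XLA
batched_grad = jax.vmap(grad_loss)  # vmap: maps over a leading batch axis

w = jnp.array([1.0, 2.0, 3.0])
print(grad_loss(w))                          # [2. 4. 6.]
print(fast_grad(w))                          # same values, but compiled
print(batched_grad(jnp.stack([w, 2 * w])))   # per-row gradients, no loop
```

The point the video elaborates on is that these transforms are composable: jit(vmap(grad(f))) is itself just another function.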
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ JAX GitHub: https://github.com/google/jax
✅ JAX docs: https://jax.readthedocs.io/
✅ My notebook: https://github.com/gordicaleksa/get-started-with-JAX
✅ Useful video on autodiff: https://www.youtube.com/watch?v=wG_nF1awSSY
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00:00 What is JAX? JAX ecosystem
00:03:35 JAX basics
00:10:05 JAX is accelerator agnostic
00:15:00 jit explained
00:17:45 grad explained
00:27:25 The power of JAX autodiff (Hessians and beyond)
00:31:00 vmap explained
00:36:50 JAX API (NumPy, lax, XLA)
00:39:40 The nitty-gritty details of jit
00:46:55 Static arguments
00:50:05 Gotcha 1: Pure functions
00:56:00 Gotcha 2: In-Place Updates
00:57:35 Gotcha 3: Out-of-Bounds Indexing
00:59:55 Gotcha 4: Non-Array Inputs
01:01:50 Gotcha 5: Random Numbers
01:09:40 Gotcha 6: Control Flow
01:13:45 Gotcha 7: NaNs and float32
01:15:25 Quick summary
01:16:00 Conclusion: who should be using JAX?
01:17:10 Outro
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATREON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider helping me out by supporting me on Patreon!
The AI Epiphany - https://www.patreon.com/theaiepiphany
One-time donation - https://www.paypal.com/paypalme/theaiepiphany
Huge thank you to these AI Epiphany patrons:
Eli Mahler
Petar Veličković
Bartłomiej Danek
Zvonimir Sabljic
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💼 LinkedIn - https://www.linkedin.com/in/aleksagordic/
🐦 Twitter - https://twitter.com/gordic_aleksa
👨👩👧👦 Discord - https://discord.gg/peBrCpheKE
📺 YouTube - https://www.youtube.com/c/TheAIEpiphany/
📚 Medium - https://gordicaleksa.medium.com/
💻 GitHub - https://github.com/gordicaleksa
📢 AI Newsletter - https://aiepiphany.substack.com/
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#jax #machinelearning #framework
Video "Machine Learning with JAX - From Zero to Hero | Tutorial #1" from the channel Aleksa Gordić - The AI Epiphany
Video info: published October 31, 2021, 19:33:33 · duration 01:17:57