Nick Bostrom: Superintelligence | AI Podcast Clips
Full episode with Nick Bostrom (Mar 2020): https://www.youtube.com/watch?v=rfKiTGj-zeQ
Clips channel (Lex Clips): https://www.youtube.com/lexclips
Main channel (Lex Fridman): https://www.youtube.com/lexfridman
(more links below)
Podcast full episodes playlist:
https://www.youtube.com/playlist?list=PLrAXtmErZgOdP_8GztsuKi9nrraNbKKp4
Podcast clips playlist:
https://www.youtube.com/playlist?list=PLrAXtmErZgOeciFP3CBCIEElOJeitOr41
Podcast website:
https://lexfridman.com/ai
Podcast on Apple Podcasts (iTunes):
https://apple.co/2lwqZIr
Podcast on Spotify:
https://spoti.fi/2nEwCF8
Podcast RSS:
https://lexfridman.com/category/ai/feed/
Nick Bostrom is a philosopher at the University of Oxford and the director of the Future of Humanity Institute. He has worked on fascinating and important ideas in existential risk, the simulation hypothesis, the ethics of human enhancement, and the risks of superintelligent AI systems, including in his book Superintelligence. I can see talking to Nick multiple times on this podcast, many hours each time, but we have to start somewhere.
Subscribe to this YouTube channel or connect on:
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- Medium: https://medium.com/@lexfridman
- Support on Patreon: https://www.patreon.com/lexfridman
Video "Nick Bostrom: Superintelligence | AI Podcast Clips" from the Lex Fridman channel