The Mathematics of Neural Networks
This video uses a spatial analogy to explore why deep neural networks are more powerful than shallow ones. It is part 4 in my deep learning series: https://www.youtube.com/playlist?list=PLbg3ZX2pWlgKV8K6bFJr5dhM7oOClExUJ

We'll explore what neurons do, individually and as a group, to "understand" perceptions, which leads us to the Manifold Hypothesis.
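The claim that depth adds power has a classic minimal illustration (this example is not from the video): a single neuron with a linear decision boundary cannot compute XOR, but adding one hidden layer suffices. A sketch with hand-picked weights:

```python
import numpy as np

def step(x):
    # Threshold activation: 1 where the input is positive, else 0
    return (x > 0).astype(int)

def xor_net(x):
    # Hidden layer: two neurons computing OR and NAND of the inputs
    W1 = np.array([[1, 1], [-1, -1]])
    b1 = np.array([-0.5, 1.5])
    h = step(W1 @ x + b1)
    # Output neuron: AND of the two hidden activations -> XOR overall
    W2 = np.array([1, 1])
    b2 = -1.5
    return int(step(W2 @ h + b2))

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x)))  # prints 0, 1, 1, 0 in turn
```

The hidden layer bends the input space so the two classes become linearly separable for the output neuron, a small taste of the manifold-untangling idea the video builds toward.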
Video "The Mathematics of Neural Networks" from the channel Art of the Problem.