Backpropagation (Part 4): The Sigmoid Transfer Function and Its Derivative
A differentiable transfer function, such as the sigmoid (logistic) function, is essential for backpropagation training of neural networks such as the Multilayer Perceptron (MLP). After re-acquainting ourselves with the chain rule from differential calculus (in Parts 3a and 3b), we now apply the chain rule to take the derivative of the transfer function with respect to the inputs coming into that node.
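For viewers who want to check the result numerically, here is a minimal sketch (not code from the video; the function names are illustrative) of the sigmoid and its derivative, which simplifies to sigma'(x) = sigma(x) * (1 - sigma(x)):

import numpy as np

def sigmoid(x):
    # Logistic (sigmoid) transfer function: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    # The derivative reduces to sigmoid(x) * (1 - sigmoid(x)),
    # which is what makes backpropagation through this transfer
    # function so convenient: no extra exponentials are needed.
    s = sigmoid(x)
    return s * (1.0 - s)

# Quick check at x = 0: sigmoid(0) = 0.5, so the derivative is 0.25
print(sigmoid(0.0), sigmoid_derivative(0.0))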
A video from the Alianna J. Maren channel.
Other videos from this channel
Backpropagation (Part 2): Mathematical Dependency and Creating the Word Problem
Three Keys to Crafting a Compelling Research Paper: Series Intro
Backpropagation (Part 3a): The Chain Rule
Selecting References for Your Research Paper: Chicago Style Author-Date Formatting
Structuring Your Research Paper Part 1
Writing Your Research Paper Problem Statement
Crafting a Compelling Research Paper - Pet Peeve: Significant Figures
The New Good Stuff is at Themesis!
Career-Boosting Reading List: Part 1 - Sun Tzu's "The Art of War"
Crafting Your Research Paper Title Page
Books 2 and 3: Great Get-Life-Together Reading!
NLP: Clustering vs. Classification
Summed-Squared-Error (SSE): Neural Networks Back-Propagation X-OR Problem
Christmas in Kaua'i - Mai Tais and AI: Part 1
Writing an Effective Research Paper: Using Storytelling and Pictures
Writing Your Research Paper Abstract to Get Reader's Attention - Part 1
Backpropagation (Part 3b): The Chain Rule (with Specific Application to the Transfer Function)
Math Bloopers! Dr. AJ's First Candid Blooper Reveal
Crafting a Compelling Research Paper - Pet Peeve: Figure Legibility
AI and Society: COVID-19 Accelerates AI and Robotics Use