Introduction to Support Vector Machine SVM Functional and Geometric Margin || Lesson 79 ||
#machinelearning #learningmonkey
In this class, we discuss an introduction to the Support Vector Machine (SVM) and the functional and geometric margins.
Understanding the functional and geometric margins requires some basic geometry.
Let's discuss the geometry first.
Take a line in a two-dimensional coordinate system.
Take a few points in the coordinate system and substitute each one into the line equation.
A point far from the line gives a large value.
A point near the line gives a small value.
Points on one side of the line give positive values, and points on the other side give negative values.
A point on the line gives zero.
We call this value gamma hat, the functional margin.
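A minimal sketch of this idea, using an illustrative line 2x + 3y - 6 = 0 (the weights and points below are made-up values, not from the lesson):

```python
import numpy as np

# Illustrative line 2x + 3y - 6 = 0, written as w . x + b = 0
w = np.array([2.0, 3.0])
b = -6.0

def functional_margin(point):
    """Raw value of the line equation at a point (gamma hat for label +1)."""
    return np.dot(w, point) + b

print(functional_margin(np.array([5.0, 5.0])))  # far from the line: large positive value (19.0)
print(functional_margin(np.array([1.6, 1.0])))  # near the line: small value (0.2)
print(functional_margin(np.array([0.0, 0.0])))  # other side of the line: negative value (-6.0)
print(functional_margin(np.array([0.0, 2.0])))  # on the line: zero (0.0)
```

Each printed value matches the description above: large when far, small when near, sign indicating the side, and zero on the line.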
What is the problem with the functional margin?
Multiplying the line equation by a constant gives another equation for the same line.
So many different equations describe the same line.
The functional margin is different for each of these equations.
In other words, the functional margin is scale-variant.
We can increase or decrease the functional margin simply by scaling the equation.
We need a quantity that is scale-invariant.
The distance of a point from the line is the same no matter which equation we choose.
We call this the geometric margin.
It equals the functional margin divided by the norm of w.
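The scale-invariance can be checked directly. Here is a short sketch, reusing the illustrative line 2x + 3y - 6 = 0: scaling the equation changes the functional margin but leaves the geometric margin untouched.

```python
import numpy as np

# Illustrative line 2x + 3y - 6 = 0 and a sample point
w, b = np.array([2.0, 3.0]), -6.0
x = np.array([5.0, 5.0])

margins = []
for c in (1.0, 10.0, 0.5):            # scale the equation by different constants
    wc, bc = c * w, c * b             # c*w . x + c*b = 0 is still the same line
    functional = np.dot(wc, x) + bc   # functional margin: grows/shrinks with c
    geometric = functional / np.linalg.norm(wc)  # geometric margin: the true distance
    margins.append((functional, geometric))
    print(c, functional, geometric)
```

The functional margin scales with c, while the geometric margin stays fixed at the point's distance from the line.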
In a support vector machine, the assumption is that the data is linearly separable.
The basic support vector machine applies only to binary classification.
The goal of the support vector machine is to maximize the geometric margin.
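To see what maximizing the geometric margin means, here is a hedged sketch with a hypothetical toy dataset: the geometric margin of a separating line is the smallest distance of any training point from it, and the SVM prefers the candidate line with the larger value. The two candidate lines below are hand-picked for illustration, not produced by an SVM solver.

```python
import numpy as np

# Hypothetical linearly separable data: class +1 on top, class -1 below
X = np.array([[2.0, 3.0], [3.0, 3.0], [2.0, 0.0], [3.0, 0.0]])
y = np.array([1, 1, -1, -1])

def dataset_geometric_margin(w, b):
    """Smallest signed distance of any training point from the line w . x + b = 0."""
    return np.min(y * (X @ w + b) / np.linalg.norm(w))

m1 = dataset_geometric_margin(np.array([0.0, 1.0]), -1.5)  # horizontal line y = 1.5
m2 = dataset_geometric_margin(np.array([0.1, 1.0]), -1.7)  # slightly tilted line
print(m1, m2)  # the SVM objective prefers the line with the larger value
```

Here the horizontal line sits exactly midway between the two classes and achieves the larger minimum distance, so it is the one a margin-maximizing classifier would choose between these two candidates.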
Link for playlists:
https://www.youtube.com/channel/UCl8x4Pn9Mnh_C1fue-Yndig/playlists
Link for our website: https://learningmonkey.in
Follow us on Facebook @ https://www.facebook.com/learningmonkey
Follow us on Instagram @ https://www.instagram.com/learningmonkey1/
Follow us on Twitter @ https://twitter.com/_learningmonkey
Mail us @ learningmonkey01@gmail.com
Video: Introduction to Support Vector Machine SVM Functional and Geometric Margin || Lesson 79 || from the Learning Monkey channel