Evolution of Neural Networks | Machine Learning | SNS Institutions
#snsinstitutions #snsdesignthinkers #designthinking
This video describes the evolution of neural network learning.
Hebbian learning deals with neural plasticity.
Hebbian learning is unsupervised and is related to long-term potentiation.
Hebbian learning deals with pattern recognition, while exclusive-or (XOR) circuits deal with if-then rules.
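As a rough illustration, here is a minimal sketch of the plain Hebbian update rule (delta_w = eta * y * x) for a single linear neuron; the data, learning rate, and variable names are illustrative choices, not from the video.

```python
import numpy as np

# Plain Hebbian learning for one linear neuron: the weight change depends
# only on the co-activity of input and output (no labels, no error signal).
# Note that unmodified Hebbian growth is unbounded; this is only a sketch.
rng = np.random.default_rng(0)
learning_rate = 0.01
weights = rng.normal(scale=0.1, size=3)   # small random starting weights

# Illustrative unsupervised data: 3-D inputs with correlated dimensions 0 and 1.
inputs = rng.normal(size=(200, 3))
inputs[:, 1] = inputs[:, 0] + 0.1 * rng.normal(size=200)

for x in inputs:
    y = weights @ x                        # post-synaptic activity
    weights += learning_rate * y * x       # Hebbian rule: delta_w = eta * y * x

print(weights)   # weights grow along the correlated directions of the input
```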
Backpropagation solved the exclusive-or problem that Hebbian learning could not handle, and it made multi-layer networks feasible and efficient to train.
When an error is found at the output, it is propagated back through the network and reduced by modifying the weights at each node in each layer.
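Below is a minimal sketch of backpropagation solving XOR with a small two-layer sigmoid network; the layer sizes, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

# A tiny 2-4-1 sigmoid network trained with backpropagation on XOR,
# the problem a single-layer unit cannot solve.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: the output error is propagated layer by layer and
    # the weights at each node are adjusted along the gradient.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(3))   # should approach [[0], [1], [1], [0]]
```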
This era also saw the development of support vector machines, linear classifiers, and max-pooling. The vanishing gradient problem affects many-layered feedforward networks trained with backpropagation, as well as recurrent neural networks.
Training networks with many such layers is what became known as deep learning.
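A toy numerical sketch of the vanishing gradient problem, assuming a chain of single-unit sigmoid layers; the depth and random weights are illustrative.

```python
import numpy as np

# In a deep chain of sigmoid units, the backpropagated gradient is a product
# of per-layer factors w * s'(z); since the sigmoid derivative is at most
# 0.25, the product shrinks roughly geometrically with depth.
rng = np.random.default_rng(2)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

depth = 30
weights = rng.normal(size=depth)

# Forward pass through a chain of one-unit sigmoid layers.
a = 0.5
activations = []
for w in weights:
    a = sigmoid(w * a)
    activations.append(a)

# Backward pass: accumulate the chain-rule product from the output layer
# back towards the input, printing its magnitude every 10 layers.
grad = 1.0
for i, (w, a) in enumerate(zip(reversed(weights), reversed(activations)), start=1):
    grad *= w * a * (1 - a)
    if i % 10 == 0:
        print(f"gradient after backpropagating through {i} layers: {abs(grad):.2e}")
```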
Hardware-based designs are used for biophysical simulation and neuromorphic computing. They perform large-scale principal component analysis and convolution, creating a new class of neural computing based on analog computation.
This also made back-propagation workable for many-layered feedforward neural networks.
Convolutional networks alternate convolutional layers and max-pooling layers, followed by connected layers (fully or sparsely connected) and a final classification layer.
The learning is done without unsupervised pre-training. Each filter is equivalent to a weight vector that has to be trained.
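A sketch of this alternating architecture, written with PyTorch as an assumed framework; the filter counts, layer sizes, and input shape are illustrative.

```python
import torch
from torch import nn

# Convolutional layers alternating with max-pooling, followed by a fully
# connected classification layer. Each Conv2d filter is a trainable weight
# tensor, and the whole model is trained end to end without pre-training.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),    # 8 trainable filters
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 28x28 -> 14x14
    nn.Conv2d(8, 16, kernel_size=3, padding=1),   # 16 trainable filters
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                    # final classification layer
)

x = torch.randn(4, 1, 28, 28)   # a batch of 4 illustrative 28x28 images
print(model(x).shape)           # torch.Size([4, 10]) class scores
```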
Shift invariance has to be guaranteed when dealing with both small and large neural networks; this is being addressed in Developmental Networks.
Other learning techniques include error-correction learning, memory-based learning, and competitive learning.
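As a contrast to the unsupervised Hebbian rule above, here is a minimal sketch of error-correction (delta-rule) learning for a single linear unit; the target mapping and learning rate are illustrative.

```python
import numpy as np

# Error-correction (delta rule) learning: unlike the Hebbian rule, the weight
# change is driven by the error between desired and actual output,
# delta_w = eta * (d - y) * x.
rng = np.random.default_rng(3)
true_w = np.array([2.0, -1.0, 0.5])    # illustrative target mapping
X = rng.normal(size=(500, 3))
d = X @ true_w                          # desired outputs

w = np.zeros(3)
eta = 0.05
for x, target in zip(X, d):
    y = w @ x
    w += eta * (target - y) * x         # correct the weights by the error

print(w.round(2))   # converges towards [2.0, -1.0, 0.5]
```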