Multilayer Perceptron
History
Laws of Association
Contiguity
Frequency
Similarity
Contrast
Biological Neuron
Cell Body
Dendrites
Axon
Neural Impulse
Terminal Branches of Axon
Myelin Sheath
Authors
McCulloch and Pitts, 1943
Frank Rosenblatt
Activation Function
Sigmoid
Hyperbolic Tangent
Rational
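For reference, a minimal NumPy sketch of these activations follows. The outline does not define "Rational"; softsign is shown below as an assumption, being one common rational (ratio-of-polynomials) activation:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes any input into the range (-1, 1).
    return np.tanh(x)

def softsign(x):
    # A rational activation, x / (1 + |x|); an assumption standing in
    # for the outline's undefined "Rational" entry.
    return x / (1.0 + np.abs(x))
```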
Gradient Descent
Gradient descent is a first-order iterative optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient (or of the approximate gradient) of the function at the current point. If instead one takes steps proportional to the positive of the gradient, one approaches a local maximum of that function; the procedure is then known as gradient ascent. (Wikipedia)
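To make the update rule concrete, here is a minimal sketch in Python; the function name, step size, and example objective are illustrative assumptions, not taken from the original:

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step proportional to the negative of the gradient
    # at the current point, converging toward a local minimum.
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
print(gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0))  # -> ~3.0
```

Flipping the sign of the update (stepping along the positive gradient) turns the same loop into gradient ascent.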
Training Neuron
With Mathematics
With Code
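The original code for this step is not reproduced here; the following is a minimal sketch assuming a single sigmoid neuron trained by gradient descent on a squared-error loss. The OR data, learning rate, and epoch count are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: the OR function. Each row of X is one input pair.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 0.5                 # learning rate

for epoch in range(5000):
    out = sigmoid(X @ w + b)            # forward pass
    err = out - y                       # gradient of 0.5 * (out - y)^2 w.r.t. out
    grad_z = err * out * (1.0 - out)    # chain rule through the sigmoid
    w -= lr * (X.T @ grad_z)            # gradient descent step on the weights
    b -= lr * grad_z.sum()              # and on the bias

print(np.round(sigmoid(X @ w + b)))  # typically converges to [0, 1, 1, 1]
```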
Training Neural Network
The backward propagation of errors, or backpropagation, is a common method of training artificial neural networks, used in conjunction with an optimization method such as gradient descent. (Wikipedia)
With Mathematics
With Code
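Again, the original code is not reproduced here; the sketch below assumes a one-hidden-layer network of sigmoid units trained with backpropagation and gradient descent on XOR. The layer sizes, learning rate, and epoch count are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: XOR, which no single neuron can represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the output error back through the layers.
    d_out = (out - y) * out * (1 - out)    # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)     # error signal at the hidden layer

    # Gradient descent step on every parameter.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))  # typically converges to [0, 1, 1, 0]
```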
Radial Basis Function
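The outline does not expand this topic; as a minimal sketch, the Gaussian form is the most common radial basis function, responding only to distance from a center. The `gamma` width parameter below is illustrative:

```python
import numpy as np

def gaussian_rbf(x, center, gamma=1.0):
    # Gaussian radial basis function: the response depends only on the
    # distance from the center, decaying with the squared distance.
    return np.exp(-gamma * np.sum((x - center) ** 2))

print(gaussian_rbf(np.array([0.0, 0.0]), np.array([1.0, 1.0])))  # e^-2, about 0.135
```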