A Survey On Backpropagation Algorithms For Feedforward Neural Networks
- Kuldip Vora
- Shruti Yagnik
Artificial neural network (ANN); Backpropagation algorithm (BPA); Mean square error (MSE); Multilayer feedforward neural network (MLFFNN); Classification; Cost function; Genetic algorithm (GA)
The back-propagation (BP) training algorithm is a well-known representative of the iterative gradient descent algorithms used for supervised learning in neural networks. It is designed to minimize the mean square error (MSE) between the actual output of a multilayer feedforward neural network and the desired output. Compared with mathematically more complex techniques, BP has the great merit of simplicity in implementation and computation. It is this simplicity that has attracted researchers over time, and as a result many improvements and variations of the BP learning algorithm have been reported to overcome its limitations of slow convergence and convergence to local minima. BP has been applied to a wide range of practical problems and has successfully demonstrated its power. This paper summarizes the basic BP algorithm and the gradual improvements to the back-propagation technique used for classification in artificial neural networks (ANN), compares it with newer methods such as genetic algorithms (GA), and shows why it is still effective and has scope for further improvement.
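As a minimal illustration of the gradient-descent update described above, the sketch below trains a one-hidden-layer feedforward network by back-propagating the gradient of the MSE between the actual and desired outputs. The toy XOR data, layer sizes, sigmoid activation, and learning rate are illustrative assumptions, not the specific configurations surveyed in the paper.

```python
import numpy as np

# Minimal back-propagation sketch: one hidden layer, sigmoid units, MSE cost.
# The XOR data, 4 hidden units, and learning rate are assumed for illustration.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # desired outputs

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
lr = 0.5                                  # learning rate (assumed)

for epoch in range(10000):
    # Forward pass
    H = sigmoid(X @ W1)                   # hidden activations
    Y = sigmoid(H @ W2)                   # actual network output

    # Mean square error between actual and desired output
    mse = np.mean((Y - T) ** 2)

    # Backward pass: error terms via the sigmoid derivative s * (1 - s)
    delta_out = (Y - T) * Y * (1 - Y)             # output-layer error term
    delta_hid = (delta_out @ W2.T) * H * (1 - H)  # hidden-layer error term

    # Gradient-descent weight updates
    W2 -= lr * H.T @ delta_out / len(X)
    W1 -= lr * X.T @ delta_hid / len(X)

print("final MSE:", mse)
```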
Kuldip Vora, Shruti Yagnik. "A Survey On Backpropagation Algorithms For Feedforward Neural Networks". International Journal of Engineering Development and Research (IJEDR), ISSN: 2321-9939, Vol. 1, Issue 3, pp. 193-197. URL: https://rjwave.org/ijedr/papers/IJEDR1303040.pdf