International Journal of Scientific & Technology Research

IJSTR >> Volume 4 - Issue 2, February 2015 Edition




Website: http://www.ijstr.org

ISSN 2277-8616



An Algorithm For Training Multilayer Perceptron (MLP) For Image Reconstruction Using Neural Network Without Overfitting.

 

AUTHOR(S)

Mohammad Mahmudul Alam Mia, Shovasis Kumar Biswas, Monalisa Chowdhury Urmi, Abubakar Siddique

 

KEYWORDS

Index Terms: Back propagation, Epoch, Multilayer perceptron, Neural network training.

 

ABSTRACT

Abstract: Recently, the back-propagation neural network (BPNN) has been applied successfully in many areas with excellent generalization results, for example, rule extraction, classification, and evaluation. In this paper, the Levenberg-Marquardt back-propagation algorithm is used to train the network and reconstruct the image, and the Marquardt algorithm is found to be significantly more efficient. A practical problem with MLPs is selecting the correct complexity for the model, i.e., the right number of hidden units or the correct regularization parameters. This paper therefore studies how many neurons per hidden layer, and how many hidden layers, are needed to obtain high accuracy. We performed a regression (R) analysis to measure the correlation between the network's outputs and the targets.
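The two ingredients the abstract names, the Levenberg-Marquardt update and the regression-R check, can be sketched in miniature as follows. This is an illustrative example only: a one-parameter least-squares fit rather than the paper's image-reconstruction MLP, with hypothetical data, using the standard damped update w ← w − (JᵀJ + μI)⁻¹Jᵀe.

```python
import numpy as np

# Hypothetical data for a one-parameter model y = w * x (not the paper's setup)
x = np.array([1.0, 2.0, 3.0, 4.0])
targets = 2.0 * x            # targets generated by the true weight w = 2
w, mu = 0.5, 1e-3            # initial weight and damping factor mu

for _ in range(20):
    e = w * x - targets              # residual vector
    J = x.reshape(-1, 1)             # Jacobian of residuals w.r.t. w
    H = J.T @ J + mu * np.eye(1)     # damped Gauss-Newton Hessian approximation
    w -= np.linalg.solve(H, J.T @ e)[0]   # Levenberg-Marquardt step

outputs = w * x
# Regression R analysis: correlation between network outputs and targets;
# R close to 1 indicates a good fit.
R = np.corrcoef(outputs, targets)[0, 1]
print(round(w, 3), round(R, 3))
```

In a full implementation the damping factor μ is adapted each iteration (decreased when a step reduces the error, increased otherwise), which is what distinguishes Levenberg-Marquardt from plain Gauss-Newton.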

 
