International Journal of Scientific & Technology Research


IJSTR >> Volume 9 - Issue 6, June 2020 Edition




Website: http://www.ijstr.org

ISSN 2277-8616



Significance Of Epochs On Training A Neural Network


 

AUTHOR(S)

Saahil Afaq, Dr. Smitha Rao

 

KEYWORDS

Hyperparameters, epochs, nodes, hidden layers, deep neural network (DNN), architecture.

 

ABSTRACT

Deep neural networks (DNNs) have shown incredible success in the field of computer vision and in tasks such as classification and facial detection. However, the accuracy of a model depends on a large number of parameters, such as the weights, the biases, the number of hidden layers, the kinds of activation functions used, and the hyperparameters. The number of epochs is a hyperparameter that plays an integral part in the training process of a model. The total number of epochs used helps us decide whether the model is overtrained or not. Recently, the performance of deep neural networks has been improved by making use of pre-trained network architectures and by the introduction of GPU-based computation, and we are now even on the verge of training models on TPU chips. However, many problems in the field of deep neural networks concern training, backpropagation, and the tuning of hyperparameters.
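To make the role of the epoch count concrete, the following is a minimal sketch (not from the paper itself) of how the number of epochs interacts with overtraining: a tiny linear model is trained by gradient descent, and training halts when the validation loss stops improving for a fixed number of epochs. All names and values here (the patience threshold, learning rate, data sizes) are illustrative assumptions.

```python
import numpy as np

# Synthetic regression data with a known ground-truth weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

# Hold out a validation split to monitor generalization after each epoch.
X_tr, y_tr = X[:150], y[:150]
X_va, y_va = X[150:], y[150:]

w = np.zeros(3)
lr = 0.05
patience, best_loss, wait = 5, np.inf, 0
max_epochs = 500

for epoch in range(max_epochs):
    # One epoch = one full pass over the training data.
    grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= lr * grad

    # Mean squared error on the held-out validation split.
    val_loss = np.mean((X_va @ w - y_va) ** 2)
    if val_loss < best_loss - 1e-6:
        best_loss, wait = val_loss, 0
    else:
        wait += 1
        if wait >= patience:  # no improvement for `patience` epochs: stop
            break

print(f"stopped after {epoch + 1} epochs, validation MSE = {val_loss:.4f}")
```

Run for too few epochs and the model underfits; run for too many and, on noisier problems, the validation loss rises while the training loss keeps falling — the early-stopping criterion above (as in reference [9]) picks the epoch count from the data rather than fixing it in advance.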

 

REFERENCES

[1] Sharma, O. (2019). Deep Challenges Associated with Deep Learning. 2019 International Conference on Machine Learning, Big Data, Cloud and Parallel Computing (COMITCon).
[2] Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning (Vol. 1). Cambridge: MIT Press.
[3] Sinha, S., Singh, T. N., Singh, V. K., & Verma, A. K. (2009). Epoch determination for neural network by self-organized map (SOM). Computational Geosciences, 14(1).
[4] Kanada, Y. (2016). Optimizing neural-network learning rate by using a genetic algorithm with per-epoch mutations. 2016 International Joint Conference on Neural Networks (IJCNN).
[5] Roberts, T., & Paliwal, K. K. (2019). Time-Scale Modification Using Fuzzy Epoch-Synchronous Overlap-Add (FESOLA). 2019 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA).
[6] Eldem, A., Eldem, H., & Ustun, D. (2018). A Model of Deep Neural Network for Iris Classification with Different Activation Functions. 2018 International Conference on Artificial Intelligence and Data Processing.
[7] Dhande, G., & Shaikh, Z. (2019). Analysis of Epochs in Environment based Neural Networks Speech Recognition System. 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI).
[8] Yaseen, M. U., Anjum, A., Rana, O., & Antonopoulos, N. (2018). Deep Learning Hyper-Parameter Optimization for Video Analytics in Clouds. IEEE Transactions on Systems, Man, and Cybernetics: Systems.
[9] Wu, X., & Liu, J. (2009). A New Early Stopping Algorithm for Improving Neural Network Generalization. 2009 Second International Conference on Intelligent Computation Technology and Automation. doi:10.1109/icicta.2009.11
[10] Shaheen, F., & Verma, B. (2016). An ensemble of deep learning architectures for automatic feature extraction. 2016 IEEE Symposium Series on Computational Intelligence (SSCI).