TY - JOUR
AU - Fatima, Noor
PY - 2020
SN - 2255-2863
UR - http://hdl.handle.net/10366/146093
AB - Adopting the most suitable optimization algorithm (optimizer) for a Neural Network Model is among the most important ventures in Deep Learning and all classes of Neural Networks. It's a case of trial and error experimentation. In this paper, we will...
PB - Ediciones Universidad de Salamanca (España)
KW - Adadelta
KW - Adagrad
KW - Adam
KW - Adamax
KW - Deep Learning
KW - Neural Networks
KW - Nadam
KW - Optimization algorithms
KW - RMSprop
KW - SGD
TI - Enhancing Performance of a Deep Neural Network: A Comparative Analysis of Optimization Algorithms
ER -