Title
Enhancing Performance of a Deep Neural Network: A Comparative Analysis of Optimization Algorithms
Author(s)
Subject
Adadelta
Adagrad
Adam
Adamax
Deep Learning
Neural Networks
Nadam
Optimization algorithms
RMSprop
SGD
Publication date
2020-06-20
Publisher
Ediciones Universidad de Salamanca (Spain)
Citation
ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal, 9 (2020)
Abstract
Adopting the most suitable optimization algorithm (optimizer) for a neural network model is among the most important decisions in deep learning, across all classes of neural networks, and it is typically a matter of trial-and-error experimentation. In this paper, we experiment with seven of the most popular optimization algorithms, namely SGD, RMSprop, Adagrad, Adadelta, Adam, Adamax and Nadam, on four unrelated datasets separately, to determine which one delivers the best accuracy, efficiency and performance for our deep neural network. This work provides insightful analysis to data scientists when choosing the best optimizer for modelling their deep neural networks.
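As a rough illustration of the kind of comparison the abstract describes, the sketch below loops the seven optimizers over a single dataset using Keras. The architecture, dataset (MNIST) and hyperparameters are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal sketch (not from the paper): compare the seven optimizers on one
# dataset. Model, dataset and training settings are assumptions.
import tensorflow as tf

optimizers = ["sgd", "rmsprop", "adagrad", "adadelta", "adam", "adamax", "nadam"]

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

results = {}
for name in optimizers:
    # Rebuild the model each run so every optimizer starts from fresh weights.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=name,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    results[name] = acc

# Rank optimizers by held-out accuracy.
for name, acc in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} test accuracy = {acc:.4f}")
```

In practice one would repeat each run with several random seeds and also track training time, since the paper weighs efficiency and performance alongside accuracy.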
URI
ISSN
2255-2863
Collections