Comparison of various optimizers and future work - 10 Aug 2018
In this blog post, I compare all the optimizers on the same dataset, which is used for performing classification with TMVA.
The figures above show the convergence of the training and testing errors of the various optimizers during the integration tests (the methodDL tests).
1) Implement other optimizers such as Adamax, Nadam and Nesterov-accelerated SGD.
2) Add a learning-rate decay implementation to the optimizers.
3) Benchmark the individual optimizers on separate datasets against TensorFlow.
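To make item 1 concrete, here is a minimal sketch of the Nesterov-accelerated SGD update rule in plain NumPy. This is not the TMVA implementation; the function name and hyperparameter defaults are illustrative. The key difference from classical momentum is that the gradient is evaluated at the look-ahead point `w + momentum * v` rather than at `w` itself:

```python
import numpy as np

def nesterov_sgd_step(w, velocity, grad_fn, lr=0.01, momentum=0.9):
    """One Nesterov-accelerated SGD update (illustrative sketch).

    The gradient is taken at the look-ahead position, which gives
    the method its anticipatory correction over plain momentum.
    """
    lookahead = w + momentum * velocity
    g = grad_fn(lookahead)
    velocity = momentum * velocity - lr * g  # update the velocity buffer
    return w + velocity, velocity            # apply the velocity to the weights

# Toy usage: minimise f(w) = w^2 (gradient 2w) starting from w = 5.0
w, v = np.array(5.0), np.array(0.0)
for _ in range(200):
    w, v = nesterov_sgd_step(w, v, lambda x: 2.0 * x)
print(float(w))  # converges towards the minimum at 0
```

Swapping the `lookahead` evaluation for `grad_fn(w)` recovers ordinary momentum SGD, which is why the two are usually implemented in the same optimizer class with a flag.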