Adam - Adaptive learning rate optimization algorithm for training neural networks
SGD - Stochastic Gradient Descent optimization algorithm for training neural networks (a usage sketch for both optimizers follows below)
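
Both optimizers are available in common deep learning frameworks. The following is a minimal sketch assuming PyTorch; the toy model, random batch, and learning rate values are illustrative assumptions, not part of this glossary.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)   # toy model: 10 input features -> 1 output
loss_fn = nn.MSELoss()

# Adam: maintains per-parameter adaptive learning rates from running
# estimates of the gradient's first and second moments.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# SGD alternative: plain stochastic gradient descent, optionally with momentum.
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

x = torch.randn(32, 10)    # random mini-batch of 32 examples (illustrative)
y = torch.randn(32, 1)

optimizer.zero_grad()      # clear gradients from the previous step
loss = loss_fn(model(x), y)
loss.backward()            # compute gradients via backpropagation
optimizer.step()           # apply the parameter update
```

Swapping between the two optimizers only changes the `torch.optim.*` constructor line; the training-step loop itself is identical for both.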