pub fn adam<F>(learning_rate: F) -> NeuralOptimizer<F>
where
    F: Float + ScalarOperand + 'static + Send + Sync,

Creates an Adam optimizer with default settings for neural network training. Only the learning rate is supplied by the caller; the remaining hyperparameters use the optimizer's built-in defaults.