Artificial neural networks (ANNs) are typically highly nonlinear systems that are finely tuned via the optimization of their associated, nonconvex loss functions. In many cases, the gradient of any such loss function has superlinear growth, making the use of the widely accepted (stochastic) gradient descent methods, based on Euler numerical schemes, problematic. We offer a new learning algorithm based on an appropriately constructed vari...
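The instability alluded to above can be illustrated with a minimal sketch (not taken from the paper): plain gradient descent, i.e. an explicit Euler discretization of the gradient flow, applied to the toy loss f(x) = x^4, whose gradient 4x^3 grows superlinearly. With a fixed step size, iterates started far from the minimizer blow up, while iterates started close to it converge.

```python
def gradient_descent(x0, lr=0.1, steps=100):
    """Euler-scheme gradient descent on f(x) = x**4 (gradient 4*x**3).

    Returns the final iterate and a flag indicating divergence.
    The superlinear gradient growth means a fixed step size is only
    stable in a neighborhood of the minimizer x = 0.
    """
    x = x0
    for _ in range(steps):
        grad = 4.0 * x * x * x   # superlinearly growing gradient
        x = x - lr * grad        # explicit Euler update
        if abs(x) > 1e6:         # iterates have blown up
            return x, True
    return x, False

# Started far from the minimizer, the scheme diverges:
x_far, diverged_far = gradient_descent(3.0)

# Started near the minimizer, the same step size is stable:
x_near, diverged_near = gradient_descent(0.5)
```

Here the step size `lr = 0.1` and the initial points are arbitrary choices for illustration; the qualitative behavior (divergence from large initial values, convergence from small ones) is what motivates replacing the plain Euler scheme.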