We introduce NC-SARAH for non-convex optimization as a practical modification of the original SARAH algorithm, which was developed for convex optimization. NC-SARAH is the first variant to achieve two crucial performance properties at the same time: it allows both flexible minibatch sizes and large step sizes, which yields fast convergence in practice, as verified by experiments. NC-SARAH attains a near-optimal asymptotic convergence rate matching that of the existing SARAH variants SPIDER and SpiderBoost, which either use an order-of-magnitude smaller step size or a fixed minibatch size. For convex optimization, we propose SARAH++, which achieves sublinear convergence for general convex problems and linear convergence for strongly convex problems; we also provide a practical version for which numerical experiments on various datasets show improved performance.
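To make the recursive gradient estimator underlying SARAH-style methods concrete, the following is a minimal Python sketch of a SARAH-type loop: an outer pass that anchors the estimator with a full gradient, followed by inner minibatch steps that update it recursively. The function names, the toy least-squares objective, and all hyperparameter defaults here are illustrative assumptions, not the paper's exact NC-SARAH or SARAH++ procedures.

```python
import numpy as np


def sarah_style(grad_i, n, w0, step_size=0.5, inner_steps=20, epochs=5,
                batch_size=1, rng=None):
    """Sketch of a SARAH-style loop (illustrative, not the paper's algorithm).

    grad_i(w, idx) should return the average gradient of the component
    functions indexed by `idx` at the point `w`; `n` is the number of
    component functions.
    """
    rng = np.random.default_rng(rng)
    w_prev = np.array(w0, dtype=float)
    for _ in range(epochs):
        # Outer step: a full gradient anchors the recursive estimator.
        v = grad_i(w_prev, np.arange(n))
        w = w_prev - step_size * v
        for _ in range(inner_steps):
            idx = rng.choice(n, size=batch_size, replace=False)
            # Recursive update: v_t = g_idx(w_t) - g_idx(w_{t-1}) + v_{t-1}.
            v = grad_i(w, idx) - grad_i(w_prev, idx) + v
            w_prev, w = w, w - step_size * v
    return w


if __name__ == "__main__":
    # Toy least-squares problem: f_i(w) = 0.5 * (a_i @ w - b_i)^2.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(100, 5))
    b = rng.normal(size=100)

    def grad_i(w, idx):
        residual = A[idx] @ w - b[idx]
        return A[idx].T @ residual / len(idx)

    w_hat = sarah_style(grad_i, n=100, w0=np.zeros(5))
    print("final objective:", 0.5 * np.mean((A @ w_hat - b) ** 2))
```

The sketch only shows the estimator's structure; the step size, minibatch size, and inner-loop length that make NC-SARAH and SARAH++ practical are the subject of the paper itself.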