The limited-memory version of the Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm is the most popular quasi-Newton algorithm in machine learning and optimization. Recently, it was shown that the stochastic L-BFGS (sL-BFGS) algorithm with a variance-reduced stochastic gradient converges linearly. In this paper, we propose a new sL-BFGS algorithm by incorporating a suitable momentum term. We prove an accelerated linear convergence rate under mild conditions. Experimental results on several data sets also verify this acceleration.
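The abstract does not give the paper's exact update rule, so the following is a minimal sketch of the general recipe it describes: an SVRG-style variance-reduced stochastic gradient, the standard L-BFGS two-loop recursion over a small history of curvature pairs, and a heavy-ball momentum term applied to the quasi-Newton direction. The function names (`momentum_svrg_lbfgs`, `grad_i`, `full_grad`) and all step-size, momentum, and memory parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def two_loop_recursion(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: approximates H^{-1} @ grad
    from the stored curvature pairs (s_k, y_k), newest last."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian scaling gamma = (s^T y) / (y^T y) from the newest pair.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * (y @ q)
        q += (alpha - beta) * s
    return q

def momentum_svrg_lbfgs(grad_i, full_grad, w0, n, lr=0.05, beta=0.9,
                        epochs=10, inner=100, batch=10, memory=5, seed=None):
    """Hypothetical sketch (not the paper's algorithm): sL-BFGS with an
    SVRG-style variance-reduced gradient and heavy-ball momentum.

    grad_i(w, idx): minibatch gradient over sample indices idx
    full_grad(w):   full gradient over all n samples
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    v_mom = np.zeros_like(w)            # momentum buffer
    s_list, y_list = [], []             # limited-memory curvature pairs
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)          # full gradient at the SVRG snapshot
        for _ in range(inner):
            idx = rng.integers(0, n, size=batch)
            # Variance-reduced gradient: g = grad_i(w) - grad_i(w_snap) + mu
            g = grad_i(w, idx) - grad_i(w_snap, idx) + mu
            d = two_loop_recursion(g, s_list, y_list)  # quasi-Newton direction
            v_mom = beta * v_mom + d                   # heavy-ball momentum
            w_new = w - lr * v_mom
            # Curvature pair from the same minibatch; skip if s^T y is tiny,
            # so the inverse-Hessian approximation stays positive definite.
            s = w_new - w
            y = grad_i(w_new, idx) - grad_i(w, idx)
            if s @ y > 1e-10:
                s_list.append(s)
                y_list.append(y)
                if len(s_list) > memory:
                    s_list.pop(0)
                    y_list.pop(0)
            w = w_new
    return w
```

Two design choices in this sketch are worth noting, though the paper may do both differently: the variance-reduced estimator evaluates `grad_i` at `w` and at the snapshot on the *same* minibatch so the two terms are correlated and the correction actually reduces variance, and the curvature pair (s, y) is only stored when s^T y > 0, the usual safeguard in stochastic quasi-Newton methods.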
               