An Accelerated Linearly Convergent Stochastic L-BFGS Algorithm

The limited-memory version of the Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) algorithm is the most popular quasi-Newton method in machine learning and optimization. Recently, the stochastic L-BFGS (sL-BFGS) algorithm with variance-reduced stochastic gradients was shown to converge linearly. In this paper, we propose a new sL-BFGS algorithm that incorporates a suitable momentum term. We prove an accelerated linear convergence rate under mild conditions, and experimental results on several data sets verify this acceleration advantage.
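The abstract describes combining an L-BFGS quasi-Newton direction, stochastic gradients, and a momentum term. A minimal sketch of that combination on a least-squares problem is below. This is not the authors' algorithm: it uses plain minibatch gradients in place of the variance-reduced estimator the paper relies on, adds momentum in the simple heavy-ball style, and all function names and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def lbfgs_direction(grad, s_hist, y_hist):
    """Two-loop recursion: multiply grad by the implicit L-BFGS inverse Hessian."""
    q = grad.copy()
    alphas = []
    for s, y in zip(reversed(s_hist), reversed(y_hist)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_hist:  # common initial scaling H0 = (s^T y / y^T y) * I
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_hist, y_hist), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def momentum_slbfgs(A, b, steps=200, batch=8, lr=0.2, beta=0.3, mem=5, seed=0):
    """Stochastic L-BFGS with heavy-ball momentum on f(w) = 0.5*mean((A w - b)^2).

    Illustrative sketch only: plain minibatch gradients stand in for the
    paper's variance-reduced estimator, and hyperparameters are untuned.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    w_prev = w.copy()
    s_hist, y_hist = [], []
    for _ in range(steps):
        idx = rng.choice(n, size=batch, replace=False)
        g = A[idx].T @ (A[idx] @ w - b[idx]) / batch
        direction = lbfgs_direction(g, s_hist, y_hist)
        # quasi-Newton move plus heavy-ball momentum term
        w_new = w - lr * direction + beta * (w - w_prev)
        # curvature pair (s, y) from the same minibatch for consistency
        g_new = A[idx].T @ (A[idx] @ w_new - b[idx]) / batch
        s, y = w_new - w, g_new - g
        if y @ s > 1e-10:  # keep the pair only if curvature is positive
            s_hist.append(s)
            y_hist.append(y)
            if len(s_hist) > mem:  # limited memory: drop the oldest pair
                s_hist.pop(0)
                y_hist.pop(0)
        w_prev, w = w, w_new
    return w
```

On a small synthetic least-squares instance this sketch drives the loss down by orders of magnitude within a few hundred iterations; the paper's contribution is proving that, with variance reduction, such a momentum-augmented sL-BFGS attains an accelerated linear rate.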

Keywords: stochastic bfgs; accelerated linearly; bfgs; bfgs algorithm; linearly convergent

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2019
