
Acceleration for proximal stochastic dual coordinate ascent algorithm in solving regularised loss minimisation with ℓ2 norm


An accelerated version of the proximal stochastic dual coordinate ascent (SDCA) algorithm for regularised loss minimisation with the ℓ2 norm is presented: a momentum term is introduced while the strong theoretical guarantees of SDCA are retained. The method is also suitable for key machine learning optimisation problems, including the support vector machine (SVM), multiclass SVM, logistic regression, and ridge regression. In particular, Nesterov's estimate sequence technique is adopted to adjust the weight coefficient dynamically and conveniently. The algorithm is applied to training linear SVMs on large training datasets. Experimental results show that the proposed method achieves competitive classification performance and faster convergence than state-of-the-art algorithms.

Keywords: regularised loss minimisation; proximal stochastic dual coordinate ascent

Journal Title: Electronics Letters
Year Published: 2018


