
Efficient Gradient Support Pursuit With Less Hard Thresholding for Cardinality-Constrained Learning.


Recently, stochastic hard thresholding (HT) optimization methods [e.g., stochastic variance reduced gradient hard thresholding (SVRGHT)] have become attractive for solving large-scale sparsity/rank-constrained problems. However, they incur much higher HT oracle complexities, especially for high-dimensional data or large-scale matrices. To address this issue, and inspired by the well-known Gradient Support Pursuit (GraSP) method, this article proposes a new Relaxed Gradient Support Pursuit (RGraSP) framework. Unlike GraSP, RGraSP only requires an approximate solution at each iteration. Based on this property of RGraSP, we also present an efficient stochastic variance reduced gradient support pursuit algorithm and its fast version (called SVRGSP+). We prove that the gradient oracle complexity of both our algorithms is half that of SVRGHT. In particular, their HT complexity is about Ks times less than that of SVRGHT, where Ks is the restricted condition number. Moreover, we prove that our algorithms enjoy fast linear convergence to an approximately global optimum, and we also present an asynchronous parallel variant to handle very high-dimensional and sparse data. Experimental results on both synthetic and real-world datasets show that our algorithms yield better results than state-of-the-art gradient HT methods.
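To make the "HT oracle" concrete: hard thresholding projects a vector onto the set of k-sparse vectors by keeping its k largest-magnitude entries. The sketch below is not the paper's RGraSP/SVRGSP+ algorithm; it is a minimal illustration of the HT operator inside plain iterative hard thresholding (a simple full-gradient relative of GraSP and SVRGHT) on a synthetic sparse least-squares problem. All names (`hard_threshold`, the problem sizes, the step size) are illustrative assumptions.

```python
import numpy as np

def hard_threshold(x, k):
    """HT oracle: keep the k largest-magnitude entries of x, zero the rest.

    Methods such as SVRGHT call this projection after every (variance
    reduced) gradient step; reducing how often it is invoked is the
    motivation behind the RGraSP framework described in the abstract.
    """
    out = np.zeros_like(x)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of k largest |x_i|
    out[idx] = x[idx]
    return out

# Illustrative sparsity-constrained least squares: recover a k-sparse
# w_true from noiseless measurements y = A @ w_true.
rng = np.random.default_rng(0)
n, d, k = 100, 50, 5
A = rng.standard_normal((n, d))
w_true = hard_threshold(rng.standard_normal(d), k)
y = A @ w_true

# Iterative hard thresholding: gradient step on 0.5*||A w - y||^2,
# then project back onto the k-sparse set with the HT oracle.
w = np.zeros(d)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for this quadratic loss
for _ in range(500):
    w = hard_threshold(w - step * A.T @ (A @ w - y), k)
```

After the loop, `w` is k-sparse by construction and, in this well-conditioned noiseless setting, close to `w_true`; the stochastic variants in the paper replace the full gradient with cheaper variance reduced stochastic estimates.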

Keywords: gradient support; hard thresholding; support pursuit; stochastic variance

Journal Title: IEEE transactions on neural networks and learning systems
Year Published: 2021


