An Accelerated Maximally Split ADMM for a Class of Generalized Ridge Regression.

Ridge regression (RR) is widely used in machine learning but faces computational challenges in big-data applications. To meet these challenges, this article develops a highly parallel new algorithm, an accelerated maximally split alternating direction method of multipliers (A-MS-ADMM), for a class of generalized RR (GRR) that allows different regularization factors for different regression coefficients. Linear convergence of the new algorithm is established along with its convergence ratio. Optimal algorithm parameters are derived for the GRR with a particular set of regularization factors, and a parameter-selection scheme for the GRR with general regularization factors is also discussed. The new algorithm is then applied to the training of single-layer feedforward neural networks. Experimental results on real-world benchmark datasets for regression and classification, together with comparisons against existing methods, demonstrate the fast convergence, low computational complexity, and high parallelism of the new algorithm.
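
To make the problem setting concrete, the sketch below illustrates the generalized ridge regression formulation the abstract refers to, assuming the standard objective min_w ||y - Xw||^2 + sum_j lambda_j * w_j^2 with a separate regularization factor per coefficient. It uses the plain closed-form solution rather than the paper's A-MS-ADMM; the function name and the synthetic data are illustrative assumptions, not part of the article.

```python
# Minimal sketch of generalized ridge regression (GRR) with per-coefficient
# regularization factors, solved via the standard closed form
#   w = (X^T X + diag(lam))^{-1} X^T y.
# This is NOT the paper's A-MS-ADMM solver; it only shows the problem class.
import numpy as np

def generalized_ridge(X, y, lam):
    """Solve min_w ||y - X w||^2 + sum_j lam[j] * w[j]^2.

    X   : (n, d) design matrix
    y   : (n,) target vector
    lam : (d,) per-coefficient regularization factors
          (ordinary ridge regression is the special case lam = lam0 * ones(d))
    """
    A = X.T @ X + np.diag(lam)          # regularized normal-equations matrix
    return np.linalg.solve(A, X.T @ y)  # coefficient vector w

# Tiny usage example with synthetic data (hypothetical values)
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(100)
lam = np.array([0.1, 0.1, 10.0, 0.1, 0.1])  # penalize the third coefficient more heavily
print(generalized_ridge(X, y, lam))
```

For large numbers of samples or coefficients, forming and solving this regularized normal-equations system directly is the computational bottleneck that motivates splitting methods such as ADMM, which the article's highly parallel, maximally split variant is designed to address.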

Keywords: ridge regression; regression; accelerated maximally split ADMM; new algorithm

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2021
