
An Edge-based Stochastic Proximal Gradient Algorithm for Decentralized Composite Optimization



Abstract: In this patent, an edge-based stochastic proximal gradient algorithm is developed to solve decentralized composite optimization problems involving a common non-smooth regularization term over an undirected, connected network. In particular, when the decentralized composite optimization problems are high-dimensional and large-scale, as is frequently the case in machine learning and resource allocation, we set each local cost function to the average of a moderate number of local cost subfunctions. The algorithm set forth in the present invention uses an unbiased stochastic averaging gradient, obtained from one randomly selected local cost subfunction, to approximate the true local gradient at each iteration, which greatly reduces the cost of gradient evaluation and the computational complexity. The algorithm is therefore well suited to high-dimensional, large-scale optimization problems.

Algorithm 1: Edge-Based Stochastic Proximal Gradient Algorithm for Decentralized Composite Optimization
1: Initialization: let $x_{i,0} \in \mathbb{R}^n$, $y_i^h = x_{i,0}$ for $i \in \mathcal{V}$, and $\lambda_{ij,0} = 0$ for $j \in \mathcal{P}_i$. Initialize the tables of the local gradients with $\nabla f_i^h(y_i^h)$.
2: For $k = 0, 1, 2, \dots$ do
3: Select $t_i^k$ uniformly at random.
4: Compute the stochastic averaging gradient as
   $g_{i,k} = \nabla f_i^{t_i^k}(x_{i,k}) - \nabla f_i^{t_i^k}(y_i^{t_i^k}) + \frac{1}{q_i} \sum_{h=1}^{q_i} \nabla f_i^h(y_i^h)$.
5: Take $y_i^{t_i^k} = x_{i,k}$ and store $\nabla f_i^{t_i^k}(y_i^{t_i^k}) = \nabla f_i^{t_i^k}(x_{i,k})$ in the corresponding table of the local gradients. All other entries in the table stay unchanged.
6: Update the variable $x_{i,k}$ as
   $z_{i,k+1} = x_{i,k} - \alpha \big( g_{i,k} + \sum_{j \in \mathcal{N}_i} L_{ij} (x_{i,k} - x_{j,k}) + \sum_{j \in \mathcal{P}_i} \sqrt{L_{ij}}\, \lambda_{ij,k}\, \mathrm{sgn}(j - i) + \sum_{j \in \mathcal{N}_i \setminus \mathcal{P}_i} \sqrt{L_{ij}}\, \lambda_{ji,k}\, \mathrm{sgn}(j - i) \big)$,
   $x_{i,k+1} = \arg\min_z \big( R(z) + \frac{1}{2\alpha} \| z - z_{i,k+1} \|^2 \big)$,
   where $\mathrm{sgn}(\cdot)$ denotes the sign function.
7: Update the variable $\lambda_{ij,k}$, $j \in \mathcal{P}_i$, as
   $\lambda_{ij,k+1} = \lambda_{ij,k} + \alpha \sqrt{L_{ij}}\, (x_{i,k+1} - x_{j,k+1})$.
8: End for

Figure 1. Figure 2.
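The gradient-table mechanism of steps 3-5 can be sketched for a single node as follows. This is a minimal illustration, not the patented implementation: the class name `SagaGradient`, the helper `soft_threshold` (which assumes the regularizer $R$ is an $\ell_1$ norm), the toy quadratic subfunctions, and the step size are all illustrative choices.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (assumes R is an l1 norm).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

class SagaGradient:
    """Gradient table for one node, as in steps 3-5 of Algorithm 1.

    Each call evaluates a single randomly selected local cost
    subfunction and combines it with the stored table to form an
    unbiased stochastic averaging gradient.
    """
    def __init__(self, grads, x0):
        self.grads = grads                     # q subfunction gradient callables
        self.q = len(grads)
        self.table = [g(x0) for g in grads]    # init with grad f_i^h(y_i^h)
        self.avg = sum(self.table) / self.q    # running average of the table

    def sample(self, x, rng):
        t = rng.integers(self.q)               # select t uniformly at random
        g_new = self.grads[t](x)
        g = g_new - self.table[t] + self.avg   # unbiased averaging gradient
        # store the fresh gradient; all other table entries stay unchanged
        self.avg = self.avg + (g_new - self.table[t]) / self.q
        self.table[t] = g_new
        return g

# Toy single-node problem: two subfunctions with gradients x-1 and x-3,
# so the averaged cost is minimized at x = 2; R = 0 here (rho = 0).
rng = np.random.default_rng(0)
node = SagaGradient([lambda x: x - 1.0, lambda x: x - 3.0], np.array([0.0]))
x, alpha, rho = np.array([0.0]), 0.1, 0.0
for _ in range(2000):
    g = node.sample(x, rng)
    x = soft_threshold(x - alpha * g, alpha * rho)  # proximal step of step 6
```

With the network coupling and edge dual terms of step 6 dropped, the loop reduces to a single-node proximal stochastic gradient iteration; only one subfunction gradient is evaluated per iteration, yet the table keeps the estimator unbiased with vanishing variance, which is the source of the claimed reduction in gradient-evaluation cost.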
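The edge-based primal-dual updates of steps 6-7 can be sketched on a two-node network. This is a hedged sketch under stated assumptions: exact local gradients are used in place of the stochastic averaging gradient to keep it short, the edge weight `w`, step size `alpha`, and quadratic local costs are illustrative, and the sign conventions follow the reconstructed form of step 6 above.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (step 6's prox, assuming R is l1).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

# Two-node network with one edge (1,2) of weight w; node 1 owns the edge
# dual variable lam. Local costs f1(x) = 0.5(x-1)^2, f2(x) = 0.5(x-3)^2,
# R = 0, so the consensus optimum is x1 = x2 = 2 with lam = -1.
w, alpha, rho = 1.0, 0.1, 0.0
x1, x2, lam = 0.0, 0.0, 0.0
for _ in range(1000):
    g1, g2 = x1 - 1.0, x2 - 3.0               # exact local gradients
    # primal step 6: gradient + Laplacian consensus term + edge dual term
    z1 = x1 - alpha * (g1 + w * (x1 - x2) + np.sqrt(w) * lam * np.sign(2 - 1))
    z2 = x2 - alpha * (g2 + w * (x2 - x1) + np.sqrt(w) * lam * np.sign(1 - 2))
    x1 = float(soft_threshold(z1, alpha * rho))
    x2 = float(soft_threshold(z2, alpha * rho))
    # dual step 7: multiplier ascent on the disagreement across the edge
    lam = lam + alpha * np.sqrt(w) * (x1 - x2)
```

The Laplacian term drives the nodes toward agreement at every iteration, while the edge multiplier accumulates the residual disagreement and corrects the stationary point, so the iterates reach exact consensus at the global minimizer rather than a neighborhood of it.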

Keywords: based stochastic; edge based; composite optimization; decentralized composite; optimization; gradient

Journal Title: International Journal of Control, Automation and Systems
Year Published: 2021


