In this paper, we propose a regularized alternating direction method of multipliers (RADMM) for a class of nonconvex optimization problems. The algorithm does not require the regularization term to be strictly convex. First, we prove the global convergence of the algorithm. Second, under the condition that the augmented Lagrangian function satisfies the Kurdyka–Łojasiewicz property, the strong convergence of the algorithm is established. Finally, some preliminary numerical results are reported to demonstrate the efficiency of the proposed algorithm.
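To fix ideas, the classical ADMM framework that RADMM builds on can be sketched on a simple convex instance. The following is a minimal illustration only, not the paper's RADMM: it applies standard (scaled-dual) ADMM to the lasso problem min 0.5‖Ax − b‖² + λ‖z‖₁ subject to x = z; the function name `admm_lasso` and all parameter choices are assumptions for this sketch.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=100):
    """Generic scaled-dual ADMM for min 0.5||Ax-b||^2 + lam*||z||_1 s.t. x = z.
    Illustrative sketch only; this is NOT the RADMM of the paper."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    M = AtA + rho * np.eye(n)  # x-update system matrix, fixed across iterations
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u)
        x = np.linalg.solve(M, Atb + rho * (z - u))
        # z-update: soft-thresholding, the prox of (lam/rho)*||.||_1
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent on the constraint x - z = 0
        u = u + x - z
    return x, z
```

Each iteration alternates a quadratic minimization in x, a proximal step in z, and a dual update; the regularized variant studied in the paper adds proximal regularization terms to the subproblems so that convergence can be established for nonconvex objectives under the Kurdyka–Łojasiewicz property.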