Multi-block separable convex problems have recently received considerable attention. Optimization problems of this type minimize a separable convex objective function subject to linear constraints. A central challenge in algorithmic development is that the classic alternating direction method of multipliers (ADMM), applied directly to problems of this type, is not necessarily convergent. Nevertheless, it has been observed that ADMM applied to such problems numerically outperforms many of its variants that do have guaranteed theoretical convergence. The goal of this paper is to develop convergent and computationally efficient algorithms for solving multi-block separable convex problems. We first characterize the solutions of these optimization problems in terms of the proximity operators of the convex functions appearing in their objectives. Based on this characterization, we then design a class of two-step fixed-point iterative schemes for solving the problems. We further prove convergence of the iterative schemes and show that they achieve an $$O\left( \frac{1}{k}\right)$$ convergence rate in the ergodic sense and in the sense of the partial primal-dual gap, where $$k$$ denotes the iteration number. Moreover, we derive specific two-step fixed-point proximity algorithms (2SFPPA) from the proposed iterative schemes and establish their global convergence. Numerical experiments on the sparse MRI problem demonstrate the efficiency of the proposed 2SFPPA.
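For orientation, a minimal sketch of the standard setting is given below, in generic notation not taken from the paper (the symbols $$f_i$$, $$A_i$$, $$b$$ and the block count $$p$$ are placeholders). A multi-block separable convex problem with linear constraints is typically written as

$$\min_{x_1,\ldots,x_p}\ \sum_{i=1}^{p} f_i(x_i) \quad \text{subject to} \quad \sum_{i=1}^{p} A_i x_i = b,$$

where each $$f_i$$ is convex and acts only on its own block $$x_i$$. The proximity operator used in the solution characterization is the standard one,

$$\mathrm{prox}_{f}(x) = \operatorname*{argmin}_{u}\left\{ f(u) + \tfrac{1}{2}\,\Vert u - x\Vert^2 \right\},$$

applied to the convex functions in the objective; the paper's specific two-step fixed-point scheme built from these operators is not reproduced here.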