In this paper, we propose a new primal–dual algorithm for minimizing $f(\mathbf{x}) + g(\mathbf{x}) + h(\mathbf{A}\mathbf{x})$, where $f$, $g$, and $h$ are proper lower semicontinuous convex functions, $f$ is differentiable with a Lipschitz continuous gradient, and $\mathbf{A}$ is a bounded linear operator. The proposed algorithm contains several well-known primal–dual algorithms for minimizing the sum of two functions as special cases: it reduces to the Chambolle–Pock algorithm when $f = 0$ and to the proximal alternating predictor–corrector when $g = 0$. For the general convex case, we prove convergence of the new algorithm in terms of the distance to a fixed point by showing that the iteration is a nonexpansive operator. In addition, we prove an $O(1/k)$ ergodic convergence rate in the primal–dual gap. Under additional assumptions, we derive a linear convergence rate in terms of the distance to the fixed point. Compared with other primal–dual algorithms for the same problem, this algorithm extends the range of acceptable parameters that ensure convergence and has a smaller per-iteration cost. Numerical experiments demonstrate the efficiency of the algorithm.
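The abstract does not state the update rule of the proposed algorithm itself. As a rough illustration of the three-function template $f(\mathbf{x}) + g(\mathbf{x}) + h(\mathbf{A}\mathbf{x})$, the sketch below implements the closely related Condat–Vu primal–dual iteration, which handles the same problem structure and likewise reduces to Chambolle–Pock when $f = 0$; it is not the paper's method. The demo objective (a fused-lasso-type problem), the step-size choices, and all function names are assumptions made for illustration.

```python
import numpy as np

# Sketch of the Condat-Vu primal-dual iteration for f(x) + g(x) + h(Ax).
# NOT the paper's algorithm; shown only to illustrate the problem template.


def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def condat_vu(grad_f, prox_g, prox_hstar, A, x0, y0, tau, sigma, n_iter=500):
    """Condat-Vu iteration for min_x f(x) + g(x) + h(Ax).

    Convergence in the convex case requires 1/tau - sigma*||A||^2 >= L/2,
    where L is the Lipschitz constant of grad f.
    """
    x, y = x0.copy(), y0.copy()
    for _ in range(n_iter):
        # primal step: forward step on f, proximal step on g
        x_new = prox_g(x - tau * (grad_f(x) + A.T @ y), tau)
        # dual step: proximal step on h* at an extrapolated primal point
        y = prox_hstar(y + sigma * (A @ (2 * x_new - x)), sigma)
        x = x_new
    return x


# --- demo problem (illustrative assumption): fused-lasso-type objective
#     f(x) = 0.5*||x - b||^2,  g(x) = mu*||x||_1,  h(z) = nu*||z||_1,
#     A = forward finite differences.
rng = np.random.default_rng(0)
n, mu, nu = 50, 0.1, 0.5
b = rng.normal(size=n)
A = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]           # (n-1) x n difference matrix
L = 1.0                                            # grad f(x) = x - b is 1-Lipschitz
norm_A2 = np.linalg.norm(A, 2) ** 2                # ||A||^2
tau = 1.0 / L
sigma = (1.0 / tau - L / 2) / norm_A2              # satisfies the step-size condition

x = condat_vu(
    grad_f=lambda x: x - b,
    prox_g=lambda v, t: soft_threshold(v, mu * t),
    # Moreau identity: prox of (nu*||.||_1)* is projection onto [-nu, nu]
    prox_hstar=lambda u, s: np.clip(u, -nu, nu),
    A=A, x0=np.zeros(n), y0=np.zeros(n - 1),
    tau=tau, sigma=sigma,
)
```

The step sizes above satisfy the standard Condat–Vu condition $1/\tau - \sigma\|\mathbf{A}\|^2 \ge L/2$; the abstract's claim that the proposed algorithm "extends the range of acceptable parameters" refers to relaxing restrictions of exactly this kind.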