This paper analyzes the iteration-complexity of a generalized alternating direction method of multipliers (G-ADMM) for solving separable linearly constrained convex optimization problems. This ADMM variant, first proposed by Bertsekas and Eckstein, introduces a relaxation parameter $\alpha$ into the second ADMM subproblem in order to improve its computational performance. It is shown that, for a given tolerance $\varepsilon > 0$, the G-ADMM with $\alpha \in (0, 2)$ provides, in at most $\mathcal{O}(1/\varepsilon^2)$ iterations, an approximate solution of the Lagrangian system associated with the optimization problem under consideration. It is further shown that, in at most $\mathcal{O}(1/\varepsilon)$ iterations, an approximate solution of the Lagrangian system can be obtained by means of an ergodic sequence associated with a sequence generated by the G-ADMM with $\alpha \in (0, 2]$. Our approach consists of interpreting the G-ADMM as an instance of a hybrid proximal extragradient framework with some special properties. Preliminary numerical experiments are reported to confirm that using $\alpha > 1$ can lead to better numerical performance than $\alpha = 1$ (which corresponds to the standard ADMM).
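To make the relaxation step concrete, the following is a minimal sketch (not the paper's implementation) of G-ADMM applied to a small illustrative instance, the proximal/soft-thresholding problem $\min_{x,z} \tfrac{1}{2}\|x - a\|^2 + \lambda \|z\|_1$ subject to $x - z = 0$. The function name, penalty parameter $\rho$, and iteration count are assumptions chosen for illustration; the relaxation $\hat{x}^{k+1} = \alpha x^{k+1} + (1-\alpha) z^k$ with $\alpha \in (0, 2)$ is the Bertsekas-Eckstein modification described above, and $\alpha = 1$ recovers the standard ADMM.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gadmm_lasso_prox(a, lam, alpha=1.5, rho=1.0, iters=200):
    """Generalized ADMM (relaxation parameter alpha) for
    min_x 0.5*||x - a||^2 + lam*||z||_1  s.t.  x - z = 0.
    alpha in (0, 2); alpha = 1 corresponds to the standard ADMM.
    All parameter defaults here are illustrative choices."""
    x = np.zeros_like(a)
    z = np.zeros_like(a)
    u = np.zeros_like(a)  # scaled dual variable
    for _ in range(iters):
        # First subproblem: quadratic in x, solved in closed form.
        x = (a + rho * (z - u)) / (1.0 + rho)
        # Relaxation step applied before the second subproblem.
        x_hat = alpha * x + (1.0 - alpha) * z
        # Second subproblem: proximal step on the l1 term.
        z = soft_threshold(x_hat + u, lam / rho)
        # Dual update uses the relaxed iterate.
        u = u + x_hat - z
    return z

# The limit point is the soft-thresholding of a at level lam.
print(gadmm_lasso_prox(np.array([3.0, 0.2, -2.0]), lam=0.5))
```

For this instance the known closed-form solution is $\operatorname{soft}(a, \lambda) = (2.5, 0, -1.5)$, so the iterates can be checked against it; with $\alpha > 1$ the same accuracy is typically reached in fewer iterations than with $\alpha = 1$, which is the behavior the experiments in the paper examine.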