Signal processing applications of semidefinite optimization are often rooted in sum-of-squares representations of nonnegative trigonometric polynomials. Interior-point solvers for semidefinite optimization can handle constraints of this form with a per-iteration complexity that is cubic in the degree of the trigonometric polynomial. The purpose of this paper is to discuss first-order methods with a lower complexity per iteration. The methods are based on generalized proximal operators defined in terms of the Itakura–Saito distance, the Bregman distance generated by the negative logarithm (Burg entropy). The choice of the Itakura–Saito distance is motivated by the fact that the associated generalized projection onto the set of normalized nonnegative trigonometric polynomials can be computed at a cost that is roughly quadratic in the degree of the polynomial. This generalized projection is the key operation in generalized proximal first-order methods that use Bregman distances instead of the squared Euclidean distance. The paper includes numerical results obtained with Auslender and Teboulle's accelerated proximal gradient method for Bregman distances.
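To make the Bregman machinery concrete, the following Python sketch shows the Itakura–Saito distance and the closed-form solution of a single unconstrained Bregman gradient step on the positive orthant. This is an illustration of the general mechanism only, not the paper's generalized projection onto normalized nonnegative trigonometric polynomials; the function names, the quadratic test objective, and the step size are hypothetical choices made for this example.

```python
# Minimal sketch (not the paper's algorithm): one Bregman gradient step
# on the positive orthant with the Itakura-Saito distance, i.e., the
# Bregman distance generated by phi(x) = -sum(log x_i) (Burg entropy).

import numpy as np

def itakura_saito(x, y):
    """Itakura-Saito distance D(x, y) = sum(x/y - log(x/y) - 1).

    Equivalently, D(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    with phi(x) = -sum(log x_i); both x and y must be strictly positive.
    """
    r = x / y
    return np.sum(r - np.log(r) - 1.0)

def bregman_gradient_step(grad_f_y, y, t):
    """Solve x = argmin_x { <grad_f_y, x> + (1/t) * D(x, y) } in closed form.

    Setting the gradient to zero gives, componentwise,
    grad_f_y + (1/t) * (1/y - 1/x) = 0, hence 1/x = 1/y + t * grad_f_y,
    i.e., x = y / (1 + t * y * grad_f_y). The step stays in the positive
    orthant whenever the denominator 1 + t * y * grad_f_y is positive.
    """
    return y / (1.0 + t * y * grad_f_y)

# Usage with a hypothetical quadratic objective f(x) = 0.5 * ||x - b||^2,
# whose gradient at y is simply y - b.
b = np.array([1.0, 2.0, 0.5])
y = np.ones(3)   # current iterate, strictly positive
t = 0.1          # step size (placeholder value)
x_next = bregman_gradient_step(y - b, y, t)
print(x_next, itakura_saito(x_next, y))
```

The closed-form update remains in the positive orthant whenever the denominator is positive, which illustrates the design choice behind such methods: a Bregman kernel adapted to the constraint set can take the place of an explicit Euclidean projection.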