A novel method, termed Chebyshev periodical successive over-relaxation (PSOR), for accelerating the convergence of fixed-point iterations is presented. Chebyshev PSOR can be regarded as a variant of successive over-relaxation that uses the inverses of the roots of a Chebyshev polynomial as iteration-dependent PSOR factors. One of the most notable features of the proposed method is that it applies to nonlinear fixed-point iterations in addition to linear ones. Several numerical experiments show that Chebyshev PSOR yields faster convergence for wide classes of linear and nonlinear fixed-point iterations, including proximal gradient methods such as ISTA.
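
As a rough illustration of the idea, the Python sketch below applies periodic over-relaxation factors built from the inverses of shifted Chebyshev roots to a simple linear fixed-point iteration x_{k+1} = A x_k + b. The demo matrix, the period T, and the use of exactly known eigenvalue bounds of I - A are illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

# Minimal sketch of Chebyshev PSOR on a linear fixed-point iteration
# x_{k+1} = f(x_k) = A x_k + b, where I - A has eigenvalues in [mu_min, mu_max].
# The demo problem, period T, and exact eigenvalue bounds are assumptions.

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
P = M @ M.T / n + np.eye(n)           # symmetric positive definite
A = np.eye(n) - 0.1 * P               # contraction: spectral radius < 1
b = rng.standard_normal(n)

def f(x):
    # plain fixed-point map; its unique fixed point solves (I - A) x = b
    return A @ x + b

# Eigenvalue range of I - A (assumed known here; estimated in practice)
mu = np.linalg.eigvalsh(np.eye(n) - A)
mu_min, mu_max = mu.min(), mu.max()

# PSOR factors: inverses of the roots of the degree-T Chebyshev polynomial
# shifted to [mu_min, mu_max], reused periodically every T iterations.
T = 8
roots = (mu_max + mu_min) / 2 + (mu_max - mu_min) / 2 * np.cos(
    (2 * np.arange(T) + 1) * np.pi / (2 * T)
)
omegas = 1.0 / roots

x_star = np.linalg.solve(np.eye(n) - A, b)   # exact fixed point for reference

x_plain = np.zeros(n)
x_psor = np.zeros(n)
for k in range(40):
    x_plain = f(x_plain)                     # ordinary fixed-point iteration
    w = omegas[k % T]                        # periodic over-relaxation factor
    x_psor = x_psor + w * (f(x_psor) - x_psor)

print("plain error:", np.linalg.norm(x_plain - x_star))
print("PSOR  error:", np.linalg.norm(x_psor - x_star))
```

In this toy setup the over-relaxed iterate typically reaches a much smaller residual than the plain iteration after the same number of steps, which is the acceleration effect the abstract describes.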