Critical Behavior and Universality Classes for an Algorithmic Phase Transition in Sparse Reconstruction

Recovery of an N-dimensional, K-sparse solution $\mathbf{x}$ from an M-dimensional vector of measurements $\mathbf{y}$ for multivariate linear regression can be accomplished by minimizing a suitably penalized least-mean-square cost $||\mathbf{y}-\mathbf{H}\mathbf{x}||_2^2 + \lambda V(\mathbf{x})$. Here $\mathbf{H}$ is a known matrix and $V(\mathbf{x})$ is an algorithm-dependent sparsity-inducing penalty. For ‘random’ $\mathbf{H}$, in the limit $\lambda \rightarrow 0$ and $M,N,K \rightarrow \infty$, keeping $\rho = K/N$ and $\alpha = M/N$ fixed, exact recovery is possible for $\alpha$ past a critical value $\alpha_c = \alpha(\rho)$. Assuming $\mathbf{x}$ has iid entries, the critical curve exhibits some universality, in that its shape does not depend on the distribution of $\mathbf{x}$. However, the algorithmic phase transition occurring at $\alpha = \alpha_c$ and associated universality classes remain ill-understood from a statistical physics perspective, i.e. in terms of scaling exponents near the critical curve. In this article, we analyze the mean-field equations for two algorithms, Basis Pursuit ($V(\mathbf{x}) = ||\mathbf{x}||_1$) and Elastic Net ($V(\mathbf{x}) = ||\mathbf{x}||_1 + \tfrac{g}{2}||\mathbf{x}||_2^2$), and show that they belong to different universality classes in the sense of scaling exponents, with mean squared error (MSE) of the recovered vector scaling as $\lambda^{4/3}$ and $\lambda$ respectively, for small $\lambda$ on the critical line. In the presence of additive noise, we find that, when $\alpha > \alpha_c$, MSE is minimized at a non-zero value of $\lambda$, whereas at $\alpha = \alpha_c$, MSE always increases with $\lambda$.
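As an illustration of the penalized least-squares recovery described in the abstract, the sketch below sets up a random $\mathbf{H}$, a K-sparse $\mathbf{x}$, and minimizes $||\mathbf{y}-\mathbf{H}\mathbf{x}||_2^2 + \lambda V(\mathbf{x})$ for the two penalties discussed. This is only a minimal numerical sketch, not the paper's mean-field analysis: the proximal-gradient (ISTA) solver, the helper names (soft_threshold, ista), and all numerical values (N, M, K, lambda, g, iteration count) are illustrative assumptions and are not taken from the paper.

```python
# Minimal illustrative sketch (not the paper's mean-field analysis): recover a
# K-sparse x from y = H x by minimizing ||y - H x||_2^2 + lambda * V(x) with a
# standard proximal-gradient (ISTA) iteration. V(x) = ||x||_1 corresponds to the
# Basis Pursuit / LASSO penalty; V(x) = ||x||_1 + (g/2)||x||_2^2 to Elastic Net.
# All numerical values (N, M, K, lambda, g, iteration count) are arbitrary
# choices for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 200, 120, 20            # x in R^N, y in R^M, K nonzero entries
lam, g = 1e-3, 0.5                # penalty strength lambda and Elastic Net g

# Ground truth: K-sparse x with iid Gaussian nonzeros; 'random' measurement matrix H
x_true = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x_true[support] = rng.standard_normal(K)
H = rng.standard_normal((M, N)) / np.sqrt(M)
y = H @ x_true                    # noiseless measurements

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (elementwise soft thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(H, y, lam, g=0.0, n_iter=5000):
    """Minimize ||y - Hx||_2^2 + lam*(||x||_1 + (g/2)||x||_2^2) by ISTA;
    g = 0 recovers the pure l1 (Basis Pursuit / LASSO) penalty."""
    L = 2.0 * np.linalg.norm(H, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * H.T @ (H @ x - y)        # gradient of the quadratic term
        z = x - grad / L
        # prox of (lam/L)*(||.||_1 + (g/2)||.||_2^2): soft-threshold, then shrink
        x = soft_threshold(z, lam / L) / (1.0 + lam * g / L)
    return x

for name, g_val in [("l1 (Basis Pursuit limit)", 0.0), ("Elastic Net", g)]:
    x_hat = ista(H, y, lam, g_val)
    mse = np.mean((x_hat - x_true) ** 2)
    print(f"{name:26s} lambda={lam:.0e}  MSE={mse:.3e}")
```

The example uses $\alpha = M/N = 0.6$ and $\rho = K/N = 0.1$ purely as sample values; probing the scaling behavior discussed in the paper would require studying the MSE as $\lambda \rightarrow 0$ along the critical line $\alpha = \alpha_c(\rho)$.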

Keywords: sparse reconstruction; algorithmic phase transition; critical behavior; universality classes

Journal Title: Journal of Statistical Physics
Year Published: 2019
