
Efficient Sampling of Bernoulli-Gaussian-Mixtures for Sparse Signal Restoration



This paper introduces a new family of prior models called Bernoulli-Gaussian-Mixtures (BGM), aimed at efficiently addressing sparse linear inverse problems and sparse linear regression in the Bayesian framework. The BGM family is based on continuous Location and Scale Mixtures of Gaussians (LSMG), a class that includes a wide range of symmetric and asymmetric heavy-tailed probability distributions. Particular attention is paid to the decomposition of probability laws as Gaussian mixtures, from which a Partially Collapsed Gibbs Sampler (PCGS) for the BGM is derived in a systematic way. The PCGS is shown to be more efficient than the standard Gibbs sampler, both in the number of iterations and in CPU time. Moreover, special attention is paid to BGM models involving a density defined over a real half-line: an asymptotically exact LSMG approximation is introduced, which extends the applicability of the PCGS to cases such as BGM models with non-negative support.
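To give a concrete feel for the kind of sparsity-inducing prior the abstract describes, here is a minimal sketch of drawing samples from a plain Bernoulli-Gaussian prior, where each coefficient is zero with probability 1 − p and Gaussian otherwise. This is only an illustration of the basic prior family, not the paper's BGM construction or its Partially Collapsed Gibbs Sampler; the function name `sample_bg_prior` and the parameter values are ours.

```python
import numpy as np

def sample_bg_prior(n, p=0.1, sigma=1.0, rng=None):
    """Draw n i.i.d. samples from a Bernoulli-Gaussian prior.

    Each entry is x_i = b_i * g_i with b_i ~ Bernoulli(p) and
    g_i ~ N(0, sigma^2), so roughly a fraction p of entries are nonzero.
    (Illustrative sketch; p and sigma are arbitrary here.)
    """
    rng = rng if rng is not None else np.random.default_rng()
    b = rng.random(n) < p                    # support indicators
    g = rng.normal(0.0, sigma, size=n)       # Gaussian amplitudes
    return b * g

x = sample_bg_prior(1000, p=0.1, rng=np.random.default_rng(0))
print(np.count_nonzero(x))  # roughly 100 of 1000 entries are nonzero
```

In a full Bayesian treatment such as the paper's, one would place this kind of prior on the signal in a linear model y = Hx + noise and sample the posterior with a Gibbs-type scheme; collapsing some variables (as in a PCGS) typically improves mixing over the standard Gibbs sampler.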

Keywords: sparse signal; Gaussian mixtures; Bernoulli-Gaussian; efficient sampling

Journal Title: IEEE Transactions on Signal Processing
Year Published: 2022



