Learnable Markov Chain Monte Carlo Sampling Methods for Lattice Gaussian Distribution

As a key ingredient of machine learning and artificial intelligence, sampling algorithms for the lattice Gaussian distribution have emerged as an important problem in the coding and decoding of wireless communications. In this paper, building on conventional Gibbs sampling, the learnable delayed Metropolis-within-Gibbs (LDMWG) sampling algorithm is proposed to improve convergence performance; it takes full advantage of the acceptance mechanism of the Metropolis-Hastings (MH) algorithm within the framework of Markov chain Monte Carlo (MCMC) methods. A candidate rejected by the acceptance mechanism is used as learnable experience to generate a new candidate within the same Markov move. In this way, the overall probability that the Markov chain remains in the same state is greatly reduced, which improves convergence performance in the sense of Peskun ordering. Moreover, to reduce the complexity cost of Markov mixing, a symmetric sampling structure that greatly simplifies the sampling operation is further introduced, yielding the symmetric learnable delayed Metropolis-within-Gibbs (SLDMWG) sampling algorithm. Finally, simulation results for multiple-input multiple-output (MIMO) detection confirm the convergence gain and the complexity reduction brought by the proposed sampling schemes.
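
The abstract describes two-stage ("delayed") Metropolis-within-Gibbs moves targeting a lattice Gaussian distribution, where a rejected first candidate informs a second proposal within the same Markov move. The Python sketch below is only an illustrative reconstruction of that idea under simplifying assumptions, not the paper's LDMWG or SLDMWG algorithms: it uses a symmetric +/-1 random-walk proposal per coordinate rather than the conditional discrete-Gaussian proposals typical of Gibbs sampling for lattice Gaussians, and the second-stage acceptance rule follows the generic delayed-rejection construction. The function names (log_target, delayed_mwg_sweep) and the toy B, c, sigma values are made up for illustration.

import numpy as np

rng = np.random.default_rng(0)

def log_target(x, B, sigma, c):
    # Unnormalized log-density of the lattice Gaussian
    # D_{Lambda,sigma,c}(x) proportional to exp(-||B x - c||^2 / (2 sigma^2)), x in Z^n.
    r = B @ x - c
    return -float(r @ r) / (2.0 * sigma ** 2)

def delayed_mwg_sweep(x, B, sigma, c):
    # One sweep: a two-stage (delayed-rejection) MH update of each coordinate.
    x = x.copy()
    for i in range(len(x)):
        lp_x = log_target(x, B, sigma, c)

        # Stage 1: symmetric +/-1 random-walk proposal on coordinate i.
        y1 = x.copy()
        d = rng.choice([-1, 1])
        y1[i] += d
        lp_y1 = log_target(y1, B, sigma, c)
        a1 = min(1.0, np.exp(lp_y1 - lp_x))
        if rng.random() < a1:
            x = y1
            continue

        # Stage 2: the rejected candidate y1 guides a second proposal y2
        # (here, deterministically one more step in the same direction).
        y2 = x.copy()
        y2[i] += 2 * d
        lp_y2 = log_target(y2, B, sigma, c)
        # Delayed-rejection acceptance ratio (symmetric first stage,
        # deterministic second stage), which keeps the target invariant.
        a1_rev = min(1.0, np.exp(lp_y1 - lp_y2))      # alpha_1(y2 -> y1)
        num = np.exp(lp_y2 - lp_x) * (1.0 - a1_rev)
        den = 1.0 - a1
        if den > 0.0 and rng.random() < min(1.0, num / den):
            x = y2
    return x

# Toy 2x2 setup loosely resembling a small MIMO detection problem (values are arbitrary).
B = np.array([[1.0, 0.4], [0.3, 1.2]])
c = np.array([0.7, -1.1])
x = np.zeros(2, dtype=int)
for _ in range(1000):
    x = delayed_mwg_sweep(x, B, sigma=0.8, c=c)
print("final sample:", x)

Because the second stage only fires after a rejection, the probability of the chain staying in the same state can only decrease relative to plain Metropolis-within-Gibbs, which is the intuition behind the convergence gain the abstract attributes to Peskun ordering.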

Keywords: lattice Gaussian; Markov chain Monte Carlo; Gaussian distribution

Journal Title: IEEE Access
Year Published: 2019
