Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression

Abstract: Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. In this paper, we propose a generalized version of EP, called generalized EP (GEP), a new approximate inference method based on minimizing a KL divergence, with mixtures of exponential family distributions as the approximating family. Because the required gradients are estimated stochastically, the algorithm may converge slowly when their variance is large; we therefore use control variates to develop a variance-reduced version of the method, GEP-CV. On Bayesian logistic regression, our approach converges faster and performs better than other state-of-the-art approaches.
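
The abstract names the two ingredients of GEP-CV without giving algorithmic detail, so the following is a minimal illustrative sketch, not the authors' implementation: it minimizes KL(q || p) for Bayesian logistic regression with a mean-field Gaussian q (a single exponential-family component rather than the paper's mixtures) via score-function gradients, and applies a zero-mean control variate built from the score to reduce gradient variance. The toy dataset, step size, and sample count are all hypothetical choices.

```python
# Minimal sketch of KL-minimization with a score-function control variate
# (illustrative only; not the paper's GEP-CV algorithm).
import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression data (hypothetical).
N, D = 200, 3
X = rng.normal(size=(N, D))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.uniform(size=N) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def log_joint(w):
    """log p(y, w): standard-normal prior + Bernoulli likelihood."""
    logits = X @ w
    log_lik = np.sum(y * logits - np.logaddexp(0.0, logits))
    log_prior = -0.5 * np.sum(w ** 2)
    return log_lik + log_prior

# Mean-field Gaussian approximation q(w) = N(mu, diag(s^2)).
mu, log_s = np.zeros(D), np.zeros(D)

def grad_estimate(mu, log_s, S=64):
    """Score-function gradient of KL(q || p) w.r.t. mu (up to the
    log-evidence constant), with a control-variate correction."""
    s = np.exp(log_s)
    z = mu + s * rng.normal(size=(S, D))          # samples from q
    # f(z) = log q(z) - log p(y, z); E_q[f] = KL + const, and constants
    # do not bias the estimator because E_q[score] = 0.
    log_q = -0.5 * np.sum(((z - mu) / s) ** 2, axis=1) - np.sum(log_s)
    f = log_q - np.array([log_joint(zi) for zi in z])
    score = (z - mu) / s ** 2                     # d log q / d mu, per sample
    g = f[:, None] * score                        # plain score-function estimator
    # Control variate: the score has zero mean under q, so subtracting
    # a * score keeps the gradient unbiased; a is fit per dimension to
    # minimize the empirical variance of the corrected estimator.
    gc, hc = g - g.mean(0), score - score.mean(0)
    a = np.sum(gc * hc, axis=0) / (np.sum(hc ** 2, axis=0) + 1e-12)
    return (g - a * score).mean(0)

for step in range(500):
    mu -= 0.01 * grad_estimate(mu, log_s)         # descend KL w.r.t. mu

print("approximate posterior mean:", np.round(mu, 2))
```

The correction works because E_q[score] = 0, so subtracting a * score leaves the gradient's expectation unchanged while removing the noise component correlated with the score; fitting a from the same samples introduces a small bias but is standard practice. A full implementation would also update log_s and, per the paper's title, use a mixture of exponential family components for q.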

Keywords: expectation propagation; Bayesian logistic regression; logistic regression

Journal Title: Neurocomputing
Year Published: 2019
