
Where Does Minimum Error Entropy Outperform Minimum Mean Square Error? A New and Closer Look



The past decade has seen rapid application of information theoretic learning (ITL) criteria to robust signal processing and machine learning problems. The ITL literature generally reports that, under non-Gaussian assumptions, especially when the data are corrupted by heavy-tailed or multi-modal non-Gaussian distributions, information theoretic criteria such as minimum error entropy (MEE) outperform second-order statistical criteria. The objective of this research is to investigate this reported advantage of the MEE criterion over minimum mean square error (MSE). Having found similar results for MEE- and MSE-based methods in non-Gaussian environments under particular conditions, we seek a precise demarcation between this occasional similarity and the occasional outperformance. Based on the theoretical findings, we reveal a better touchstone for when MEE outperforms MSE.
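
As background for the two criteria compared in the abstract, the sketch below (not taken from the paper; the Gaussian kernel width sigma and the toy error samples are illustrative assumptions) contrasts the MSE cost with a Parzen-window estimate of Renyi's quadratic error entropy, the quantity minimized by the MEE criterion. Heavy-tailed outliers inflate the MSE dramatically while changing the entropy estimate far less, which is the intuition behind MEE's reported robustness.

```python
import numpy as np

def mse_loss(e):
    """MSE criterion: average squared error."""
    return np.mean(e ** 2)

def mee_loss(e, sigma=1.0):
    """MEE criterion: Renyi's quadratic entropy of the error,
    H2(e) = -log V(e), with the information potential V(e) estimated
    by a Parzen window using a Gaussian kernel of width sigma."""
    diff = e[:, None] - e[None, :]                      # pairwise error differences
    kernel = np.exp(-diff ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    info_potential = kernel.mean()                      # V(e)
    return -np.log(info_potential)                      # H2(e)

# Toy comparison: clean Gaussian errors vs. errors with heavy-tailed outliers
rng = np.random.default_rng(0)
e_gauss = rng.normal(0.0, 1.0, 200)
e_heavy = np.concatenate([rng.normal(0.0, 1.0, 190), rng.normal(0.0, 20.0, 10)])

for name, e in [("Gaussian", e_gauss), ("heavy-tailed", e_heavy)]:
    print(f"{name:>12}: MSE = {mse_loss(e):7.2f}   MEE entropy = {mee_loss(e):5.2f}")
```

In an adaptive-filtering setting, either quantity would be minimized over the filter parameters; the paper's contribution concerns characterizing when the two minimizers actually differ.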

Keywords: minimum error entropy; minimum mean square error

Journal Title: IEEE Access
Year Published: 2018
