
On Consistent Entropy-Regularized k-Means Clustering With Feature Weight Learning: Algorithm and Statistical Analyses

Clusters in real data are often restricted to low-dimensional subspaces rather than the entire feature space. Recent approaches that circumvent this difficulty are often computationally inefficient and lack theoretical justification in terms of their large-sample behavior. This article addresses the problem by introducing an entropy incentive term to efficiently learn feature importance within the framework of center-based clustering. A scalable block-coordinate descent algorithm with closed-form updates is developed to minimize the proposed objective function. Using Vapnik-Chervonenkis (VC) theory, we establish strong consistency of the method along with uniform concentration bounds. The merits of the method are showcased through detailed experimental analysis on toy examples as well as real data clustering benchmarks.
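
To make the algorithmic idea concrete, the following is a minimal Python sketch of entropy-weighted center-based clustering of the kind the abstract describes. It assumes the common objective sum_i sum_l w_l (x_{il} - c_{z_i,l})^2 + lambda * sum_l w_l log w_l, with the feature weights w constrained to the probability simplex; under that constraint the weight update has the closed form w_l proportional to exp(-D_l / lambda), where D_l is the within-cluster dispersion of feature l. This is an illustration under those assumptions, not the authors' exact algorithm, and the function name and parameters (entropy_weighted_kmeans, lam) are hypothetical.

```python
# Hypothetical sketch: entropy-regularized k-means with feature weight
# learning via block-coordinate descent. Not the paper's exact method.
import numpy as np

def entropy_weighted_kmeans(X, k, lam=1.0, n_iter=50, seed=0):
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random init (a careful implementation would use k-means++-style seeding).
    centers = X[rng.choice(n, size=k, replace=False)]
    w = np.full(d, 1.0 / d)  # uniform feature weights on the simplex

    for _ in range(n_iter):
        # 1) Assignment step: nearest center under the weighted squared distance.
        d2 = (X[:, None, :] - centers[None, :, :]) ** 2      # (n, k, d)
        labels = (d2 * w).sum(axis=2).argmin(axis=1)

        # 2) Center step: per-cluster means (closed form).
        for j in range(k):
            pts = X[labels == j]
            if len(pts) > 0:
                centers[j] = pts.mean(axis=0)

        # 3) Weight step: closed form from the entropy term,
        #    w_l ∝ exp(-D_l / lam), a softmax of negative dispersions.
        disp = ((X - centers[labels]) ** 2).sum(axis=0)      # (d,)
        logits = -disp / lam
        logits -= logits.max()                               # numerical stability
        w = np.exp(logits)
        w /= w.sum()

    return labels, centers, w

# Toy usage: two clusters separated in features 0-1, high-variance noise elsewhere.
rng = np.random.default_rng(1)
X = np.r_[rng.normal(0, 1, (100, 5)), rng.normal(4, 1, (100, 5))]
X[:, 2:] = rng.normal(0, 3, (200, 3))  # features 2..4 carry no cluster signal
labels, centers, w = entropy_weighted_kmeans(X, k=2, lam=2.0)
# w should concentrate on the informative features 0 and 1.
```

In this sketch, lam plays the role of the entropy incentive: a large lam pushes the weights toward uniform, while a small lam concentrates weight on the features with the smallest within-cluster dispersion.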

Keywords: entropy-regularized k-means clustering; feature weight learning; consistency

Journal Title: IEEE Transactions on Cybernetics
Year Published: 2022
