Self-Paced Enhanced Low-Rank Tensor Kernelized Multi-View Subspace Clustering

This paper addresses the multi-view subspace clustering problem and proposes the self-paced enhanced low-rank tensor kernelized multi-view subspace clustering (SETKMC) method, which is based on two motivations: (1) the singular values of the representations and the multiple instances should be treated differently, because larger singular values of the representations usually quantify the major information and should be penalized less, and because samples with different degrees of noise may have varying reliability for clustering; (2) many existing methods may suffer degraded performance when multi-view features reside in different nonlinear subspaces, because they usually assume that the multiple features lie within a union of several linear subspaces. SETKMC integrates a nonconvex tensor norm, self-paced learning, and the kernel trick into a unified model for multi-view subspace clustering. The nonconvex tensor norm imposes different weights on different singular values; self-paced learning gradually involves instances from more reliable to less reliable ones; and the kernel trick handles multi-view data lying in nonlinear subspaces. An iterative algorithm based on the alternating direction method of multipliers is proposed. Extensive results on seven real-world datasets show the effectiveness of the proposed SETKMC compared with fifteen state-of-the-art multi-view clustering methods.
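The abstract's first motivation, penalizing larger singular values less, is commonly realized through weighted singular value thresholding. The sketch below is an illustrative NumPy implementation of that idea only, not the paper's actual SETKMC algorithm or its specific nonconvex tensor norm; the inverse-magnitude weighting scheme is an assumption chosen for illustration.

```python
import numpy as np

def weighted_svt(X, tau=1.0, eps=1e-8):
    """Weighted singular value thresholding (illustrative sketch).

    Each singular value is shrunk by a weight inversely proportional
    to its magnitude, so dominant singular values (which carry the
    major information) are penalized less than small, noise-like ones.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Assumed weighting: w_i = tau / (sigma_i + eps), so large
    # singular values receive small shrinkage and vice versa.
    w = tau / (s + eps)
    s_shrunk = np.maximum(s - w, 0.0)
    return U @ np.diag(s_shrunk) @ Vt
```

For example, applying `weighted_svt` with `tau=0.5` to a matrix with singular values 10, 1, and 0.1 leaves the largest nearly untouched (shrunk by 0.05) while driving the smallest to zero, which is the unequal treatment of singular values the abstract argues for.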

Keywords: view; view subspace; multi view; subspace clustering; self paced

Journal Title: IEEE Transactions on Multimedia
Year Published: 2022
