Rank–sparsity balanced representation for subspace clustering

Subspace learning has many applications, such as motion segmentation and image recognition. Existing subspace learning algorithms based on the self-expressiveness of samples may suffer from an unsuitable balance between the rank and the sparsity of the representation matrix. In this paper, a new model is proposed that balances rank and sparsity well. The model adopts the log-determinant function to control the rank of the solution, and it penalizes the diagonal entries of the representation matrix rather than strictly constraining them to zero. This strategy makes the rank–sparsity balance more tunable. We furthermore give a new graph construction from the low-rank and sparse solution, which combines the advantages of the graph constructions used in sparse subspace clustering and in low-rank representation for the subsequent clustering step. Numerical experiments show that the new method, named RSBR, significantly increases the accuracy of subspace clustering on the real-world data sets we tested.
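
The abstract describes three ingredients: a log-determinant surrogate for the rank of the representation matrix, a penalty on its diagonal instead of a hard zero constraint, and a symmetric graph built from the learned coefficients for spectral clustering. The sketch below illustrates this pipeline only in outline; the objective's parameter names (lam_sparse, lam_diag, tau) and the ridge-regularized stand-in used to obtain a coefficient matrix are assumptions for illustration, not the authors' RSBR solver.

```python
# Illustrative sketch of a rank-sparsity balanced self-expressive objective and the
# graph construction step. The actual RSBR optimization is not reproduced here.
import numpy as np
from sklearn.cluster import SpectralClustering


def rsbr_objective(X, C, lam_sparse=0.1, lam_diag=1.0, tau=1.0):
    """Evaluate a rank-sparsity balanced objective for a candidate coefficient matrix C.

    - log det(I + C C^T / tau) acts as a smooth surrogate for the rank of C
    - the l1 term encourages sparsity
    - the diagonal of C is penalized (not forced to zero), the tunable balance
      mentioned in the abstract
    (Weights and the exact functional form are assumptions, not the paper's formulation.)
    """
    n = C.shape[0]
    residual = 0.5 * np.linalg.norm(X - X @ C, "fro") ** 2
    _, logdet = np.linalg.slogdet(np.eye(n) + (C @ C.T) / tau)
    sparsity = lam_sparse * np.abs(C).sum()
    diag_pen = lam_diag * np.sum(np.diag(C) ** 2)
    return residual + logdet + sparsity + diag_pen


def cluster_from_coefficients(C, n_clusters):
    """Build a symmetric affinity from the learned representation and spectral-cluster it."""
    W = 0.5 * (np.abs(C) + np.abs(C).T)  # symmetrize the self-expressive coefficients
    model = SpectralClustering(n_clusters=n_clusters, affinity="precomputed", random_state=0)
    return model.fit_predict(W)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two 2-dimensional subspaces embedded in a 10-dimensional space.
    basis1, basis2 = rng.standard_normal((10, 2)), rng.standard_normal((10, 2))
    X = np.hstack([basis1 @ rng.standard_normal((2, 30)),
                   basis2 @ rng.standard_normal((2, 30))])
    # Placeholder representation: ridge-regularized self-expression as a stand-in
    # for the RSBR solver.
    n = X.shape[1]
    C = np.linalg.solve(X.T @ X + 0.1 * np.eye(n), X.T @ X)
    print("objective value:", rsbr_objective(X, C))
    print("cluster labels:", cluster_from_coefficients(C, n_clusters=2))
```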

Keywords: rank sparsity; representation; subspace; subspace clustering

Journal Title: Machine Vision and Applications
Year Published: 2018
