Subspace Learning by $\ell^0$-Induced Sparsity

Subspace clustering methods partition data that lie in or close to a union of subspaces in accordance with the subspace structure. Methods with a sparsity prior, such as sparse subspace clustering (SSC) (Elhamifar and Vidal in IEEE Trans Pattern Anal Mach Intell 35(11):2765–2781, 2013) with sparsity induced by the $\ell^1$-norm, have been demonstrated to be effective for subspace clustering. Most of these methods require certain assumptions on the subspaces, e.g., independence or disjointness. However, these assumptions are not guaranteed to hold in practice, and they limit the application of existing sparse subspace clustering methods. In this paper, we propose $\ell^0$-induced sparse subspace clustering ($\ell^0$-SSC). In contrast to the independence or disjointness assumptions required by most existing sparse subspace clustering methods, we prove that $\ell^0$-SSC guarantees the subspace-sparse representation, a key element in subspace clustering, for arbitrary distinct underlying subspaces almost surely under a mild i.i.d. assumption on the data generation. We also present a "no free lunch" theorem showing that obtaining the subspace-sparse representation under our general assumptions cannot be much computationally cheaper than solving the corresponding $\ell^0$ sparse representation problem of $\ell^0$-SSC. A novel approximate algorithm named Approximate $\ell^0$-SSC (A$\ell^0$-SSC) is developed, which employs proximal gradient descent to obtain a sub-optimal solution to the optimization problem of $\ell^0$-SSC with a theoretical guarantee. The sub-optimal solution is used to build a sparse similarity matrix upon which spectral clustering is performed to obtain the final clustering results. Extensive experimental results on various data sets demonstrate the superiority of A$\ell^0$-SSC over other competing clustering methods. Furthermore, we extend $\ell^0$-SSC to semi-supervised learning by performing label propagation on the sparse similarity matrix learnt by A$\ell^0$-SSC, and we demonstrate the effectiveness of the resulting semi-supervised learning method, termed $\ell^0$-sparse subspace label propagation ($\ell^0$-SSLP).
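
The abstract outlines a three-step pipeline: proximal gradient descent on an $\ell^0$-regularized self-representation problem, a sparse similarity matrix built from the resulting coefficients, and spectral clustering on that matrix. The following is a minimal sketch of such a pipeline, not the authors' implementation: it assumes the common self-expressive objective $\min_Z \tfrac{1}{2}\|X - XZ\|_F^2 + \lambda\|Z\|_0$ with $\mathrm{diag}(Z)=0$, whose proximal step for the $\ell^0$ penalty is elementwise hard thresholding; the regularization weight, step size, and iteration count are illustrative, and scikit-learn's SpectralClustering stands in for the spectral clustering step.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def approx_l0_ssc(X, lam=0.1, step=None, n_iter=200):
    """Sketch of an A-l0-SSC-style solver: proximal gradient descent on
    min_Z 0.5*||X - X Z||_F^2 + lam*||Z||_0, with diag(Z) = 0.
    X: (d, n) data matrix whose columns are samples (assumed normalized).
    Returns the (n, n) sparse self-representation coefficient matrix Z.
    """
    d, n = X.shape
    if step is None:
        # Step size from the Lipschitz constant of the smooth term's gradient.
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 + 1e-12)
    Z = np.zeros((n, n))
    # Hard-threshold level for the prox of (step * lam) * ||.||_0.
    thresh = np.sqrt(2.0 * lam * step)
    for _ in range(n_iter):
        grad = X.T @ (X @ Z - X)          # gradient of 0.5*||X - XZ||_F^2
        Z = Z - step * grad               # gradient step
        Z[np.abs(Z) < thresh] = 0.0       # hard thresholding (l0 proximal step)
        np.fill_diagonal(Z, 0.0)          # forbid trivial self-representation
    return Z

def cluster_from_coefficients(Z, n_clusters):
    """Symmetrize the coefficients into a similarity matrix and run spectral clustering."""
    W = np.abs(Z) + np.abs(Z).T
    sc = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                            random_state=0)
    return sc.fit_predict(W)

# Usage (hypothetical): labels = cluster_from_coefficients(approx_l0_ssc(X), n_clusters=k)
```

The hard-thresholding proximal step is what distinguishes this $\ell^0$-regularized iteration from the soft-thresholding used in $\ell^1$-based SSC solvers; the same similarity matrix W could also serve as the graph for the label-propagation extension ($\ell^0$-SSLP) mentioned above.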

Keywords: $\ell^0$-SSC; clustering methods; subspace clustering; subspace; sparse subspace

Journal Title: International Journal of Computer Vision
Year Published: 2018
