Multi-view subspace clustering methods that use consensus and complementary principles to learn a shared self-representation matrix or tensor have been applied in many fields. Existing advanced multi-view subspace clustering methods are mainly based on extending low-rank representation from matrices to tensors. However, these tensor optimization methods have two limitations: they cannot retain the local geometric structure of data features residing in multiple nonlinear subspaces, and they model the low-rank structure with the tensor nuclear norm, which leads to an undesirable low-rank approximation. To address these problems, we propose a hyper-Laplacian regularized Nonconvex Low-rank Representation (HNLR) method for multi-view subspace clustering. HNLR uses a hyper-Laplacian regularizer to capture the high-order local geometric structure of each view. In addition, by introducing a nonconvex Laplace function to replace the tensor nuclear norm, HNLR greatly improves the approximation of the global low-rank structure. Based on the alternating direction method of multipliers, we design an effective alternating iteration strategy to optimize the HNLR model. Experimental results on eight real datasets demonstrate the superiority of the proposed method.
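
The abstract does not state the exact form of the hyper-Laplacian term. A minimal sketch of how such a regularizer is commonly written, assuming a per-view self-representation matrix $Z^{(v)}$ and a hypergraph Laplacian $L_h^{(v)}$ built from the features of view $v$ (both symbols are assumptions here, not taken from the paper), is
\[
\sum_{v=1}^{V} \mathrm{Tr}\!\left( Z^{(v)} L_h^{(v)} \left(Z^{(v)}\right)^{\top} \right),
\]
where minimizing the trace term encourages samples connected by the same hyperedge to receive similar representation coefficients, which is how the high-order local geometry of each view is preserved.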
               
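The abstract likewise does not give the exact nonconvex penalty. One standard Laplace-function surrogate for the tensor nuclear norm, sketched here under the assumption that it is applied to the tensor singular values $\sigma_i(\mathcal{Z})$ (e.g., from the t-SVD) with a scale parameter $\gamma > 0$ (both assumptions for illustration), is
\[
\|\mathcal{Z}\|_{\phi} = \sum_{i} \phi_{\gamma}\!\left(\sigma_i(\mathcal{Z})\right), \qquad \phi_{\gamma}(x) = 1 - e^{-x/\gamma}.
\]
Unlike the nuclear norm $\sum_i \sigma_i$, this penalty saturates for large singular values, so dominant components are shrunk less while small ones are suppressed, giving a tighter approximation of the rank function.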