Spectral clustering (SC) is a high-performing and prevalent technique for data processing and analysis that has attracted significant attention in the clustering community. However, its limited scalability and generalization ability make it prohibitive for large-scale datasets and the out-of-sample extension problem. In this work, we propose a new efficient deep clustering architecture based on SC, named deep SC (DSC) with constrained Laplacian rank (DSCCLR). DSCCLR develops a self-adaptive affinity matrix with a clustering-friendly structure by constraining the rank of the graph Laplacian, which effectively mines the intrinsic relationships among samples. Meanwhile, by introducing a simple fully connected network with an orthogonality constraint on the last layer, DSCCLR learns discriminative representations within a short training time. The proposed method has the following salient properties: 1) it overcomes the limited generalization ability and scalability of existing DSC methods; 2) it explores the intrinsic relationships between samples in the affinity matrix, preserving the latent manifold of the data as much as possible; and 3) it avoids the complexity of eigendecomposition via a simple but effective fully connected network. Extensive empirical results demonstrate the superiority of DSCCLR over 17 other clustering methods.
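The constrained-Laplacian-rank idea rests on a standard spectral-graph fact: for an affinity matrix S with graph Laplacian L = D - S, the multiplicity of the eigenvalue 0 of L equals the number of connected components of the graph, so enforcing rank(L) = n - c makes S encode exactly c clusters. The following is a minimal NumPy sketch of that property on a toy affinity matrix; it is an illustration of the underlying principle, not the authors' implementation, and the helper names are our own.

```python
import numpy as np

def laplacian(S):
    """Unnormalized graph Laplacian L = D - S of a symmetric affinity matrix S."""
    return np.diag(S.sum(axis=1)) - S

# Toy affinity with two disconnected blocks, i.e. an ideal c = 2 cluster structure.
S = np.zeros((4, 4))
S[0, 1] = S[1, 0] = 1.0   # component {0, 1}
S[2, 3] = S[3, 2] = 1.0   # component {2, 3}

L = laplacian(S)
eigvals = np.linalg.eigvalsh(L)

# Number of (near-)zero eigenvalues of L = number of connected components.
n_components = int(np.sum(np.isclose(eigvals, 0.0)))
print(n_components)                   # 2 components
print(np.linalg.matrix_rank(L))       # rank(L) = n - c = 4 - 2 = 2
```

A CLR-style method optimizes S toward this rank-deficient regime so that cluster assignments can be read off the learned affinity directly, without a separate spectral post-processing step.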