Dimension reduction plays an essential role in decreasing the complexity of solving large-scale problems. The well-known Johnson–Lindenstrauss (JL) lemma and restricted isometry property (RIP) admit the use of random projection to reduce the dimension while preserving Euclidean distances, which led to the boom of compressed sensing and the field of sparsity-related signal processing. Recently, successful applications of sparse models in computer vision and machine learning have increasingly suggested that the underlying structure of high-dimensional data looks more like a union of subspaces. In this paper, motivated by the JL lemma and the emerging field of compressed subspace clustering, we study for the first time the RIP of Gaussian random matrices for the compression of two subspaces based on the generalized projection $F$-norm distance. We theoretically prove that, with high probability, the affinity and the distance between two projected subspaces are concentrated around their estimates. When the ambient dimension after projection is sufficiently large, the affinity and distance between two subspaces remain almost unchanged after random projection. Numerical experiments verify the theoretical work.
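The claimed concentration can be checked numerically. The following is a minimal sketch, assuming the commonly used definitions of the projection $F$-norm distance, $D(U,V) = \|UU^{\top} - VV^{\top}\|_F / \sqrt{2}$, and of the normalized affinity, $\|U^{\top}V\|_F / \sqrt{\min(k_1, k_2)}$; the paper's exact normalizations and parameter choices may differ, and the dimensions below are illustrative only.

```python
import numpy as np

def subspace_basis(d, k, rng):
    """Orthonormal basis of a random k-dimensional subspace of R^d."""
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    return Q

def proj_fnorm_distance(U, V):
    """Projection F-norm distance between span(U) and span(V)."""
    return np.linalg.norm(U @ U.T - V @ V.T, "fro") / np.sqrt(2)

def affinity(U, V):
    """Normalized affinity ||U^T V||_F / sqrt(min(k1, k2))."""
    return np.linalg.norm(U.T @ V, "fro") / np.sqrt(min(U.shape[1], V.shape[1]))

rng = np.random.default_rng(0)
d, n, k = 1000, 80, 5                 # ambient dim, projected dim, subspace dim (illustrative)
U = subspace_basis(d, k, rng)
V = subspace_basis(d, k, rng)

# Gaussian random projection with i.i.d. N(0, 1/n) entries
A = rng.standard_normal((n, d)) / np.sqrt(n)

# Re-orthonormalize the projected bases before measuring affinity/distance
Up, _ = np.linalg.qr(A @ U)
Vp, _ = np.linalg.qr(A @ V)

print("distance before/after:", proj_fnorm_distance(U, V), proj_fnorm_distance(Up, Vp))
print("affinity before/after:", affinity(U, V), affinity(Up, Vp))
```

With the projected dimension `n` sufficiently large relative to the subspace dimension `k`, the printed before/after values should be close, in line with the concentration phenomenon the paper establishes.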