Covariance matrices have attracted increasing attention for data representation in many computer vision tasks. Nonsingular covariance matrices are regarded as points on Riemannian manifolds rather than points in a Euclidean space. A common technique for classification on Riemannian manifolds is to embed the covariance matrices into a reproducing kernel Hilbert space (RKHS) and then construct a map from the RKHS to a Euclidean space; however, in most kernel-based methods this explicit map relies only on a linear hypothesis. In this paper, we propose a subspace learning framework that projects Riemannian manifolds into a Euclidean space, together with its theoretical derivation. Specifically, the Euclidean space is isomorphic to a subspace of the RKHS. Under this framework, we first define an improved Log-Euclidean Gaussian radial basis function (RBF) kernel for the embedding, in which first-order statistical features of the input images are incorporated into the kernel function to increase its discriminative power. We then seek the optimal projection matrix of the RKHS subspace by conducting a graph embedding discriminant analysis. Texture recognition and object categorization experiments with region covariance descriptors demonstrate the effectiveness of the improved Log-Euclidean Gaussian RBF kernel and the proposed method.
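As a concrete illustration, the sketch below computes the standard Log-Euclidean Gaussian RBF kernel, k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2σ^2)), between two region covariance descriptors. This is a minimal sketch under stated assumptions: the helper `spd_logm` and the bandwidth `sigma` are illustrative, and the paper's improved kernel, which additionally folds first-order statistical features into the kernel function, is not reproduced here.

```python
import numpy as np

def spd_logm(C):
    """Matrix logarithm of a symmetric positive-definite (SPD) matrix
    via eigendecomposition: C = V diag(w) V^T  =>  log(C) = V diag(log w) V^T."""
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def log_euclidean_gaussian_kernel(X, Y, sigma=1.0):
    """Standard Log-Euclidean Gaussian RBF kernel between SPD matrices:
    k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2 * sigma^2))."""
    d = np.linalg.norm(spd_logm(X) - spd_logm(Y), ord='fro')
    return np.exp(-d**2 / (2.0 * sigma**2))

# Toy usage: 5x5 region covariance descriptors built from random
# 5-dimensional pixel features of two hypothetical image regions.
rng = np.random.default_rng(0)
Cx = np.cov(rng.standard_normal((5, 100)))
Cy = np.cov(rng.standard_normal((5, 100)))
print(log_euclidean_gaussian_kernel(Cx, Cy, sigma=2.0))
```

Because the matrix logarithm flattens the SPD manifold into the (Euclidean) space of symmetric matrices, the resulting Gaussian kernel is positive definite and thus yields a valid RKHS embedding, which is the property the subsequent subspace learning step relies on.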