Abstract Traditional nonlinear dimensionality reduction methods, such as multiple kernel dimensionality reduction and nonlinear spectral regression (SR), are generally regarded as extensions of linear discriminant analysis (LDA) in the supervised case. As is well known, LDA rests on the restrictive assumption that the data of each class follow a Gaussian distribution, so the performance of these methods degrades when this assumption does not hold. Although some methods based on marginal Fisher analysis have been proposed to overcome this drawback of LDA, they must solve a generalized eigenvalue decomposition of dense matrices, which is very time-consuming. To address these issues, this paper proposes a marginal Fisher analysis criterion based on the extreme learning machine (ELM) to improve spectral regression and kernel marginal Fisher analysis. It is proved that the proposed marginal Fisher analysis is a special case of traditional kernel marginal Fisher analysis. Based on the proposed criterion, a novel supervised dimensionality reduction algorithm is presented by means of ELM and spectral regression. Experimental results on benchmark datasets confirm that the proposed algorithm outperforms state-of-the-art nonlinear dimensionality reduction methods in supervised scenarios.
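To make the criterion summarized above concrete, the following is a minimal sketch of combining an ELM random feature map with marginal Fisher analysis. It is not the authors' implementation: it uses a direct generalized-eigenvalue solve instead of the paper's spectral-regression formulation, and the function names (`elm_features`, `mfa_scatter`, `elm_mfa`), neighbourhood sizes, and regularizer are hypothetical choices made for illustration.

```python
import numpy as np

def elm_features(X, n_hidden=200, seed=0):
    """Random ELM hidden layer: sigmoid(X @ W + b) with fixed random weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def mfa_scatter(H, y, k_within=5, k_between=5):
    """Intra-class compactness and marginal (between-class) scatter matrices."""
    n = H.shape[0]
    Ww = np.zeros((n, n))   # intrinsic graph: same-class nearest neighbours
    Wb = np.zeros((n, n))   # penalty graph: nearest neighbours from other classes
    dist = np.linalg.norm(H[:, None] - H[None, :], axis=2)
    for i in range(n):
        same = np.where(y == y[i])[0]
        same = same[same != i]
        diff = np.where(y != y[i])[0]
        for j in same[np.argsort(dist[i, same])[:k_within]]:
            Ww[i, j] = Ww[j, i] = 1.0
        for j in diff[np.argsort(dist[i, diff])[:k_between]]:
            Wb[i, j] = Wb[j, i] = 1.0
    Lw = np.diag(Ww.sum(axis=1)) - Ww   # graph Laplacians
    Lb = np.diag(Wb.sum(axis=1)) - Wb
    return H.T @ Lw @ H, H.T @ Lb @ H

def elm_mfa(X, y, n_dims=2, n_hidden=200, reg=1e-3, seed=0):
    """Project ELM features to maximise marginal separability over compactness."""
    H = elm_features(X, n_hidden, seed)
    Sw, Sb = mfa_scatter(H, y)
    # Generalized eigenproblem Sb a = lambda (Sw + reg I) a, solved via inverse.
    M = np.linalg.solve(Sw + reg * np.eye(Sw.shape[0]), Sb)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)[:n_dims]
    A = vecs[:, order].real
    return H @ A, A

if __name__ == "__main__":
    # Toy two-class data purely for demonstration.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (30, 10)), rng.normal(2, 1, (30, 10))])
    y = np.array([0] * 30 + [1] * 30)
    Z, A = elm_mfa(X, y, n_dims=2)
    print(Z.shape)  # (60, 2)
```

The spectral-regression variant described in the abstract would instead obtain the low-dimensional responses from the graph eigenproblem first and then fit the ELM-feature projection by regularized regression, avoiding the dense generalized eigendecomposition in the feature space.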