
Generalized Embedding Regression: A Framework for Supervised Feature Extraction.



Sparse discriminative projection learning has attracted much attention due to its good performance in recognition tasks. In this article, a framework called generalized embedding regression (GER) is proposed, which can simultaneously perform low-dimensional embedding and sparse projection learning in a joint objective function with a generalized orthogonal constraint. Moreover, the label information is integrated into the model to preserve the global structure of the data, and a rank constraint is imposed on the regression matrix to explore the underlying correlation structure of the classes. Theoretical analysis shows that GER can obtain the same or an approximate solution as some related methods under special settings.

By utilizing this framework as a general platform, we design a novel supervised feature extraction approach called jointly sparse embedding regression (JSER). In JSER, we construct an intrinsic graph to characterize the intraclass similarity and a penalty graph to indicate the interclass separability. The penalty graph Laplacian is then used as the constraint matrix in the generalized orthogonal constraint to handle interclass marginal points. Moreover, the L2,1-norm is imposed on the regression terms to provide robustness to outliers and variations in the data, and on the regularization term to induce jointly sparse projection learning, leading to interesting semantic interpretability. An effective iterative algorithm is carefully designed to solve the optimization problem of JSER. Theoretically, we prove that the subproblem of JSER is essentially an unbalanced Procrustes problem and can be solved iteratively. The convergence of the designed algorithm is also proved. Experimental results on six well-known data sets indicate the competitive performance and latent properties of JSER.
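The abstract relies on two standard ingredients that are easy to illustrate in isolation: the L2,1-norm, which promotes row-wise (jointly) sparse projections, and the label-driven intrinsic and penalty graphs whose Laplacians encode intraclass similarity and interclass separability. The minimal Python/NumPy sketch below shows one conventional way to compute these pieces. It illustrates the general concepts only, not the authors' implementation; the function names and the unweighted graph construction are assumptions.

import numpy as np

def l21_norm(M):
    # L2,1-norm: sum of the Euclidean norms of the rows of M.
    # Penalizing it drives entire rows toward zero, which is what
    # "jointly sparse" projection learning refers to.
    return np.sum(np.linalg.norm(M, axis=1))

def supervised_graph_laplacians(labels):
    # Intrinsic graph: edges between samples of the same class
    # (intraclass similarity). Penalty graph: edges between samples
    # of different classes (interclass separability). Both graphs
    # are unweighted here for simplicity; L = D - A in each case.
    y = np.asarray(labels)
    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)                 # no self-loops
    diff = 1.0 - same
    np.fill_diagonal(diff, 0.0)
    L_intrinsic = np.diag(same.sum(axis=1)) - same
    L_penalty = np.diag(diff.sum(axis=1)) - diff
    return L_intrinsic, L_penalty

# Toy usage with made-up labels and a random projection matrix.
labels = [0, 0, 1, 1, 2]
L_w, L_p = supervised_graph_laplacians(labels)
W = np.random.randn(4, 3)
print(l21_norm(W), L_w.shape, L_p.shape)

Note that, per the abstract, JSER places the penalty graph Laplacian inside the generalized orthogonal constraint rather than using it as a standalone regularizer, so this sketch only covers the building blocks, not the full optimization.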

Keywords: generalized embedding; regression; framework; supervised feature; embedding regression; feature extraction

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2020


