
Semi-Supervised Feature Selection via Sparse Rescaled Linear Square Regression


With the rapid growth of data sizes, there is an increasing demand for feature selection methods that exploit both labeled and unlabeled data. In this paper, we propose a novel semi-supervised embedded feature selection method. The new method extends the least squares regression model by rescaling the regression coefficients with a set of scale factors, which are used to evaluate the importance of features. An iterative algorithm is proposed to optimize the new model. We prove that solving the new model is equivalent to solving a sparse model with a flexible and adaptable $\ell_{2,p}$-norm regularization. Moreover, the optimal solution of the scale factors provides a theoretical explanation for why the row norms $\lbrace \left\Vert \mathbf{w}^{1} \right\Vert_{2}, \ldots, \left\Vert \mathbf{w}^{d} \right\Vert_{2}\rbrace$ can be used to evaluate the importance of features. Experimental results on eight benchmark data sets show the superior performance of the proposed method.
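The ranking criterion above — scoring feature $i$ by the row norm $\Vert \mathbf{w}^{i} \Vert_{2}$ of a sparsity-regularized regression coefficient matrix — can be illustrated with a minimal sketch. This is a generic supervised $\ell_{2,1}$-regularized least-squares solver using a standard iteratively reweighted scheme; it is not the paper's semi-supervised rescaled algorithm, and the function name and parameters are illustrative assumptions.

```python
import numpy as np

def rank_features_l21(X, Y, alpha=0.5, n_iter=50, eps=1e-8):
    """Sketch: rank features by row norms of W minimizing
    ||X W - Y||_F^2 + alpha * ||W||_{2,1},
    via iteratively reweighted least squares (a standard scheme,
    not the paper's semi-supervised rescaled algorithm)."""
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, refreshed each iteration
    for _ in range(n_iter):
        # Closed-form update for the reweighted problem:
        # W = (X^T X + alpha * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + alpha * D, X.T @ Y)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    scores = np.linalg.norm(W, axis=1)       # {||w^1||_2, ..., ||w^d||_2}
    order = np.argsort(scores)[::-1]         # most important features first
    return order, scores

# Toy usage: only features 0 and 1 generate Y, so they should rank on top.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
W_true = np.zeros((10, 2))
W_true[0] = [3.0, -2.0]
W_true[1] = [1.5, 2.5]
Y = X @ W_true
order, scores = rank_features_l21(X, Y)
```

The $\ell_{2,1}$ penalty drives whole rows of $W$ toward zero, so irrelevant features receive near-zero row norms while informative ones keep large norms, matching the interpretation of the row norms as feature-importance scores.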

Keywords: regression

Journal Title: IEEE Transactions on Knowledge and Data Engineering
Year Published: 2020


