
Variable selection and parameter estimation via WLAD–SCAD with a diverging number of parameters



Abstract: In this paper, we focus on variable selection based on weighted least absolute deviation (WLAD) regression with a diverging number of parameters. The WLAD estimator and the smoothly clipped absolute deviation (SCAD) penalty are combined to achieve robust parameter estimation and variable selection in regression simultaneously. Compared with the LAD–SCAD method, the WLAD–SCAD method is resistant to heavy-tailed errors and to outliers in the explanatory variables. Furthermore, we establish consistency and asymptotic normality of the estimators under appropriate regularity conditions. Simulation studies and a real example demonstrate the superiority of the WLAD–SCAD method over other regularization methods in the presence of outliers in the explanatory variables and heavy-tailed error distributions.
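For readers unfamiliar with the construction, the following is a minimal sketch of how a WLAD–SCAD objective is typically written; the exact weighting scheme and tuning choices used by the authors are not given in the abstract, so the details below (in particular the form of the weights w_i and the conventional choice a = 3.7) should be read as standard conventions rather than the paper's specification:

\min_{\beta \in \mathbb{R}^{p_n}} \; Q_n(\beta) = \sum_{i=1}^{n} w_i \,\bigl| y_i - x_i^{\top}\beta \bigr| \;+\; n \sum_{j=1}^{p_n} p_{\lambda_n}\!\bigl(|\beta_j|\bigr),

where the SCAD penalty p_{\lambda}(\cdot) is defined through its derivative

p'_{\lambda}(t) = \lambda \left\{ I(t \le \lambda) + \frac{(a\lambda - t)_{+}}{(a-1)\lambda}\, I(t > \lambda) \right\}, \qquad t > 0, \; a > 2 \; (\text{commonly } a = 3.7).

The weights w_i \ge 0 depend on the explanatory variables and downweight high-leverage observations, which is what makes the estimator robust to outliers in the x-direction; the absolute-deviation loss provides robustness to heavy-tailed errors, and the SCAD penalty shrinks small coefficients exactly to zero, performing variable selection even as p_n grows with n.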

Keywords: WLAD–SCAD; SCAD; diverging number of parameters; variable selection

Journal Title: Journal of The Korean Statistical Society
Year Published: 2017


