
Sparse neural network regression with variable selection



This paper reports on our study of a sparse neural network regression method based on a single-hidden-layer architecture and sparsity-inducing penalties. To determine the size of the network in a data-adaptive way, we adopt the lasso and group lasso penalty functions to simultaneously induce sparsity at the node and predictor levels. We also devise several techniques to improve the performance of the neural network regression estimator. We adopt a B-spline activation function with compact support to identify local trends in the data. In addition, we develop an algorithm based on a node-addition process, in which additional nodes enter the network as a complexity parameter decreases. At each value of the complexity parameter, the algorithm initializes the additional nodes along the direction that best captures the unexplained functional relationship between the response and the predictors. The optimization step is carried out with an efficient coordinate descent algorithm. Numerical studies on simulated and real datasets illustrate that the combination of these devices significantly improves the performance of the neural network estimator.
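The node-level sparsity described in the abstract can be illustrated with the standard group lasso proximal operator (block soft-thresholding) applied to the fan-in weights of each hidden node. This is a minimal NumPy sketch of that one ingredient, not the paper's full coordinate descent algorithm; the matrix shapes, penalty level, and step size below are illustrative assumptions.

```python
import numpy as np

def group_soft_threshold(v, t):
    # Block soft-thresholding operator for the group lasso penalty:
    # shrinks the vector v toward zero and sets it exactly to zero
    # when its Euclidean norm falls below the threshold t.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def prox_group_lasso(W, lam, step):
    # One proximal update on an input-to-hidden weight matrix W
    # (shape: n_predictors x n_nodes). Treating each column -- the
    # fan-in weights of one hidden node -- as a group zeroes out
    # entire nodes, which shrinks the network size; grouping rows
    # instead would remove whole predictors.
    W_new = np.empty_like(W)
    for j in range(W.shape[1]):
        W_new[:, j] = group_soft_threshold(W[:, j], step * lam)
    return W_new

# Toy example: the first node's weights have small norm and are
# pruned entirely; the second node is merely shrunk.
W = np.array([[0.1, 2.0],
              [0.1, 2.0]])
W_sparse = prox_group_lasso(W, lam=1.0, step=0.5)
print(W_sparse[:, 0])  # first node pruned: [0. 0.]
```

As the complexity parameter `lam` decreases, fewer groups fall below the threshold, so more nodes survive; this mirrors the abstract's node-addition path, in which nodes enter the network as the penalty relaxes.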

Keywords: sparse neural network; regression; variable selection

Journal Title: Computational Intelligence
Year Published: 2022



