Towards Better Generalization of Deep Neural Networks via Non-Typicality Sampling Scheme.

Improving the generalization performance of deep neural networks (DNNs) trained by minibatch stochastic gradient descent (SGD) has attracted considerable attention from deep learning practitioners. The standard simple random sampling (SRS) scheme used in minibatch SGD treats all training samples equally in gradient estimation. In this article, we study a new data selection method based on an intrinsic property of the training set to help DNNs achieve better generalization performance. Our theoretical analysis suggests that this new sampling scheme, called the nontypicality sampling scheme, improves the generalization performance of DNNs by biasing the solution toward wider minima, under certain assumptions. We confirm our findings experimentally and show that other variants of minibatch SGD can also benefit from the new sampling scheme. Finally, we discuss an extension of the nontypicality sampling scheme that holds promise to enhance both the generalization performance and the convergence speed of minibatch SGD.
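The abstract does not specify how non-typicality is scored or how sampling probabilities are formed, so the sketch below is only an illustration of the general idea: it contrasts standard simple random sampling (SRS) of minibatches with a hypothetical non-typicality-weighted sampler that uses each sample's distance from the feature mean as a stand-in typicality measure. The function names, the distance-based score, and the temperature parameter are assumptions made for illustration, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 1000 samples with 20 features each.
X = rng.normal(size=(1000, 20))

def srs_minibatch(X, batch_size, rng):
    """Standard simple random sampling (SRS): every sample is equally likely."""
    return rng.choice(len(X), size=batch_size, replace=False)

def nontypicality_minibatch(X, batch_size, rng, temperature=1.0):
    """Hypothetical non-typicality sampling: bias selection toward samples
    that lie far from the dataset mean (a toy proxy for 'non-typical')."""
    # Non-typicality score: Euclidean distance from the feature mean.
    scores = np.linalg.norm(X - X.mean(axis=0), axis=1)
    # Turn scores into sampling probabilities (higher score => more likely);
    # subtract the max before exponentiating for numerical stability.
    weights = np.exp((scores - scores.max()) / temperature)
    probs = weights / weights.sum()
    return rng.choice(len(X), size=batch_size, replace=False, p=probs)

srs_idx = srs_minibatch(X, 64, rng)
nts_idx = nontypicality_minibatch(X, 64, rng)
center_dist = lambda idx: np.linalg.norm(X[idx] - X.mean(axis=0), axis=1).mean()
print("mean distance from center (SRS batch):           ", center_dist(srs_idx))
print("mean distance from center (non-typicality batch):", center_dist(nts_idx))
```

Under this toy score, minibatches drawn by the second function concentrate on samples far from the feature mean; the printed mean distances make that bias visible. Whatever scoring rule the paper actually uses, the interface is the same: replace uniform selection probabilities in minibatch SGD with probabilities derived from a per-sample property of the training set.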

Keywords: neural networks; sampling scheme; generalization performance; deep neural networks; generalization

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2022
