
A noise-based stabilizer for convolutional neural networks



ABSTRACT Overfitting occurs when one tries to train a large model on a small amount of data. Regularizing a neural network using prior knowledge remains an open research topic, as there is no consensus on how much prior information should be supplied to the network. In this paper, a novel algorithm is introduced that uses regularization to train a neural network without enlarging the dataset. Trivial prior information, the class label, is supplied to the model during training, and Laplace noise is injected into an intermediate layer to improve generalization. The results show a significant improvement in accuracy on standard datasets for a simple Convolutional Neural Network (CNN). The proposed method outperforms previous regularization techniques such as dropout and batch normalization, and it can also be combined with them for further performance gains. On the variants of MNIST, the proposed algorithm achieves an average 48% improvement in test accuracy.
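The abstract describes injecting Laplace noise into an intermediate layer during training only. The paper's own implementation is not reproduced on this page; as a rough sketch of that idea, the noise-injection step could look like the following (the function name, the `scale` parameter, and the NumPy-based formulation are illustrative assumptions, not the authors' code):

```python
import numpy as np

def laplace_noise_layer(activations, scale=0.1, training=True, rng=None):
    """Perturb intermediate activations with zero-mean Laplace noise.

    During training the added noise acts as a stochastic regularizer;
    at inference time the layer reduces to the identity.
    """
    if not training:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    # Sample i.i.d. Laplace noise with the same shape as the activations.
    noise = rng.laplace(loc=0.0, scale=scale, size=activations.shape)
    return activations + noise

# Example: perturb a batch of 4 feature maps of shape (8, 8).
acts = np.ones((4, 8, 8))
noisy = laplace_noise_layer(acts, scale=0.1, rng=np.random.default_rng(0))
```

In a CNN this layer would sit between two convolutional blocks and be switched off at test time, analogous to how dropout is disabled during evaluation.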

Keywords: noise-based stabilizer; convolutional neural network; regularization

Journal Title: Journal of Statistical Computation and Simulation
Year Published: 2019



