
DropCircuit: A Modular Regularizer for Parallel Circuit Networks



How to design and train increasingly large neural network models has been an active research topic for several years. However, while there is a large body of work on training deeper and/or wider models, there has been relatively little systematic research on the effective use of wide modular neural networks. Addressing this gap, and in an attempt to reduce lengthy training times, we proposed Parallel Circuits (PCs), a biologically inspired architecture based on the design of the retina. In previous work we showed that this approach fails to maintain generalization performance despite achieving sharp speed gains. To address this issue, and motivated by the way dropout prevents node co-adaptation, in this paper we propose an improvement that extends dropout to the parallel-circuit architecture. The paper provides empirical evidence and multiple insights into this combination. Experiments show promising results: error rates improve in most cases while the speed advantage of the PC approach is maintained.
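To make the core idea concrete, below is a minimal sketch (not the authors' code) of how dropping whole circuits, rather than individual units, could be implemented. It assumes a PC network built as independent columns of layers whose outputs are summed; the circuit count, layer sizes, and drop probability are illustrative assumptions, and the rescaling follows the usual inverted-dropout convention.

```python
import torch
import torch.nn as nn


class ParallelCircuits(nn.Module):
    """Hypothetical parallel-circuit network with DropCircuit-style regularization."""

    def __init__(self, in_dim, hidden_dim, out_dim, n_circuits=4, p_drop=0.5):
        super().__init__()
        self.p_drop = p_drop
        # Each circuit is an independent column of layers (no cross connections).
        self.circuits = nn.ModuleList([
            nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, out_dim),
            )
            for _ in range(n_circuits)
        ])

    def forward(self, x):
        # Shape: (n_circuits, batch, out_dim)
        outputs = torch.stack([circuit(x) for circuit in self.circuits])
        if self.training:
            # Drop entire circuits with probability p_drop and rescale the
            # survivors, analogous to dropout applied at the circuit level.
            keep = (torch.rand(len(self.circuits), 1, 1, device=x.device)
                    > self.p_drop).float()
            outputs = outputs * keep / (1.0 - self.p_drop)
        return outputs.sum(dim=0)


# Example usage: sum the circuit outputs and train as a normal classifier.
model = ParallelCircuits(in_dim=784, hidden_dim=128, out_dim=10)
logits = model(torch.randn(32, 784))
```

At test time the module simply sums all circuit outputs, so no circuits are dropped and no rescaling is applied, mirroring how standard dropout behaves at inference.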

Keywords: DropCircuit; modular regularizer; parallel circuits; circuit networks

Journal Title: Neural Processing Letters
Year Published: 2017



