
Incremental learning with a homeostatic self-organizing neural model


Abstract: We present a new self-organized neural model that we term resilient self-organizing tissue (ReST), which can be run as a convolutional neural network, possesses a $C^\infty$ energy function as well as a probabilistic interpretation of neural activities. The latter arises from the constraint of a lognormal activity distribution over time that is enforced during ReST learning. The principal message of this article is that self-organized models in general are, due to their localized learning rule that updates only those units close to the best-matching unit, ideal representation learners for incremental learning architectures. We present such an architecture that uses ReST layers as a building block, benchmark its performance w.r.t. incremental learning on three real-world visual classification problems, and justify the mechanisms implemented in the architecture by dedicated experiments.
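The "localized learning rule" the abstract refers to is the hallmark of self-organizing maps: only the best-matching unit (BMU) and its grid neighbors are pulled toward the input, so learning a new class perturbs distant prototypes very little. The sketch below illustrates this generic SOM-style update in NumPy; it is not the ReST algorithm itself, and the function name, grid shape, and neighborhood parameters are illustrative assumptions.

```python
import numpy as np

def som_update(weights, x, lr=0.1, sigma=1.0):
    """One localized self-organizing update: only units near the
    best-matching unit (BMU) move appreciably toward the input x.
    Generic SOM sketch for illustration, not ReST itself.

    weights: (H, W, D) grid of unit prototypes; x: (D,) input.
    """
    H, W, _ = weights.shape
    # find the best-matching unit on the grid
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), (H, W))
    # Gaussian neighborhood on the grid, centered at the BMU
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    grid_d2 = (ys - bmu[0]) ** 2 + (xs - bmu[1]) ** 2
    h = np.exp(-grid_d2 / (2.0 * sigma ** 2))
    # localized update: far-away units barely change
    weights += lr * h[..., None] * (x - weights)
    return weights, bmu
```

Because the neighborhood function `h` decays rapidly with grid distance, representations learned for earlier data are largely preserved when new inputs arrive, which is the property the abstract argues makes such models natural building blocks for incremental learning.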

Keywords: neural model; incremental learning; self-organizing; homeostatic

Journal Title: Neural Computing and Applications
Year Published: 2019
