Abstract
We present a new self-organized neural model that we term resilient self-organizing tissue (ReST), which can be run as a convolutional neural network and possesses a $C^\infty$ energy function as well as a probabilistic interpretation of neural activities. The latter arises from the constraint of a lognormal activity distribution over time that is enforced during ReST learning. The principal message of this article is that self-organized models in general are, due to their localized learning rule that updates only those units close to the best-matching unit, ideal representation learners for incremental learning architectures. We present such an architecture that uses ReST layers as a building block, benchmark its performance w.r.t. incremental learning on three real-world visual classification problems, and justify the mechanisms implemented in the architecture by dedicated experiments.
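To illustrate the localized learning rule the abstract refers to, here is a minimal sketch of the classic self-organizing-map (SOM) update, in which only units near the best-matching unit (BMU) are adapted. This is not the ReST rule itself; the function name, learning rate, and Gaussian neighborhood below are illustrative assumptions.

```python
import math

def som_step(weights, x, lr=0.1, sigma=1.0):
    """One SOM-style localized update (illustrative, not the ReST rule).

    Finds the best-matching unit (BMU), then pulls only units near the
    BMU on a 1-D grid toward the input x, weighted by a Gaussian
    neighborhood function.
    """
    # BMU = unit whose weight vector is closest (squared distance) to x
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in weights]
    bmu = dists.index(min(dists))
    # Localized update: strength decays with grid distance from the BMU
    for j, w in enumerate(weights):
        h = math.exp(-((j - bmu) ** 2) / (2 * sigma ** 2))
        weights[j] = [wi + lr * h * (xi - wi) for wi, xi in zip(w, x)]
    return bmu
```

Because the neighborhood term `h` is close to zero for units far from the BMU, previously learned weights elsewhere on the grid are barely disturbed, which is the property that makes such models attractive for incremental learning.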
               