Nonsynaptic Error Backpropagation in Long-Term Cognitive Networks

We introduce a neural cognitive mapping technique named long-term cognitive network (LTCN) that is able to memorize long-term dependencies between a sequence of input and output vectors, especially in those scenarios that require predicting the values of multiple dependent variables at the same time. The proposed technique is an extension of a recently proposed method named short-term cognitive network that aims at preserving the expert knowledge encoded in the weight matrix while optimizing the nonlinear mappings provided by the transfer function of each neuron. A nonsynaptic, backpropagation-based learning algorithm powered by stochastic gradient descent is put forward to iteratively optimize four parameters of the generalized sigmoid transfer function associated with each neuron. Numerical simulations over 35 multivariate regression and pattern completion data sets confirm that the proposed LTCN algorithm attains statistically significant performance differences with respect to other well-known state-of-the-art methods.
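
The abstract describes a nonsynaptic learning rule: stochastic gradient descent tunes four parameters of a generalized sigmoid transfer function per neuron while the expert-defined weight matrix is left untouched. The sketch below is only an illustration of that idea, not the paper's implementation: the parameterization f(x) = l + (u - l) / (1 + exp(-q(x - h))), the squared-error loss, and all function and variable names are assumptions made here for demonstration.

    import numpy as np

    # Hypothetical sketch of a four-parameter generalized sigmoid and a
    # nonsynaptic SGD step that updates only the transfer-function
    # parameters (l, u, q, h); the synaptic weights are never modified.

    def generalized_sigmoid(x, l, u, q, h):
        """Squash x into [l, u] with slope q and horizontal offset h."""
        return l + (u - l) / (1.0 + np.exp(-q * (x - h)))

    def sgd_step(x, target, params, lr=0.01):
        """One illustrative SGD update of the transfer-function parameters
        under a squared-error loss for a single neuron."""
        l, u, q, h = params
        s = 1.0 / (1.0 + np.exp(-q * (x - h)))   # plain sigmoid term
        y = l + (u - l) * s                      # neuron output
        err = y - target                         # dE/dy for E = 0.5*(y - target)^2
        # Partial derivatives of y with respect to each parameter
        grad_l = 1.0 - s
        grad_u = s
        grad_q = (u - l) * s * (1.0 - s) * (x - h)
        grad_h = -(u - l) * s * (1.0 - s) * q
        grads = np.array([grad_l, grad_u, grad_q, grad_h])
        return params - lr * err * grads

    # Toy usage: fit one neuron's transfer function to a single (input, target) pair
    params = np.array([0.0, 1.0, 1.0, 0.0])   # initial l, u, q, h
    for _ in range(200):
        params = sgd_step(x=0.8, target=0.3, params=params)
    print(params)

Because only l, u, q, and h are adjusted, the weight matrix is preserved, which is the sense in which the abstract calls the learning rule nonsynaptic.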

Keywords: nonsynaptic error backpropagation; long-term cognitive networks

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2020
