
On the Working Principle of the Hopfield Neural Networks and its Equivalence to the GADIA in Optimization



Hopfield neural networks (HNNs) are among the most well-known and widely used neural networks in optimization. In this article, the author focuses on building a deeper understanding of the working principle of the HNN during an optimization process. The investigations yield several novel results giving important insights into the working principle of both continuous and discrete HNNs. This article shows that what the traditional HNN actually does as its energy function decreases is to divide the neurons into two classes in such a way that the sum of biased class volumes is minimized (or maximized), regardless of the type of optimization problem. Introducing neuron-specific class labels, the author concludes that the traditional discrete HNN is actually a special case of the greedy asynchronous distributed interference avoidance algorithm (GADIA) [17] of Babadi and Tarokh for 2-class optimization problems. Computer simulations confirm the findings.
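The greedy, energy-descending behavior the abstract describes can be illustrated with a minimal sketch of a discrete HNN with asynchronous updates. This is not the paper's implementation; the weights, biases, and problem size below are arbitrary assumptions for illustration. Each neuron's bipolar state (±1) can be read as a 2-class label, and each asynchronous update greedily sets a neuron to the sign of its local field, which never increases the energy when the weight matrix is symmetric with a zero diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy instance: symmetric weights W with zero diagonal,
# bipolar neuron states s_i in {-1, +1}, energy E(s) = -1/2 s^T W s - b^T s.
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2.0          # symmetrize
np.fill_diagonal(W, 0.0)     # no self-coupling
b = rng.normal(size=n)
s = rng.choice([-1.0, 1.0], size=n)

def energy(state):
    return -0.5 * state @ W @ state - b @ state

# Greedy asynchronous updates: pick a neuron at random and flip it to the
# sign of its local field. Under the symmetry/zero-diagonal conditions,
# each such update can only decrease (or leave unchanged) the energy.
energies = [energy(s)]
for _ in range(50):
    i = rng.integers(n)
    s[i] = 1.0 if W[i] @ s + b[i] >= 0.0 else -1.0
    energies.append(energy(s))

# The energy trace is monotonically non-increasing.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

Reading the ±1 states as class assignments, each update moves one neuron to whichever of the two classes lowers the energy, which is the same greedy per-user, per-update structure as GADIA's asynchronous assignment step restricted to two classes.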

Keywords: optimization; working principle; Hopfield neural networks; equivalence

Journal Title: IEEE Transactions on Neural Networks and Learning Systems
Year Published: 2020



