
Blockwise Recursive Moore–Penrose Inverse for Network Learning


Training neural networks with the Moore–Penrose (MP) inverse has recently gained attention owing to its noniterative training nature. However, a significant drawback of learning based on the MP inverse is that memory consumption grows with the size of the dataset. In this article, based on a partitioning of the MP inverse, we propose a blockwise recursive MP inverse (BRMP) formulation for network learning that keeps memory usage low while preserving training effectiveness. BRMP is exactly equivalent to its batchwise counterpart, since neither approximation nor assumption is made in the derivation. Further exploration of this recursive method leads to a switching structure among three different scenarios, which also reveals that the well-known recursive least squares (RLS) method is a special case of the proposed technique. We then apply BRMP to the training of radial basis function networks as well as multilayer perceptrons, with experimental validation covering both regression and classification tasks.
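The abstract does not give the update equations, but the special case it names, recursive least squares, corresponds to BRMP when the hidden-layer matrix has full column rank. Below is a minimal NumPy sketch of that special case only; function and variable names (blockwise_rls, H_blocks, T_blocks) are illustrative and not from the paper, and the full BRMP additionally handles rank-deficient blocks via its three-scenario switching structure.

```python
import numpy as np

def blockwise_rls(H_blocks, T_blocks):
    """Blockwise recursive least squares: the full-column-rank special
    case of BRMP noted in the abstract. Recovers W = pinv(H) @ T one
    data block at a time, so the full matrix H is never held in memory."""
    H0, T0 = H_blocks[0], T_blocks[0]
    P = np.linalg.inv(H0.T @ H0)        # assumes H0 has full column rank
    W = P @ (H0.T @ T0)
    for Hk, Tk in zip(H_blocks[1:], T_blocks[1:]):
        # Woodbury identity updates P = (H^T H)^{-1} with the new block
        S = np.linalg.inv(np.eye(Hk.shape[0]) + Hk @ P @ Hk.T)
        P -= P @ Hk.T @ S @ Hk @ P
        W += P @ Hk.T @ (Tk - Hk @ W)   # correct W toward the new block
    return W

# The recursive result matches the batch Moore-Penrose solution exactly
rng = np.random.default_rng(0)
H = rng.standard_normal((300, 20))      # e.g. hidden-layer outputs of an RBF net
T = rng.standard_normal((300, 3))       # training targets
W_rec = blockwise_rls(np.array_split(H, 5), np.array_split(T, 5))
assert np.allclose(W_rec, np.linalg.pinv(H) @ T)
```

The final assertion illustrates the equivalence claim in the abstract: processing the data in five blocks yields the same weights as the batchwise MP-inverse solution, while only one block of H is touched per step.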

Keywords: Moore–Penrose inverse; network learning; blockwise recursive

Journal Title: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Year Published: 2022
