
Beyond Backpropagation: Bilevel Optimization Through Implicit Differentiation and Equilibrium Propagation


Abstract: This review examines gradient-based techniques to solve bilevel optimization problems. Bilevel optimization extends the loss minimization framework underlying statistical learning to systems that are implicitly defined through a quantity they minimize. This characterization can be applied to neural networks, optimizers, algorithmic solvers, and even physical systems, and allows for greater modeling flexibility compared to the usual explicit definition of such systems. We focus on solving learning problems of this kind through gradient descent, leveraging the toolbox of implicit differentiation and, applied to this setting for the first time, the equilibrium propagation theorem. We present the mathematical foundations behind such methods, introduce the gradient estimation algorithms in detail, and compare the competitive advantages of the different approaches.
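
For context, the setting the abstract describes can be sketched in standard bilevel notation (which may differ in detail from the paper's own). An inner problem defines the system implicitly, and an outer problem trains it:

    w^*(\theta) = \arg\min_w g(\theta, w), \qquad L(\theta) = f(\theta, w^*(\theta)).

Implicit differentiation applies the implicit function theorem at the inner optimum, where \partial_w g(\theta, w^*) = 0, yielding a hypergradient that requires no backpropagation through the inner solver:

    \nabla L(\theta) = \partial_\theta f(\theta, w^*) - \partial_{\theta w}^2 g(\theta, w^*) \, [\partial_{ww}^2 g(\theta, w^*)]^{-1} \, \partial_w f(\theta, w^*).

Equilibrium propagation instead nudges the inner objective by the outer loss, G(\theta, \beta, w) = g(\theta, w) + \beta f(\theta, w), and recovers the same gradient from two equilibria via a finite difference in \beta:

    \nabla L(\theta) = \lim_{\beta \to 0} \frac{1}{\beta} \left[ \partial_\theta G(\theta, \beta, w_\beta^*) - \partial_\theta G(\theta, 0, w_0^*) \right], \qquad w_\beta^* = \arg\min_w G(\theta, \beta, w).

Below is a minimal, self-contained JAX sketch of both gradient estimators on a toy ridge-regression inner problem. The functions g, f, and inner_solve, and all constants, are illustrative assumptions for this page, not the paper's code.

    import jax
    import jax.numpy as jnp

    # Toy data for the inner (training) problem.
    X = jnp.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    y = jnp.array([1.0, 2.0, 3.0])

    def g(theta, w):
        # Inner loss: ridge regression, strongly convex in w;
        # theta parametrizes the regularization strength.
        return 0.5 * jnp.sum((X @ w - y) ** 2) + 0.5 * jnp.exp(theta) * jnp.sum(w ** 2)

    def f(theta, w):
        # Outer loss evaluated at the inner solution (e.g., a validation error).
        return 0.5 * jnp.sum((X @ w - y) ** 2)

    def inner_solve(theta, beta=0.0, steps=2000, lr=0.1):
        # Gradient descent on the (optionally nudged) inner objective
        # G(theta, beta, w) = g(theta, w) + beta * f(theta, w).
        grad_w = jax.grad(lambda w: g(theta, w) + beta * f(theta, w))
        w = jnp.zeros(2)
        for _ in range(steps):
            w = w - lr * grad_w(w)
        return w

    def ift_hypergradient(theta):
        # Implicit function theorem: solve a linear system in the inner
        # Hessian instead of unrolling the inner optimizer.
        w_star = inner_solve(theta)
        df_dw = jax.grad(f, argnums=1)(theta, w_star)
        H = jax.hessian(g, argnums=1)(theta, w_star)           # d2g/dwdw
        v = jnp.linalg.solve(H, df_dw)                         # H^{-1} df/dw
        cross = jax.jacfwd(jax.grad(g, argnums=1), argnums=0)(theta, w_star)
        return jax.grad(f, argnums=0)(theta, w_star) - cross @ v

    def ep_hypergradient(theta, beta=1e-3):
        # Equilibrium propagation: two equilibria, finite difference in beta.
        w0 = inner_solve(theta, beta=0.0)
        wb = inner_solve(theta, beta=beta)
        dG_dtheta = jax.grad(lambda th, b, w: g(th, w) + b * f(th, w), argnums=0)
        return (dG_dtheta(theta, beta, wb) - dG_dtheta(theta, 0.0, w0)) / beta

    theta = jnp.array(0.0)
    print(ift_hypergradient(theta), ep_hypergradient(theta))

The two estimates should agree up to the O(\beta) finite-difference bias and the tolerance of the inner solver; a symmetric two-sided difference in \beta, as discussed in the equilibrium propagation literature, reduces that bias further.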

Keywords: bilevel optimization; implicit differentiation; equilibrium propagation

Journal Title: Neural Computation
Year Published: 2022


