
An Augmented Game Approach for Design and Analysis of Distributed Learning Dynamics in Multiagent Games.



In this article, an augmented game approach is proposed for the formulation and analysis of distributed learning dynamics in multiagent games. Through the design of the augmented game, the coupling structure of the utility functions among all players can be reformulated over an arbitrary undirected connected network while the Nash equilibria are preserved. In this way, any full-information game learning dynamics can be recast into a distributed form, and its convergence can be determined from the structure of the augmented game. We apply the proposed approach to derive both deterministic and stochastic distributed gradient play and obtain several negative convergence results: 1) a Nash equilibrium that is convergent under the classic gradient play may have a corresponding augmented Nash equilibrium that is not convergent under the distributed gradient play, and, conversely, 2) a Nash equilibrium that is not convergent under the classic gradient play may have a corresponding augmented Nash equilibrium that is convergent under the distributed gradient play. In particular, we show that the variational stability structure of a game (including monotonicity as a special case) is not guaranteed to be preserved in its augmented game. These results provide a systematic methodology for formulating and then analyzing the feasibility of distributed game learning dynamics.
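To make the contrast between full-information and distributed gradient play concrete, the following Python sketch sets up a simple two-player quadratic game and runs (i) classic gradient play, where each player observes the true joint action, and (ii) a consensus-style distributed variant, where each player works only with a local estimate exchanged over a two-node graph. The game, step sizes, and consensus weight are illustrative assumptions; the sketch follows the common consensus-plus-gradient template rather than the augmented-game construction described in the paper.

# Hedged sketch: classic (full-information) gradient play versus a
# consensus-based distributed variant on a toy two-player quadratic game.
# The utilities, step size, and consensus weight are illustrative choices,
# not the paper's augmented-game design.

import numpy as np

# Illustrative utilities (to be maximized):
#   u1(x1, x2) = -(x1 - x2)^2 - x1^2
#   u2(x1, x2) = -(x2 + x1)^2 - x2^2
# Partial gradient of u_i with respect to player i's own action:
def grad1(x1, x2):
    return -2.0 * (x1 - x2) - 2.0 * x1

def grad2(x1, x2):
    return -2.0 * (x2 + x1) - 2.0 * x2

def classic_gradient_play(steps=200, lr=0.1):
    """Each player observes the true joint action at every step."""
    x = np.array([1.0, -1.0])
    for _ in range(steps):
        x = x + lr * np.array([grad1(x[0], x[1]), grad2(x[0], x[1])])
    return x

def distributed_gradient_play(steps=200, lr=0.1, consensus=0.5):
    """Each player keeps a local estimate of the joint action, averages it
    with its neighbour's estimate (two-node graph), and then ascends its own
    gradient evaluated on the local estimate."""
    z = np.array([[1.0, 0.0],    # player 1's estimate of (x1, x2)
                  [0.0, -1.0]])  # player 2's estimate of (x1, x2)
    for _ in range(steps):
        # Consensus step: mix each row with the other player's estimate.
        mixed = (1 - consensus) * z + consensus * z[::-1]
        # Local gradient step: each player updates its own coordinate only,
        # using its own (possibly inaccurate) estimate of the joint action.
        mixed[0, 0] += lr * grad1(mixed[0, 0], mixed[0, 1])
        mixed[1, 1] += lr * grad2(mixed[1, 0], mixed[1, 1])
        z = mixed
    return np.array([z[0, 0], z[1, 1]])

if __name__ == "__main__":
    print("classic    :", classic_gradient_play())
    print("distributed:", distributed_gradient_play())

On this particular toy game both runs settle near the Nash equilibrium at the origin; the abstract's negative results say precisely that such agreement between the full-information dynamics and its distributed counterpart cannot be taken for granted in general.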

Keywords: gradient play; augmented game; game; learning dynamics; approach

Journal Title: IEEE transactions on cybernetics
Year Published: 2022
