
Recursive Reasoning With Reduced Complexity and Intermittency for Nonequilibrium Learning in Stochastic Games.


In this article, we propose an approach to decision-making in nonequilibrium stochastic games that is efficient in both computation and communication. In particular, due to the inherent complexity of computing Nash equilibria, as well as the innate tendency of agents to choose nonequilibrium strategies, we construct two models of bounded rationality based on recursive reasoning. In the first model, named level-k thinking, each agent assumes that every other agent has the cognitive level immediately below its own and, under this assumption, chooses its policy as a best response to those lower-level agents. In the second model, named cognitive hierarchy, each agent conjectures that the remaining agents have cognitive levels lower than its own, drawn from a distribution rather than fixed deterministically. To explicitly compute the boundedly rational policies, a level-recursive algorithm and a level-paralleled algorithm are constructed, where the latter can achieve lower overall computational complexity. To further reduce the complexity in the communication layer, modifications of the proposed nonequilibrium strategies are presented, which do not require a boundedly rational agent to update its action at every step of the stochastic game. Simulations are performed that demonstrate our results.
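To make the two recursion schemes concrete, the sketch below illustrates level-k thinking and cognitive hierarchy for a two-player matrix game rather than the full stochastic game treated in the paper. The uniform level-0 policy, the Poisson(tau) weights over lower levels, and all function names are illustrative assumptions, not the authors' algorithms; the abstract only requires that cognitive hierarchy use *some* distribution over lower levels.

```python
import math
import numpy as np

def best_response(payoff, opponent_policy):
    """Pure best response to a mixed opponent policy.
    payoff: rows = own actions, columns = opponent actions."""
    expected = payoff @ opponent_policy          # expected payoff of each own action
    br = np.zeros(payoff.shape[0])
    br[np.argmax(expected)] = 1.0
    return br

def level_k_policy(k, payoff_self, payoff_other, level0_self, level0_other):
    """Level-k thinking: each level h > 0 best responds to the
    opponent's level h-1 policy."""
    own, opp = [level0_self], [level0_other]
    for _ in range(k):
        own.append(best_response(payoff_self, opp[-1]))
        opp.append(best_response(payoff_other, own[-2]))
    return own[k]

def cognitive_hierarchy_policy(k, payoff_self, payoff_other,
                               level0_self, level0_other, tau=1.5):
    """Cognitive hierarchy: each level h > 0 best responds to a normalized
    mixture of all lower-level opponent policies (Poisson(tau) weights,
    an illustrative choice of distribution)."""
    own, opp = [level0_self], [level0_other]
    for h in range(1, k + 1):
        w = np.array([tau**j / math.factorial(j) for j in range(h)])
        w /= w.sum()
        opp_mix = sum(wi * pi for wi, pi in zip(w, opp))
        own_mix = sum(wi * pi for wi, pi in zip(w, own))
        own.append(best_response(payoff_self, opp_mix))
        opp.append(best_response(payoff_other, own_mix))
    return own[k]

# Usage: a 2x2 symmetric game with uniformly random level-0 play.
A = np.array([[3.0, 0.0], [5.0, 1.0]])
uniform = np.array([0.5, 0.5])
print(level_k_policy(3, A, A, uniform, uniform))
print(cognitive_hierarchy_policy(3, A, A, uniform, uniform))
```

Both functions build the level policies bottom-up in a single loop, which mirrors why a level-paralleled scheme can beat a naive level-recursive one: each lower-level policy is computed once and reused, rather than recomputed inside every higher-level recursion.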

Keywords: nonequilibrium; recursive reasoning; reduced complexity; level-k thinking; stochastic games

Journal Title: IEEE transactions on neural networks and learning systems
Year Published: 2022


