A Novel and Efficient Influence-Seeking Exploration in Deep Multi-Agent Reinforcement Learning

Although recent years have witnessed notable success in cooperative multi-agent reinforcement learning (MARL), efficient exploration remains challenging, primarily due to the complex dynamics of inter-agent interactions that give rise to high-dimensional action spaces. For efficient exploration, it is necessary to quantify influences that represent interactions among agents and use them to obtain more information about the complexity of multi-agent systems. In this paper, we propose a novel influence-seeking exploration (ISE) scheme, which encourages agents to preferentially explore action spaces significantly influenced by other agents, thereby speeding up learning. To measure the influence of other agents on action selection, we use the variance of joint action-values over different action sets of the agents, obtained by an estimation technique that reduces computational overhead. To this end, we first present an analytical approach inspired by the concept of approximated variance propagation and then apply it to an exploration scheme. We evaluate the proposed exploration method on a set of StarCraft II micromanagement tasks as well as modified predator-prey tasks. Compared to state-of-the-art methods, the proposed method achieves performance improvements of approximately 10% on StarCraft II micromanagement and 50% on modified predator-prey tasks.
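The following minimal Python sketch illustrates the core idea as described in the abstract: an agent's influence-based exploration bonus is the variance of the joint action-value over the other agents' possible actions, with the agent's own action held fixed. Everything here (the names joint_q and influence_bonus, the toy environment, the sampling estimator) is a hypothetical illustration; in particular, the sampling loop stands in for the paper's analytical variance-propagation approximation, which avoids this sampling cost.

```python
import numpy as np


def influence_bonus(joint_q, my_action, other_action_spaces, n_samples=32, rng=None):
    """Estimate how strongly the other agents influence the value of `my_action`.

    Influence is taken here as the empirical variance of the joint
    action-value when the other agents' actions are resampled while this
    agent's action is held fixed. The paper instead approximates this
    variance analytically (approximated variance propagation).
    """
    rng = rng or np.random.default_rng()
    q_values = []
    for _ in range(n_samples):
        # Draw one action per other agent, uniformly at random.
        others = [rng.choice(space) for space in other_action_spaces]
        q_values.append(joint_q(my_action, others))
    # High variance means this action's value depends strongly on what the
    # other agents do, so it is worth exploring (influence-seeking).
    return float(np.var(q_values))


if __name__ == "__main__":
    # Toy joint action-value: the value of `a` interacts with the others' sum.
    def toy_joint_q(a, others):
        return a * sum(others) + 0.1 * a

    spaces = [[0, 1, 2], [0, 1, 2]]  # two other agents, 3 actions each
    for a in range(3):
        print(f"action {a}: influence bonus = {influence_bonus(toy_joint_q, a, spaces):.3f}")
```

In practice, such a bonus could be added to an agent's action-values before epsilon-greedy selection so that strongly influenced actions are tried more often; the abstract does not specify this wiring, so that detail is an assumption.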

Keywords: agent reinforcement; influence; action; exploration; multi-agent

Journal Title: IEEE Access
Year Published: 2022
