
Approximation of Discounted Minimax Markov Control Problems and Zero-Sum Markov Games Using Hausdorff and Wasserstein Distances

This paper is concerned with a minimax control problem (also known as a robust Markov decision process (MDP) or a game against nature) with general state and action spaces under the discounted cost optimality criterion. We are interested in numerically approximating the value function and an optimal strategy of this general discounted minimax control problem. To this end, we derive structural Lipschitz continuity properties of the solution of this robust MDP by imposing suitable conditions on the model, including Lipschitz continuity of the elements of the model and absolute continuity of the Markov transition kernel with respect to some probability measure $\mu$. We then obtain an approximating minimax control model with finite state and action spaces, and hence computationally tractable, by combining these structural properties with a suitable discretization procedure of the state space (based on a probabilistic criterion) and the action spaces (based on a geometric criterion). Finally, it is shown that the corresponding approximation errors for the value function and the optimal strategy can be controlled in terms of the discretization parameters. These results are also extended to a two-player zero-sum Markov game.
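The abstract gives no implementation details, but once the discretization has produced a finite model, the resulting minimax MDP can be solved by standard minimax value iteration. The sketch below illustrates only that final step, assuming a finite state space of size S, a controller action set of size A, and a nature action set of size B; the function and variable names are illustrative, and the discretization procedure itself (the probabilistic and geometric criteria and the Hausdorff/Wasserstein error bounds) is not shown.

```python
import numpy as np

def minimax_value_iteration(cost, P, gamma=0.95, tol=1e-8, max_iter=10_000):
    """Solve a finite minimax (robust) MDP by value iteration.

    cost : array of shape (S, A, B), the stage cost c(s, a, b)
    P    : array of shape (S, A, B, S), the transition kernel P(s' | s, a, b)
    The controller (index a) minimizes; nature (index b) maximizes.
    Returns the value function V and a minimax-optimal stationary strategy.
    """
    S, A, B = cost.shape
    V = np.zeros(S)
    for _ in range(max_iter):
        # Q[s, a, b] = c(s, a, b) + gamma * E[V(s') | s, a, b]
        Q = cost + gamma * (P @ V)
        # Inner maximization by nature, outer minimization by the controller.
        V_new = Q.max(axis=2).min(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    # Greedy minimax strategy for the controller at the fixed point.
    policy = (cost + gamma * (P @ V)).max(axis=2).argmin(axis=1)
    return V, policy

# Tiny random instance (hypothetical data, for illustration only):
# 5 states, 3 controller actions, 2 nature actions.
rng = np.random.default_rng(0)
S, A, B = 5, 3, 2
cost = rng.random((S, A, B))
P = rng.random((S, A, B, S))
P /= P.sum(axis=-1, keepdims=True)   # normalize rows into a transition kernel
V, policy = minimax_value_iteration(cost, P)
```

Because the discount factor makes the minimax Bellman operator a contraction, this iteration converges geometrically, which is what lets the paper's approximation errors be controlled in terms of the discretization parameters.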

Keywords: discounted minimax; approximation; minimax control; zero-sum Markov games

Journal Title: Dynamic Games and Applications
Year Published: 2019
