Abstract We consider the approximation of transient (time-dependent) probability distributions of discrete-state continuous-time Markov chains on large, possibly infinite state spaces. A framework for approximate adaptive uniformization is provided, which generalizes the well-known uniformization technique and many of its variants. Based on a birth process and a discrete-time Markov chain, a computationally tractable approximating process/model is constructed. We investigate the theoretical properties of this process and prove that it yields computable lower and upper bounds for the desired transient probabilities. Finally, we discuss different specific ways of performing approximate adaptive uniformization and analyze the corresponding approximation errors. The application is illustrated by an example of a stochastic epidemic model.
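For orientation, the following is a minimal sketch of the standard (non-adaptive) uniformization technique that the abstract's framework generalizes, applied to a small finite CTMC. It is not the authors' adaptive method; the function name, truncation strategy, and error-bound handling are illustrative assumptions. It does, however, show the flavor of the bounds mentioned above: truncating the Poisson-weighted sum gives an entrywise lower bound on the transient distribution, and adding the neglected Poisson tail mass gives an upper bound.

```python
import numpy as np
from scipy.stats import poisson

def uniformization(Q, p0, t, eps=1e-10):
    """Sketch of standard uniformization for a finite CTMC.

    Q   : generator matrix (rows sum to zero)
    p0  : initial distribution (1-D array)
    t   : time horizon
    eps : allowed Poisson tail mass used to choose the truncation point

    Returns (lower, tail): `lower` is an entrywise lower bound on p(t) = p0 expm(Qt),
    and `lower + tail` is an entrywise upper bound.
    """
    Lam = max(-np.diag(Q))                 # uniformization rate Lambda >= max_i |q_ii|
    P = np.eye(Q.shape[0]) + Q / Lam       # DTMC transition matrix P = I + Q/Lambda
    K = int(poisson.ppf(1 - eps, Lam * t)) # truncation point of the Poisson sum
    weights = poisson.pmf(np.arange(K + 1), Lam * t)

    lower = np.zeros_like(p0, dtype=float)
    v = p0.astype(float)
    for k in range(K + 1):
        lower += weights[k] * v            # accumulate Poisson-weighted DTMC distributions
        v = v @ P
    tail = 1.0 - weights.sum()             # neglected Poisson mass bounds the truncation error
    return lower, tail

# Example: two-state chain with rates 1.0 (state 0 -> 1) and 2.0 (state 1 -> 0).
Q = np.array([[-1.0, 1.0],
              [ 2.0, -2.0]])
p0 = np.array([1.0, 0.0])
lower, tail = uniformization(Q, p0, t=1.5)
print(lower, tail)  # lower <= p(1.5) <= lower + tail, entrywise
```

Adaptive variants, as discussed in the paper, replace the single rate Lambda by a step-dependent rate derived from a birth process, which is what makes the approach tractable on large or infinite state spaces.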