Approximation methods for piecewise deterministic Markov processes and their costs

ABSTRACT In this paper, we analyse piecewise deterministic Markov processes (PDMPs), as introduced in Davis (1984). Many models in insurance mathematics can be formulated in terms of the general concept of PDMPs. There one is interested in computing certain quantities of interest, such as the probability of ruin or the value of an insurance company. Instead of explicitly solving the related integro-(partial) differential equation (an approach which can be used in only a few special cases), we adapt the problem in a manner that allows us to apply deterministic numerical integration algorithms such as quasi-Monte Carlo rules; this is in contrast to applying random integration algorithms such as Monte Carlo. To this end, we reformulate a general cost functional as a fixed point of a particular integral operator, which allows for iterative approximation of the functional. Furthermore, we introduce a smoothing technique, applied to the integrands involved, in order to use error bounds for deterministic cubature rules. We prove a convergence result for our PDMP approximation, which is of independent interest as it justifies phase-type approximations on the process level. We illustrate the smoothing technique for a risk-theoretic example, and compare deterministic and Monte Carlo integration.
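The abstract's central contrast is between random Monte Carlo integration and deterministic quasi-Monte Carlo rules based on low-discrepancy point sets. A minimal illustrative sketch of that contrast (not the paper's own PDMP algorithm; the integrand, dimension, and sample size are chosen for illustration only) can be written with SciPy's `scipy.stats.qmc` module:

```python
import numpy as np
from scipy.stats import qmc

# Test integrand on [0, 1]^2 with known integral equal to 1:
# f(x) = prod_i (pi/2) * sin(pi * x_i), since int_0^1 (pi/2) sin(pi x) dx = 1.
def f(x):
    return np.prod((np.pi / 2) * np.sin(np.pi * x), axis=1)

n = 1024  # number of integration nodes

# Plain Monte Carlo: i.i.d. uniform points, RMS error of order n^{-1/2}.
rng = np.random.default_rng(0)
mc_est = f(rng.random((n, 2))).mean()

# Quasi-Monte Carlo: scrambled Sobol' points, error of order
# n^{-1} (log n)^d for sufficiently smooth integrands -- which is
# why the paper's smoothing step matters for QMC error bounds.
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
qmc_est = f(sobol.random(n)).mean()

print(f"MC estimate:  {mc_est:.5f}")
print(f"QMC estimate: {qmc_est:.5f}")
```

For a smooth integrand like this one, the quasi-Monte Carlo estimate is typically markedly closer to the true value at the same number of nodes, which is the practical motivation for the deterministic approach taken in the paper.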

Keywords: approximation; Monte Carlo; piecewise deterministic Markov processes

Journal Title: Scandinavian Actuarial Journal
Year Published: 2019
