
Renewal theory for transient Markov chains with asymptotically zero drift

We solve the problem of the asymptotic behaviour of the renewal measure (Green function) generated by a transient Markov chain $X_n$ of Lamperti type in $\mathbf R$, that is, one whose drift tends to zero at infinity. In this setting, the average time spent by $X_n$ in the interval $(x,x+1]$ is, roughly speaking, the reciprocal of the drift and tends to infinity as $x$ grows. For the first time we present a general approach, relying on a diffusion approximation, for proving renewal theorems for Markov chains. We apply a martingale-type technique and show that the asymptotic behaviour of the renewal measure depends heavily on the rate at which the drift vanishes. Two main cases are distinguished: either the drift of the chain decreases as $1/x$, or much more slowly, say as $1/x^\alpha$ for some $\alpha\in(0,1)$. The intuition behind how the renewal measure behaves in these two cases is entirely different: in the first case $X_n^2/n$ converges weakly to a $\Gamma$-distribution and no law of large numbers is available, whereas in the second case a strong law of large numbers holds for $X_n^{1+\alpha}/n$ and a further normal approximation is available.
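The vanishing-drift picture can be illustrated with a quick Monte Carlo sketch. This is not code from the paper: the nearest-neighbour step rule, the boundary clamping, and all parameters (`c`, `x0`, the interval endpoints) are illustrative assumptions. With drift $c/x$, the mean occupation time of $(x, x+1]$ should grow roughly linearly in $x$, i.e. like the reciprocal of the drift.

```python
import random


def simulate_lamperti_chain(n_steps, c=1.0, x0=10.0, seed=0):
    """Nearest-neighbour chain of Lamperti type on the half-line.

    Steps are +/-1 with P(step = +1 | X = x) = 1/2 + c/(2x), so the drift
    at x is c/x, vanishing at infinity; with unit step variance the chain
    is transient when 2c > 1 (here c = 1 by default).
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        p_up = 0.5 + c / (2.0 * max(x, 2.0 * c))  # clamp keeps p_up in (0, 1)
        x += 1.0 if rng.random() < p_up else -1.0
        x = max(x, 1.0)  # reflect near the origin; only the bulk matters
        path.append(x)
    return path


def occupation_time(path, a):
    """Number of steps the chain spends in the interval (a, a+1]."""
    return sum(1 for x in path if a < x <= a + 1.0)


# Average over independent runs: time spent in (40, 41] should be roughly
# double the time spent in (20, 21] when the drift decreases as 1/x.
runs = [simulate_lamperti_chain(50_000, c=1.0, seed=s) for s in range(50)]
avg_near = sum(occupation_time(p, 20.0) for p in runs) / len(runs)
avg_far = sum(occupation_time(p, 40.0) for p in runs) / len(runs)
```

Averaging over seeds is needed because the occupation time of a single interval in one run is heavily dispersed (of the same order as its mean), which is consistent with the absence of a law of large numbers in the $1/x$ regime.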

Keywords: drift; Markov chains; renewal measure; renewal; renewal theory

Journal Title: Transactions of the American Mathematical Society
Year Published: 2020


