Harris chain

In the mathematical study of stochastic processes, a Harris chain is a Markov chain where the chain returns to a particular part of the state space an unbounded number of times.[1] Harris chains are regenerative processes and are named after Theodore Harris.

Definition

A Markov chain {Xn} on state space Ω with stochastic kernel K is a Harris chain[2] if there exist A, B ⊆ Ω, ε > 0, and a probability measure ρ with ρ(B) = 1 such that

  1. If τA := inf {n ≥ 0 : Xn ∈ A}, then P(τA < ∞ | X0 = x) > 0 for all x ∈ Ω.
  2. If x ∈ A and C ⊆ B, then K(x, C) ≥ ερ(C).

In the case of a process with a countable state space, this technical definition can be rephrased as follows: given any two points x1 and x2 in A, there is at least an ε chance that they can be moved to the same point at the next time step.

Another way to say it: suppose x and y are both in A. At the next time step, first flip a Bernoulli coin with parameter ε. If it comes up one, move both points to a common location chosen according to ρ. If it comes up zero, the points move independently, with x moving according to P(Xn+1 ∈ C | Xn = x) = (K(x, C) − ερ(C))/(1 − ε) and y moving according to P(Yn+1 ∈ C | Yn = y) = (K(y, C) − ερ(C))/(1 − ε); the factor 1 − ε normalizes the residual kernel K(·, C) − ερ(C), which has total mass 1 − ε, back into a probability measure.
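As an illustration, the following is a minimal Python sketch of the single-chain version of this splitting step on a finite state space; the transition matrix K, small set A, minorizing measure rho, and constant eps below are made-up placeholders rather than anything taken from the article.

    import numpy as np

    def split_chain_step(x, K, A, rho, eps, rng):
        """One step of the splitting construction sketched above.

        K   : (n, n) transition matrix, K[x, y] = P(X_{n+1} = y | X_n = x)
        A   : set of states playing the role of the small set
        rho : probability vector with K[x, :] >= eps * rho for every x in A
        eps : the minorization constant from condition (2)
        """
        if x in A:
            if rng.random() < eps:
                # Bernoulli(eps) came up one: regenerate from rho,
                # independently of the current state x.
                return rng.choice(len(rho), p=rho)
            # Bernoulli(eps) came up zero: move according to the residual
            # kernel (K(x, .) - eps * rho(.)) / (1 - eps).
            residual = (K[x] - eps * rho) / (1.0 - eps)
            return rng.choice(len(residual), p=residual)
        # Outside A the chain simply moves according to K.
        return rng.choice(K.shape[1], p=K[x])

    # Toy data satisfying K[x, :] >= eps * rho for every x in A.
    rng = np.random.default_rng(0)
    K = np.array([[0.5, 0.4, 0.1],
                  [0.2, 0.3, 0.5],
                  [0.4, 0.1, 0.5]])
    A = {0, 1}
    rho = np.array([0.0, 0.5, 0.5])
    eps = 0.2
    x = 0
    for _ in range(10):
        x = split_chain_step(x, K, A, rho, eps, rng)

When two copies of the chain are in A and take the first branch simultaneously, they land on the same ρ-distributed point and coalesce; this is the coupling described above.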

Examples

Example 1: Countable state space

Given a countable set S and a pair (A′, B′) satisfying (1) and (2) in the above definition, we can without loss of generality take B′ to be a single point b: since S is countable and ρ(B′) = 1, some b ∈ B′ has ρ({b}) > 0, and (2) continues to hold with the point mass at b in place of ρ and ε replaced by ερ({b}). Upon setting A = {b}, pick c such that K(b, c) > 0 and set B = {c}. Then (1) and (2) hold with A and B as singletons.
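This reduction can be checked mechanically. The sketch below uses an arbitrary 3-state transition matrix, purely as an illustrative assumption, and verifies condition (2) with A = {b}, B = {c}, ε = K(b, c), and ρ the point mass at c.

    import numpy as np

    # Illustrative transition matrix on S = {0, 1, 2}; not from the article.
    K = np.array([[0.1, 0.6, 0.3],
                  [0.5, 0.0, 0.5],
                  [0.2, 0.4, 0.4]])

    b = 0                        # plays the role of the single point b
    c = int(np.argmax(K[b]))     # any c with K(b, c) > 0 would do
    eps = K[b, c]
    rho = np.zeros(K.shape[0])
    rho[c] = 1.0                 # rho is the point mass at c, so rho(B) = 1

    # Condition (2) with A = {b}, B = {c}: K(b, {c}) >= eps * rho({c}).
    assert K[b, c] >= eps * rho[c]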

Example 2: Chains with continuous densities

Let {Xn}, Xn ∈ Rd, be a Markov chain with a kernel that is absolutely continuous with respect to Lebesgue measure:

K(x, dy) = K(x, y) dy

such that K(x, y) is a continuous function.

Pick (x0, y0) such that K(x0, y0) > 0, and let A and B be open sets containing x0 and y0 respectively that are sufficiently small so that K(x, y) ≥ ε > 0 on A × B. Letting ρ(C) = |B ∩ C|/|B|, where |B| is the Lebesgue measure of B, we have that (2) in the above definition holds. If (1) holds, then {Xn} is a Harris chain.
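Here is a numerical sketch of this construction, assuming an AR(1) chain Xn+1 = aXn + Zn with standard Gaussian noise (so the kernel density K(x, y) is continuous); the coefficient a, the point (x0, y0), and the box half-width delta are illustrative choices rather than anything fixed by the text.

    import numpy as np

    a = 0.5  # assumed autoregression coefficient

    def kernel_density(x, y):
        # K(x, y) = density of N(a * x, 1) at y, i.e. of X_{n+1} given X_n = x.
        return np.exp(-0.5 * (y - a * x) ** 2) / np.sqrt(2.0 * np.pi)

    x0, y0 = 0.0, 0.0
    delta = 0.1
    A = (x0 - delta, x0 + delta)   # small open interval around x0
    B = (y0 - delta, y0 + delta)   # small open interval around y0

    # Lower-bound K(x, y) on A x B over a grid, as a numerical stand-in
    # for the continuity argument in the text.
    xs = np.linspace(A[0], A[1], 50)
    ys = np.linspace(B[0], B[1], 50)
    eps = min(kernel_density(x, y) for x in xs for y in ys)

    # With rho(C) = |B intersect C| / |B|, condition (2) holds in the form
    # K(x, C) >= (eps * |B|) * rho(C) for x in A and C contained in B.
    print(f"eps = {eps:.4f}, minorization constant eps * |B| = {eps * 2 * delta:.4f}")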

Reducibility and periodicity

In the following, R := inf {n ≥ 1 : Xn ∈ A}; i.e., R is the first time after time 0 that the process enters region A.

Definition: If for all L(X0), P(R < ∞ | X0 ∈ A) = 1, then the Harris chain is called recurrent.

Definition: A recurrent Harris chain Xn is aperiodic if ∃N such that ∀n ≥ N, ∀L(X0), P(Xn ∈ A | X0 ∈ A) > 0.

Theorem: Let Xn be an aperiodic recurrent Harris chain with stationary distribution π. If P(R < ∞ | X0 = x) = 1, then as n → ∞, distTV(L(Xn | X0 = x), π) → 0.
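For a finite, irreducible, aperiodic chain the theorem can be observed directly. The sketch below, again with an arbitrary 3-state matrix assumed for illustration, computes π as the normalized left eigenvector of the transition matrix for eigenvalue 1 and tracks the total variation distance of the n-step law from π.

    import numpy as np

    K = np.array([[0.1, 0.6, 0.3],
                  [0.5, 0.0, 0.5],
                  [0.2, 0.4, 0.4]])

    # Stationary distribution pi: normalized left eigenvector for eigenvalue 1.
    evals, evecs = np.linalg.eig(K.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    pi = pi / pi.sum()

    law_n = np.eye(K.shape[0])[0]            # L(X_n | X_0 = 0), started at state 0
    for n in range(1, 21):
        law_n = law_n @ K                    # one-step update of the law of X_n
        tv = 0.5 * np.abs(law_n - pi).sum()  # total variation distance to pi
        if n % 5 == 0:
            print(f"n = {n:2d}, dist_TV = {tv:.6f}")

The printed distances shrink toward zero, consistent with the convergence statement; this finite chain fits the definition above via the singleton construction of Example 1.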

References

  1. Asmussen, Søren (2003). "Further Topics in Renewal Theory and Regenerative Processes". Applied Probability and Queues. Stochastic Modelling and Applied Probability. 51. pp. 186–219. doi:10.1007/0-387-21525-5_7. ISBN 978-0-387-00211-8.
  2. Durrett, R. (2005). Probability: Theory and Examples. Thomson. ISBN 0-534-42441-4.