Markov Chains

Definition:

A Markov Chain is a random process, call it $X$, with the following property: $P(X_{n+1}=x_{n+1} \mid X_{n}=x_{n}, X_{n-1}=x_{n-1}, \dots) = P(X_{n+1}=x_{n+1} \mid X_{n}=x_{n})$. In other words, given $X_{n}$, the value of $X_{n+1}$ is conditionally independent of all $X_{t}$ with $t \le n-1$.

A Markov Chain can be defined by three characteristics:

  • State Space - e.g. $S=\{0,1,\dots,M-1\}$ with $M$ possible states

  • Initial State Distribution - e.g. $\textbf{p}(0)$, an $M$-dimensional row vector which sums to 1

  • Transition Probability - e.g. $\textbf{P}$, a square $M \times M$ matrix where each row sums to 1
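
To make the three characteristics concrete, here is a minimal simulation sketch in Python. The state space, $\textbf{p}(0)$, and $\textbf{P}$ below are illustrative values, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

M = 3                                    # number of states, S = {0, 1, 2}
p0 = np.array([1.0, 0.0, 0.0])           # initial state distribution (sums to 1)
P = np.array([[0.5, 0.5, 0.0],           # transition matrix (each row sums to 1)
              [0.1, 0.8, 0.1],
              [0.0, 0.5, 0.5]])

def simulate(p0, P, n_steps):
    """Draw X_0 from p0, then X_{t+1} from row X_t of P."""
    x = rng.choice(len(p0), p=p0)
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(len(p0), p=P[x])  # next state depends only on the current state
        path.append(x)
    return path

print(simulate(p0, P, 10))
```

The Markov property is visible in the sampling step: the distribution of the next state is read off row `P[x]`, a function of the current state alone.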

Sums of I.I.D. processes (e.g. the random walk) are examples of Markov Chains.
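
A quick sketch of why: a random walk satisfies $X_{n+1} = X_n + Z_{n+1}$ with I.I.D. steps $Z_i$, so the next value is a function of the current value and fresh noise only. The step distribution below is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random walk: X_{n+1} = X_n + Z_{n+1}, with I.I.D. steps Z_i in {-1, +1}.
steps = rng.choice([-1, 1], size=20)
walk = np.concatenate([[0], np.cumsum(steps)])
print(walk)
```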

A moving average of I.I.D. values is not a Markov Chain: consecutive averages share terms, so past values of the process carry information about $X_{n+1}$ beyond what $X_{n}$ alone provides.
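
A hedged empirical check of this claim (the setup is illustrative): for fair coin flips $Z_i \in \{0,1\}$, let $X_n = (Z_n + Z_{n-1})/2$. If $X$ were Markov, conditioning on $X_{n-1}$ in addition to $X_n$ would not change the transition probabilities; the estimated frequencies below show that it does:

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.integers(0, 2, size=200_000)   # I.I.D. fair coin flips
x = (z[1:] + z[:-1]) / 2               # 2-point moving average

given_half = (x[1:-1] == 0.5)          # condition: X_n = 0.5
hit = (x[2:] == 1.0)                   # event:     X_{n+1} = 1
print("P(X_{n+1}=1 | X_n=0.5)           ~", hit[given_half].mean())

also_prev_one = given_half & (x[:-2] == 1.0)   # extra condition: X_{n-1} = 1
print("P(X_{n+1}=1 | X_n=0.5, X_{n-1}=1) ~", hit[also_prev_one].mean())
# roughly 0.25 vs 0.0: the past adds information, so X is not a Markov Chain
```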
