Markov Chains
Definition:
A Markov Chain is a random process $\{X_n\}_{n \ge 0}$ with the following property:

$$\Pr(X_{n+1} = x \mid X_n, X_{n-1}, \ldots, X_0) = \Pr(X_{n+1} = x \mid X_n)$$

In other words, the value of $X_{n+1}$ is conditionally independent, given $X_n$, of all $X_k$ such that $k < n$.
A Markov Chain can be defined by three characteristics (a minimal simulation sketch follows this list):
State Space - i.e. $\mathcal{S} = \{s_1, \ldots, s_M\}$ with $M$ possible states
Initial State Distribution - i.e. $\pi_0$, an $M$-dimensional row vector which sums to 1
Transition Probability - i.e. $P$, an $M \times M$ matrix where each row sums to 1, with entries $P_{ij} = \Pr(X_{n+1} = s_j \mid X_n = s_i)$
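To make these three ingredients concrete, here is a minimal simulation sketch in Python; the two-state chain, its probabilities, and the use of numpy are illustrative assumptions:

```python
import numpy as np

# Illustrative two-state chain; the states and probabilities are assumptions for this sketch
states = ["sunny", "rainy"]            # state space S with M = 2 states
pi0 = np.array([0.8, 0.2])             # initial state distribution (row vector, sums to 1)
P = np.array([[0.9, 0.1],              # transition matrix (each row sums to 1)
              [0.5, 0.5]])

rng = np.random.default_rng(0)

def simulate(n_steps):
    """Sample a path X_0, ..., X_{n_steps} from the chain."""
    x = rng.choice(len(states), p=pi0)          # draw X_0 from pi0
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(len(states), p=P[x])     # X_{n+1} depends only on X_n
        path.append(x)
    return [states[i] for i in path]

print(simulate(10))
```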
Summed I.I.D. processes (e.g. a random walk) are examples of Markov Chains.
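For instance, a simple random walk updates as $X_{n+1} = X_n + Z_{n+1}$ with I.I.D. steps $Z_i$, so the next position depends only on the current one. A minimal sketch, assuming $\pm 1$ steps:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random walk X_{n+1} = X_n + Z_{n+1} with i.i.d. +/-1 steps (step distribution assumed)
def random_walk(n_steps):
    steps = rng.choice([-1, 1], size=n_steps)       # i.i.d. increments Z_1, ..., Z_n
    return np.concatenate(([0], np.cumsum(steps)))  # partial sums, starting at X_0 = 0

print(random_walk(10))
```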

A moving average of I.I.D. values is not a Markov Chain: its next value depends on a window of past values, not only on the current one, as the sketch below illustrates.
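A quick empirical check, assuming the window-2 moving average $Y_n = (Z_n + Z_{n-1})/2$ of I.I.D. $\pm 1$ values (this setup is an assumption for illustration): conditioning on the same observed $Y_n = 0$ but different hidden $Z_n$ gives different distributions for $Y_{n+1}$, which a Markov process would not allow.

```python
import numpy as np

rng = np.random.default_rng(2)

# Moving average Y_n = (Z_n + Z_{n-1}) / 2 of i.i.d. +/-1 values (assumed setup)
Z = rng.choice([-1, 1], size=200_000)
Y = (Z[1:] + Z[:-1]) / 2

# Among times with Y_n = 0, split by the hidden Z_n and compare P(Y_{n+1} = 1):
# if Y were Markov, conditioning on Y_n = 0 alone would fix this probability.
mask = Y[:-1] == 0
for z in (-1, 1):
    sel = mask & (Z[1:-1] == z)
    prob = np.mean(Y[1:][sel] == 1)
    print(f"P(Y_next = 1 | Y = 0, Z_n = {z:+d}) ~= {prob:.3f}")
```

With this setup the two conditional probabilities come out near 0 and 0.5, so the hidden history matters beyond the current value of $Y_n$.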
