Markov Chains
Definition:
A Markov chain is defined by three characteristics:
1. A state space S: the set of values the process can take.
2. An initial distribution: the distribution of the starting state X_0.
3. Transition probabilities P(X_{t+1} = j | X_t = i), which depend only on the current state i and not on the earlier history (the Markov property).
Sums of i.i.d. random variables (i.e., random walks) are examples of Markov chains: the next position depends only on the current position and the next independent step.
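A minimal sketch of this example (function names and parameters are illustrative, not from the original): a simple ±1 random walk, where each update uses only the current state and a fresh independent step.

```python
import random

def random_walk(n_steps, p_up=0.5, seed=0):
    """Simple random walk S_t = X_1 + ... + X_t with i.i.d. steps
    X_i = +1 (with probability p_up) or -1.

    The update reads only the current position s, which is exactly
    the Markov property: the past path beyond s is irrelevant.
    """
    rng = random.Random(seed)
    s, path = 0, [0]
    for _ in range(n_steps):
        step = 1 if rng.random() < p_up else -1
        s += step          # next state depends only on current state
        path.append(s)
    return path

path = random_walk(10)
```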

A moving average of i.i.d. values is not a Markov chain: the next value depends on a window of past noise terms, so knowing only the current value of the process is not enough to determine the distribution of the next one.
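This failure of the Markov property can be checked by exact enumeration (a sketch with assumed names; the moving average Y_t = (X_{t-1} + X_t)/2 of fair coin flips is my choice of concrete example, not from the original). Conditioning on the same present value Y_2 = 1/2 but different pasts Y_1 gives different distributions for Y_3:

```python
from itertools import product
from fractions import Fraction

def cond_prob_next_is_one(y_prev, y_curr):
    """P(Y_3 = 1 | Y_2 = y_curr, Y_1 = y_prev) for the moving average
    Y_t = (X_{t-1} + X_t)/2 of i.i.d. fair coin flips X_t in {0, 1},
    computed by enumerating all 16 equally likely flip sequences."""
    matches = hits = 0
    for xs in product([0, 1], repeat=4):
        y = [Fraction(xs[t - 1] + xs[t], 2) for t in range(1, 4)]
        if (y[0], y[1]) == (y_prev, y_curr):
            matches += 1
            hits += (y[2] == 1)
    return Fraction(hits, matches)

# Same present (Y_2 = 1/2), different pasts -> different futures,
# so (Y_t) violates the Markov property.
p0 = cond_prob_next_is_one(Fraction(0), Fraction(1, 2))  # past Y_1 = 0
p1 = cond_prob_next_is_one(Fraction(1), Fraction(1, 2))  # past Y_1 = 1
print(p0, p1)  # -> 1/2 0
```

If Y_1 = 0 the hidden flips force X_2 = 1, so Y_3 = 1 half the time; if Y_1 = 1 they force X_2 = 0, so Y_3 = 1 is impossible. The current value alone does not determine the future.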
