Definition:
A Markov Chain is a Random Process, let's call it $X$, with the following property:

$$P(X_{n+1}=x_{n+1} \mid X_n=x_n,\, X_{n-1}=x_{n-1},\, \dots) = P(X_{n+1}=x_{n+1} \mid X_n=x_n)$$
In other words, conditioned on $X_n$, the value of $X_{n+1}$ is independent of all $X_t$ with $t \le n-1$.
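As a concrete illustration (a hypothetical two-state weather example, not from the original text), suppose the states are sun and rain. The property says that once today's weather is known, yesterday's weather adds nothing:

$$P(X_{n+1}=\text{rain} \mid X_n=\text{rain},\, X_{n-1}=\text{sun},\, \dots) = P(X_{n+1}=\text{rain} \mid X_n=\text{rain})$$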
A Markov Chain can be defined by three characteristics (all three are put to use in the simulation sketch after this list):
State Space - e.g. $S=\{0,1,\dots,M-1\}$ with $M$ possible states
Initial State Distribution - $p^{(0)}$, an $M$-dimensional row vector whose entries sum to 1
Transition Probability - $P$, a square $M \times M$ matrix in which each row sums to 1
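Here is a minimal simulation sketch built from those three ingredients. The 3-state chain and its probabilities below are made-up illustrative values, not from the original text:

```python
import numpy as np

M = 3
S = np.arange(M)                   # state space S = {0, 1, ..., M-1}
p0 = np.array([1.0, 0.0, 0.0])     # initial state distribution, sums to 1
P = np.array([[0.5, 0.3, 0.2],     # transition matrix: row i holds
              [0.1, 0.6, 0.3],     # P(X_{n+1} = j | X_n = i),
              [0.2, 0.2, 0.6]])    # so each row sums to 1

rng = np.random.default_rng(0)

def simulate(n_steps: int) -> list:
    """Draw X_0 from p0, then repeatedly step using the current state's row of P."""
    x = rng.choice(S, p=p0)
    path = [int(x)]
    for _ in range(n_steps):
        # The Markov property in action: the next state is sampled from a
        # distribution that depends only on the current state x.
        x = rng.choice(S, p=P[x])
        path.append(int(x))
    return path

print(simulate(10))   # one sample path, e.g. [0, 0, 1, 1, 2, ...]
```

Note that the only state the loop carries forward is `x`: the full history is recorded in `path` but never consulted, which is exactly what the definition demands.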
Sums of I.I.D. values (e.g. a random walk) are examples of Markov Chains: the next value is just the current value plus an independent increment, so earlier history adds no information.
A moving average of I.I.D. values, by contrast, is not a Markov Chain: predicting the next value requires the individual recent terms, which cannot be recovered from the current average alone.
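A quick empirical check of the moving-average claim (a sketch under assumed specifics: I.I.D. $\pm 1$ values and a window of 2, chosen for illustration). If $Y$ were a Markov Chain, conditioning on $Y_{n-1}$ in addition to $Y_n$ could not change the distribution of $Y_{n+1}$; here it does:

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.choice([-1, 1], size=200_000)   # I.I.D. +/-1 values
y = (z[1:] + z[:-1]) / 2                # window-2 moving average Y_n

n = np.arange(1, len(y) - 1)
cur_zero = y[n] == 0                    # condition: Y_n = 0
prev_one = y[n - 1] == 1                # extra history: Y_{n-1} = 1

p_given_cur = np.mean(y[n + 1][cur_zero] == 1)
p_given_both = np.mean(y[n + 1][cur_zero & prev_one] == 1)

print(p_given_cur)    # about 0.25: P(Y_{n+1} = 1 | Y_n = 0)
print(p_given_both)   # exactly 0.0: P(Y_{n+1} = 1 | Y_n = 0, Y_{n-1} = 1)
```

Intuitively, knowing $Y_{n-1}=1$ pins down the hidden terms ($z_{n-1}=1$, hence $z_n=-1$ when $Y_n=0$), forcing $Y_{n+1}\le 0$; the extra history genuinely matters, so the Markov property fails.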