Markov Chain

A Markov chain is a stochastic process $x_1, x_2, \ldots$ satisfying

$\displaystyle{p(x_i \mid x_{i-1},\ldots,x_1) = T(x_i, x_{i-1}).}$

The conditional probability of being in state $x_i$ depends only on the previous state $x_{i-1}$ and is defined by the transition matrix $T$. This is called the Markov property. If the transition matrix $T$ is the same at every step, the chain is called homogeneous. See also http://en.wikipedia.org/wiki/Markov_chain.
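A homogeneous chain can be simulated by repeatedly sampling the next state from the row of $T$ indexed by the current state. The sketch below uses NumPy; the two-state transition matrix and its values are illustrative assumptions, not from the text.

```python
import numpy as np

# Illustrative transition matrix (assumed values): T[i, j] is the
# probability of moving from state i to state j; each row sums to 1.
T = np.array([
    [0.9, 0.1],   # transitions out of state 0
    [0.5, 0.5],   # transitions out of state 1
])

def simulate(T, x0, n_steps, rng):
    """Draw a trajectory where each step depends only on the previous state."""
    states = [x0]
    for _ in range(n_steps):
        # Markov property: the next state is sampled using only the
        # current state, via the corresponding row of T.
        states.append(rng.choice(len(T), p=T[states[-1]]))
    return states

rng = np.random.default_rng(0)
chain = simulate(T, x0=0, n_steps=1000, rng=rng)
```

Because the same matrix `T` is used at every step, the chain is homogeneous; a non-homogeneous chain would use a different transition matrix at each step.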