Markov Chain

A Markov chain is a stochastic process $x_i$ satisfying

$$p(x_i \mid x_{i-1}, \dots, x_1) = T(x_i, x_{i-1}).$$

The conditional probability of being in state $x_i$ depends only on the previous state $x_{i-1}$ and is given by the transition matrix $T$. This is called the Markov property. If $T$ does not depend on the step $i$, the chain is called homogeneous. See also http://en.wikipedia.org/wiki/Markov_chain.
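The definition above can be sketched in code. The following is a minimal illustration, assuming a hypothetical two-state homogeneous chain with a made-up transition matrix; the state labels and probabilities are not from the original text.

```python
import numpy as np

# Hypothetical two-state homogeneous Markov chain.
# Convention: T[i, j] = probability of moving from state j to state i,
# so each column of T sums to 1, matching p(x_i | x_{i-1}) = T(x_i, x_{i-1}).
T = np.array([[0.9, 0.5],
              [0.1, 0.5]])

def sample_chain(T, x0, n, rng):
    """Draw n steps of a Markov chain starting from state x0."""
    states = [x0]
    for _ in range(n):
        # The next state depends only on the current one (Markov property).
        states.append(int(rng.choice(len(T), p=T[:, states[-1]])))
    return states

rng = np.random.default_rng(0)
chain = sample_chain(T, x0=0, n=10, rng=rng)
print(chain)
```

Because the same matrix $T$ is used at every step, the sampled chain is homogeneous; a time-inhomogeneous chain would use a different matrix at each step.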