Markov Chain
A Markov chain is a stochastic process $\{X_n\}_{n \ge 0}$ satisfying

$$P(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i_n).$$
The conditional probability of being in state $j$ after the states $i_0, \ldots, i_n$ depends only on the current state $i_n$ and is given by the transition matrix $P = (p_{ij})$ with entries $p_{ij} = P(X_{n+1} = j \mid X_n = i)$. This property is called the Markov property. If the transition probabilities $p_{ij}$ do not depend on $n$,
the chain is called homogeneous. See also http://en.wikipedia.org/wiki/Markov_chain.
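As a concrete illustration (not part of the original entry), a homogeneous chain can be simulated by repeatedly sampling the next state from the row of the transition matrix for the current state. The matrix and state labels below are assumptions chosen for the example:

```python
import random

def simulate_chain(P, start, steps, seed=0):
    """Simulate a homogeneous Markov chain.

    P     -- row-stochastic transition matrix: P[i][j] = Pr(next = j | current = i)
    start -- index of the initial state
    steps -- number of transitions to take
    """
    rng = random.Random(seed)
    path = [start]
    state = start
    for _ in range(steps):
        # The next state depends only on the current state (Markov property);
        # because P is fixed, the chain is homogeneous.
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

# Illustrative 2-state transition matrix (each row sums to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

path = simulate_chain(P, start=0, steps=10)
print(path)
```

Each row of $P$ is a probability distribution over the next state, so the simulation needs nothing beyond the current state and the matrix itself.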