Markov (Markoff), Andrei, Russian mathematician, 1865-1922.
Markov chain - a sequence of steps or events in which the probability of each step depends only on the step immediately preceding it.
Markov chaining - a theory used in psychiatry.
Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
Markov chain, Markov model
a mathematical model that makes it possible to study complex systems by establishing a state of the system and then effecting a transition to a new state, a transition that depends only on the values of the current state and not on the previous history of the system up to that point.
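The transition behavior described above can be sketched in a few lines of code. This is a minimal illustration, not part of the dictionary entry: the two states and their transition probabilities are invented for the example. The key point is that `step` looks only at the current state, never at the history of the chain.

```python
import random

# Illustrative two-state transition table (the states and probabilities
# are assumptions for this sketch, not taken from the definitions above).
# transitions[state] maps each possible next state to its probability.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random.random):
    """Draw the next state using only the current state's distribution."""
    r = rng()
    cumulative = 0.0
    for next_state, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point round-off

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain
```

Because each call to `step` consults only the current state, the simulated sequence has exactly the property the definitions describe: knowledge of earlier states adds nothing to the prediction of the next one.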