# Markov model



## Markov model

A model used in decision analysis for evaluating potential outcomes of a disease process. Outcomes are defined as specific health states, and transitions among those states are modelled iteratively. In standard decision tree analysis, a patient moves through states in one direction—for example, from not treated, to treated, to final outcome. In a Markov process, a patient can move back and forth between states (e.g., between continuous ambulatory peritoneal dialysis and haemodialysis). Some states, such as death, cannot be left once entered; these are called "absorbing states".

## Markov chain, Markov model

A mathematical model for studying complex systems. The system is assigned a state, and then a transition is made to a new state; each transition depends only on the current state, not on the system's previous history up to that point (the "Markov property").
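The two definitions above can be combined in a small sketch: a cohort model with three health states, where each cycle's distribution depends only on the current one, and "Death" is an absorbing state. The states and transition probabilities here are hypothetical, chosen only for illustration, not drawn from any study.

```python
# Hypothetical states and transition probabilities, for illustration only.
STATES = ["CAPD", "HD", "Death"]

# P[i][j] = probability of moving from state i to state j in one cycle.
P = [
    [0.70, 0.20, 0.10],  # CAPD: stay, switch to haemodialysis, or die
    [0.25, 0.65, 0.10],  # HD: switch back to CAPD, stay, or die
    [0.00, 0.00, 1.00],  # Death: absorbing state -- cannot be left
]

def step(dist, P):
    """One Markov cycle: the new distribution depends only on the
    current distribution (the Markov property), not on earlier cycles."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

# Start the whole cohort on CAPD and iterate ten cycles.
dist = [1.0, 0.0, 0.0]
for _ in range(10):
    dist = step(dist, P)

print({s: round(p, 3) for s, p in zip(STATES, dist)})
```

Note how patients cycle back and forth between CAPD and HD, which a one-way decision tree cannot represent, while the absorbing "Death" state accumulates probability with every cycle.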

