Markov model


Markov model

A model used in decision analysis to evaluate the potential outcomes of a disease process, which are defined as specific health states; transitions among these states are modelled iteratively. In standard decision tree analysis, a patient moves through states in one direction—for example, from not treated, to treated, to final outcome; in a Markov process, a patient may move between states repeatedly (e.g., backwards and forwards between continuous ambulatory peritoneal dialysis and haemodialysis). Some states cannot be left once entered (so-called “absorbing states”), including death.
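The cycle-by-cycle structure described above can be sketched as a small simulation. The state names echo the dialysis example; the transition probabilities are purely illustrative assumptions, not clinical figures.

```python
import random

# Hypothetical transition probabilities (illustrative only, not clinical data).
# Each row gives the probability of moving to each state in the next cycle.
# "Death" is an absorbing state: once entered, it is never left.
TRANSITIONS = {
    "CAPD":          {"CAPD": 0.70, "Haemodialysis": 0.25, "Death": 0.05},
    "Haemodialysis": {"CAPD": 0.20, "Haemodialysis": 0.72, "Death": 0.08},
    "Death":         {"Death": 1.0},
}

def simulate(start, cycles, seed=0):
    """Run one patient trajectory for a fixed number of model cycles."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(cycles):
        options = TRANSITIONS[state]
        state = rng.choices(list(options), weights=list(options.values()))[0]
        path.append(state)
    return path

path = simulate("CAPD", cycles=20)
```

Each iteration chooses the next state using only the current state's row, which is exactly the Markov assumption; a decision tree, by contrast, would need a separate branch for every possible history.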

Markov chain, Markov model

A mathematical model that makes it possible to study complex systems by establishing the state of the system and then effecting a transition to a new state, such a transition depending only on the current state and not on the previous history of the system up to that point (the Markov property).
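Because each transition depends only on the current state, the distribution over states can be propagated one step at a time with a fixed transition matrix. A minimal sketch, using a hypothetical two-state chain:

```python
# Hypothetical 2-state transition matrix (rows sum to 1): P[i][j] is the
# probability of moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """Advance a probability distribution over states by one transition.

    Only the current distribution is needed; no earlier history enters
    the calculation (the Markov property).
    """
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]          # start with certainty in state 0
for _ in range(3):
    dist = step(dist, P)   # after 3 steps: [0.844, 0.156]
```

Iterating `step` is all a Markov chain computation requires; the entire past is summarised by the current distribution.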