Markov chain

(redirected from Markov analysis)

Markov (Markoff), Andrei, Russian mathematician, 1865-1922.
Markov chain - a sequence of steps or events in which the probability of each step depends only on the immediately preceding one.
Markov chaining - a theory used in psychiatry.
Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
Medical Eponyms © Farlex 2012
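
Stated formally (a standard textbook formulation, not part of the Farlex entry above), the Markov property for a discrete-time process reads

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n),

that is, conditioning on the entire past history adds nothing beyond conditioning on the present state.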
References in periodicals archive
Specifically, we wanted to know if the Markov analysis method could detect measurable and generalizable differences between effective and ineffective students' use of basic counseling skills by examining the processes associated with their respective counseling sessions.
Wampold (1986) previously argued that the Markov analysis approach is only one of several sequential analyses that can be used.
Markov analysis is named after the Russian mathematician to whom its development in 1907 is attributed.
Markov analysis is one type of discrete time stochastic process, a sequence of random events for which the probability of each event is determined by the nature of the preceding event.
The nature of the problem seems ideally suited to the use of Markov analysis as it clearly involves probabilistic transitions from a set of known initial states.
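To make the two preceding excerpts concrete: a Markov analysis of this kind rests on a matrix of transition probabilities between discrete states and a known initial state. The short Python sketch below illustrates that idea; the counseling-style state labels and the probability values are assumptions made up for illustration, not data from the cited studies.

import random

# Illustrative states and transition probabilities (assumed for this sketch).
# Each row gives the probability of moving from that state to each state in
# `states`, and each row sums to 1.
states = ["question", "reflection", "advice"]
transition = {
    "question":   [0.2, 0.6, 0.2],
    "reflection": [0.3, 0.5, 0.2],
    "advice":     [0.5, 0.3, 0.2],
}

def simulate(start, steps, seed=0):
    """Generate a state sequence in which each event depends only on the
    immediately preceding one (the Markov property)."""
    rng = random.Random(seed)
    sequence = [start]
    for _ in range(steps):
        current = sequence[-1]
        sequence.append(rng.choices(states, weights=transition[current])[0])
    return sequence

print(simulate("question", steps=10))

Estimating such a transition matrix from observed data (for example, coded counseling-session transcripts) amounts to counting how often each state follows each other state and converting each row of counts to proportions.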
Among their topics are initial considerations for reliability design, discrete and continuous random variables, modeling and reliability basics, the Markov analysis of repairable and non-repairable systems, Six Sigma tools for predictive engineering, a case study of updating reliability estimates, and complex high availability system analysis.
Coverage includes the basics of management science, foundational models, linear programming, duality, sensitivity analysis, computer solutions of linear programming, integer and zero-one programming, goal programming, transportation, network models, and nonlinear programming, along with such techniques as PERT and CPM project planning, decision and game theory, the analytical process, inventory models, queuing models, dynamic programming and simulation, forecasting, Markov analysis, and relations to information systems.