Markov chain



Markov (Markoff), Andrei, Russian mathematician, 1856-1922.
Markov chain - a sequence of steps or events in which the probability of each step depends only on the immediately preceding one.
Markov chaining - a theory used in psychiatry.
Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
Medical Eponyms © Farlex 2012
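The defining property of a Markov process above - that the future depends only on the present state, not on the past history - can be sketched in a short simulation. The two-state weather chain and its transition probabilities below are illustrative inventions, not taken from the entry.

```python
import random

# Illustrative one-step transition probabilities: the next state is sampled
# using only the current state, so earlier history is irrelevant.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state from the conditional distribution given `state`."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a path of n steps; each step looks only at the last state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `step` never receives the earlier path, which is exactly the Markov property: conditioning on anything beyond the present state would not change the distribution of the next step.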
References in periodicals archive
To detect and classify anomalous activities by insider threats, a Markov chain model has been applied in this paper.
where t represents the order of the Markov chain. Here, the transition probability matrix for the various existing drought classes at the previous time step i is represented as
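A transition probability matrix like the one referenced above is typically estimated from an observed sequence by counting transitions between classes and normalizing each row. The sketch below uses invented drought-class labels and an invented sequence purely for illustration.

```python
from collections import Counter

def transition_matrix(sequence, classes):
    """Estimate a first-order (t = 1) transition matrix by counting
    consecutive pairs and dividing each row by its total."""
    counts = Counter(zip(sequence, sequence[1:]))
    matrix = {}
    for i in classes:
        row_total = sum(counts[(i, j)] for j in classes)
        matrix[i] = {
            j: counts[(i, j)] / row_total if row_total else 0.0
            for j in classes
        }
    return matrix

# Hypothetical drought classes and an observed class sequence.
classes = ["normal", "moderate", "severe"]
seq = ["normal", "normal", "moderate", "severe", "moderate", "normal"]
P = transition_matrix(seq, classes)
# Each row of P with any observed transitions sums to 1.
```

For a higher-order chain (t > 1) the same idea applies, except the counted "state" becomes a tuple of the t previous classes.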
The deterministic description of the 5-state Markov chain model for exocytosis is given by the following ODE system:
Through a novel mapping, Markovian switching topologies are governed by a set of Markov chains assigned to the edges of the graph.
the probability that the Markov chain moves from state i to state j in h steps, and the matrix
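The h-step transition probability mentioned above is a standard construction: it is the (i, j) entry of the h-th power of the one-step transition matrix. A minimal sketch, with an illustrative 2x2 matrix:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, h):
    """Raise the one-step transition matrix P to the h-th power."""
    n = len(P)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(h):
        result = mat_mul(result, P)
    return result

# Illustrative one-step transition matrix for a two-state chain.
P = [[0.9, 0.1],
     [0.5, 0.5]]
P3 = mat_pow(P, 3)
# P3[i][j] is the probability of reaching state j from state i in 3 steps.
```

Since each power of a stochastic matrix is again stochastic, every row of `P3` still sums to 1.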
If Δr = 1, the state space of the Markov chain is the set {1, 2, ..., N}.
In particular, we consider a system with K = 6 queues, each having capacity C = 10, which results in a Markov chain with M = 1,771,561 states.
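The state count in that snippet follows from a simple product: each queue of capacity C can hold 0 through C jobs, i.e. C + 1 occupancy levels, and the joint state of K independent queue occupancies is a K-tuple of levels.

```python
# K queues, each with capacity C, give (C + 1)**K joint occupancy states.
K, C = 6, 10
states = (C + 1) ** K
print(states)  # 11**6 = 1771561
```

This exponential growth in K is exactly why such queueing models quickly produce Markov chains with millions of states.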
There are at least two reasons for selecting an n-th-order Markov chain model.
The Markov chain for parameter k_6 also does not reach its stationary distribution (Figure 8).
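Whether a chain has reached its stationary distribution can be probed, in the simplest setting, by iterating the distribution update pi_{t+1} = pi_t P and watching the change between successive iterates shrink toward zero. The matrix and iteration count below are illustrative assumptions, not values from the cited study.

```python
# Illustrative one-step transition matrix for a two-state chain.
P = [[0.7, 0.3],
     [0.2, 0.8]]

def next_dist(pi, P):
    """One distribution update: pi_{t+1}[j] = sum_i pi_t[i] * P[i][j]."""
    return [sum(pi[i] * P[i][j] for i in range(len(pi)))
            for j in range(len(P[0]))]

pi = [1.0, 0.0]          # start with all mass in state 0
for _ in range(200):
    new_pi = next_dist(pi, P)
    # Total-variation change between successive iterates.
    change = 0.5 * sum(abs(a - b) for a, b in zip(new_pi, pi))
    pi = new_pi

print(pi, change)  # pi approaches the stationary distribution [0.4, 0.6]
```

For this matrix the stationary distribution solves pi = pi P, giving pi = (0.4, 0.6); a chain that, like the one in the snippet, has not converged would still show a non-negligible `change` after the allotted iterations.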
The application developed from the mathematical model based on Markov chain theory reduces the distance travelled by the transfer-transport system and its operating time during the loading and unloading of parts.
It normalizes each step-length autocorrelation coefficient, so that w_k is regarded as the Markov chain weight for each lag time (step length), where m is the maximum order.