Markov chain


Markov (Markoff), Andrei, Russian mathematician, 1856-1922.
Markov chain - a sequence of steps or events in which the probability of each step depends only on the immediately preceding one.
Markov chaining - a theory used in psychiatry.
Markov process - a process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
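
To make the Markov chain and Markov process entries above concrete, the following short Python sketch simulates a discrete-time Markov chain; the two-state weather model and its transition probabilities are invented for illustration.

import random

# Hypothetical two-state chain; P[current][next] gives transition probabilities.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Sample the next state using only the current state (the Markov property).
    r, cumulative = random.random(), 0.0
    for nxt, prob in P[state].items():
        cumulative += prob
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n_steps):
    # Knowledge of earlier history never enters the sampling in step().
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))

Because step() conditions only on the current state, additional knowledge of the past history does not change the distribution of the next state, exactly as the Markov process definition above requires.
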
References in periodicals archive
One open problem is to show that, in a discrete-time Markov chain with 'local' transitions, under suitable conditions, rapid mixing occurs essentially if and only if there is normal concentration of measure, both long-term and in equilibrium (with non-trivial bounds).
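
As a rough illustration of convergence to equilibrium (not of the open problem itself), the following NumPy sketch tracks the total variation distance between the t-step distribution of a small, hypothetical chain and its stationary distribution; rapid mixing corresponds to this distance shrinking quickly.

import numpy as np

# Hypothetical row-stochastic transition matrix of an irreducible, aperiodic chain.
P = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

dist = np.array([1.0, 0.0, 0.0])        # start concentrated on state 0
for t in range(1, 21):
    dist = dist @ P                     # advance one step
    tv = 0.5 * np.abs(dist - pi).sum()  # total variation distance to equilibrium
    print(f"t={t:2d}  TV distance = {tv:.4f}")
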
Fig. 3 shows the proposed discrete-time Markov chain model of the IEEE 802.15.6 MAC protocol under the unsaturated condition.
It is easy to model the process {i(t), s(t), b(t)} with a discrete-time Markov chain under the assumption that p_i (the collision probability) and d_n (the channel-busy probability during its backoff stage) are independent.
In [2], based on the assumption that the collision probability is independent of the transmission history, a two-dimensional discrete-time Markov chain model represented as a stochastic process (s(t), b(t)) is defined, where s(t) is the backoff stage i (0, 1, ..., m) at time t and b(t) is the backoff counter value k (0, 1, ..., W_i - 1) at time t.
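
The two-dimensional state (s(t), b(t)) described in the excerpts above can be sketched as a simple simulation; the parameters below (maximum stage m, minimum window W0, and a fixed, history-independent collision probability p) are assumed for illustration and are not taken from reference [2] or from the IEEE standards.

import random

m, W0, p = 5, 16, 0.2   # assumed: max backoff stage, minimum contention window, collision probability

def window(stage):
    # Binary exponential backoff: the contention window doubles at each stage.
    return W0 * (2 ** stage)

def backoff_trace(num_packets):
    # Record the (s, b) pairs visited: backoff stage and backoff counter.
    trace = []
    for _ in range(num_packets):
        s = 0
        b = random.randrange(window(s))
        while True:
            while b > 0:             # count the backoff counter down, one slot per step
                trace.append((s, b))
                b -= 1
            trace.append((s, 0))     # at b == 0 the station attempts transmission
            if random.random() >= p or s == m:
                break                # success, or retry limit reached (packet dropped)
            s += 1                   # collision: move to the next backoff stage
            b = random.randrange(window(s))
    return trace

print(backoff_trace(2)[:10])

Because p is held constant here, the next state depends only on the current (s, b) pair, which mirrors the independence assumption the excerpt describes.
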
It supports analysis of several types of probabilistic models: discrete-time Markov chains (DTMCs), continuous-time Markov chains (CTMCs), Markov decision processes (MDPs), probabilistic automata, and probabilistic timed automata.
