In dictionaries:
Discrete-time Markov chain
(mathematics, probability theory) A stochastic process, i.e. a sequence of random variables, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past.
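A minimal simulation sketch of this dependence structure (the two-state transition matrix P below is a hypothetical example, not from the source): each step samples the next state from the row of P indexed by the current state, ignoring the rest of the history.

```python
import numpy as np

# Hypothetical two-state transition matrix: row i is the distribution
# of the next state given that the current state is i.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

rng = np.random.default_rng(0)
state = 0
path = [state]
for _ in range(10):
    # The next state depends only on the current state, not on the past.
    state = rng.choice(2, p=P[state])
    path.append(state)
print(path)
```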
Markov chain Monte Carlo
In statistics, Markov chain Monte Carlo is a class of algorithms used to draw samples from a probability distribution by constructing a Markov chain whose stationary distribution is the target distribution.
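As one concrete illustration, a minimal Metropolis-Hastings sketch (one member of this class), sampling a standard normal target with a Gaussian random-walk proposal; the target, step size, and sample count are assumed values for illustration.

```python
import numpy as np

def log_target(x):
    # Log-density of N(0, 1), up to an additive constant.
    return -0.5 * x * x

rng = np.random.default_rng(0)
x = 0.0
samples = []
for _ in range(10_000):
    proposal = x + rng.normal(scale=1.0)  # random-walk proposal
    # Accept with probability min(1, target(proposal) / target(x)).
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)
print(np.mean(samples), np.var(samples))  # should be close to 0 and 1
```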
Continuous-time Markov chain
A continuous-time Markov chain is a continuous-time stochastic process in which the process remains in each state for an exponentially distributed holding time and then jumps to a different state according to the probabilities of a stochastic matrix.
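A minimal simulation sketch of that dynamic (the rates and jump matrix below are hypothetical illustration values): hold in the current state for an exponentially distributed time, then jump according to the corresponding row of the stochastic matrix.

```python
import numpy as np

rates = np.array([1.0, 2.0])      # exponential holding rate in each state
jump = np.array([[0.0, 1.0],      # row i: jump distribution out of state i
                 [1.0, 0.0]])

rng = np.random.default_rng(0)
state, t = 0, 0.0
for _ in range(5):
    t += rng.exponential(1.0 / rates[state])  # exponential holding time
    state = rng.choice(2, p=jump[state])      # jump to a different state
    print(f"t={t:.3f} -> state {state}")
```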
Absorbing Markov chain
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state, i.e. a state that, once entered, cannot be left.
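A minimal numeric sketch (the three-state chain below is a hypothetical example, with states 0 and 1 transient and state 2 absorbing): the fundamental matrix N = (I - Q)^-1, built from the transient-to-transient block Q, gives the expected number of steps until absorption from each transient state.

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],    # states 0 and 1 are transient;
              [0.2, 0.6, 0.2],    # state 2 is absorbing (P[2, 2] = 1)
              [0.0, 0.0, 1.0]])

Q = P[:2, :2]                     # transient-to-transient block
N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix
print(N.sum(axis=1))              # expected steps to absorption
```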
Markov chain mixing time
In probability theory, the mixing time of a Markov chain is the time until the Markov chain is "close" to its steady-state distribution, typically measured in total variation distance.
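A minimal sketch of how that convergence can be tracked, via total variation distance to the stationary distribution (the two-state matrix and its stationary vector below are a hypothetical example):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = np.array([5 / 6, 1 / 6])  # stationary distribution: pi @ P == pi

dist = np.array([1.0, 0.0])    # start deterministically in state 0
for t in range(1, 8):
    dist = dist @ P                     # distribution after t steps
    tv = 0.5 * np.abs(dist - pi).sum()  # total variation distance
    print(f"step {t}: TV distance {tv:.4f}")
```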
Markov chain central limit theorem
In the mathematical theory of random processes, the Markov chain central limit theorem has a conclusion somewhat similar in form to that of the classic central limit theorem (CLT) of probability theory, but the quantity playing the role of the variance in the classic CLT has a more complicated definition.
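For concreteness, one common form of the statement, assuming an ergodic chain (X_n) with stationary distribution \pi and a suitable square-integrable function g:

```latex
% With \mu = \mathbb{E}_\pi[g(X_0)]:
\[
  \sqrt{n}\left( \frac{1}{n}\sum_{k=1}^{n} g(X_k) - \mu \right)
  \;\xrightarrow{d}\; \mathcal{N}\!\left(0, \sigma^2\right),
\]
% where the asymptotic variance (the "more complicated" quantity)
% also carries the autocovariances along the chain:
\[
  \sigma^2 = \operatorname{Var}_\pi\!\bigl(g(X_0)\bigr)
           + 2\sum_{k=1}^{\infty} \operatorname{Cov}_\pi\!\bigl(g(X_0),\, g(X_k)\bigr).
\]
```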