In dictionaries:
Markov chain
A stochastic model describing a sequence of states in which the probability of each transition depends only on the current state (see the simulation sketch after this list).
Markov process
A stochastic process with the memoryless (Markov) property.
Markov model
A statistical model of a system whose state transitions satisfy the Markov property.
Markov property
(probability theory) The memoryless property of Markov models, according to which future states depend only on the current state and not on those that preceded it; written as an equation after this list.
Markov analysis
A statistical technique that uses state transition probabilities to predict the future behavior of a system.
Markov matrix
A square matrix with nonnegative entries in which each row sums to 1, used to hold the transition probabilities of a Markov chain; also called a stochastic matrix or transition matrix (used in the simulation sketch after this list).
Markov sequence
A sequence of random variables satisfying the Markov property.
Markov algorithm
In theoretical computer science, a Markov algorithm is a string rewriting system that uses grammar-like rules to operate on strings of symbols (a minimal sketch follows this list).
Markov partition
(mathematics) A tool used in dynamical systems theory, allowing the methods of symbolic dynamics to be applied to the study of hyperbolic systems. By using a Markov partition, the system can be made to resemble a discrete-time Markov process, with the long-term dynamical characteristics of the system represented as a Markov shift.
Markov inequality
(probability theory) The bound stating that for a nonnegative random variable X and any a > 0, P(X ≥ a) ≤ E[X]/a (stated with a worked example after this list).
Andrei Markov
Russian mathematician (1856–1922) after whom Markov chains, Markov processes, and the Markov property are named.
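The Markov property above can be stated as a single conditional-probability identity: for a discrete-time process X_0, X_1, X_2, ... and any n ≥ 0,

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).

In words, conditioning on the whole history gives the same prediction as conditioning on the current state alone.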
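As a concrete companion to the Markov chain and Markov matrix entries, here is a minimal simulation sketch in Python. The two weather states and their transition probabilities are invented for illustration; the only structural requirement is that each row of the transition matrix sums to 1.

# Minimal sketch of a two-state Markov chain driven by a Markov (stochastic)
# matrix. The states and probabilities below are invented for illustration.
import random

states = ["sunny", "rainy"]
# transition[i][j] = probability of moving from states[i] to states[j];
# every row sums to 1, which is what makes this a Markov matrix.
transition = [
    [0.8, 0.2],  # from "sunny"
    [0.4, 0.6],  # from "rainy"
]

def simulate(start, steps):
    """Walk the chain: each next state depends only on the current one."""
    current = states.index(start)
    path = [start]
    for _ in range(steps):
        current = random.choices(range(len(states)), weights=transition[current])[0]
        path.append(states[current])
    return path

print(simulate("sunny", 10))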
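The Markov inequality entry reduces to one line: for any nonnegative random variable X and any a > 0,

P(X \ge a) \le \frac{E[X]}{a}.

For example, if X is nonnegative with mean E[X] = 2, then P(X ≥ 10) ≤ 2/10 = 0.2, whatever the underlying distribution.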
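The Markov algorithm entry can likewise be made concrete with a short sketch in Python. It follows the standard scheme (apply the first rule whose pattern occurs, at its leftmost occurrence, then restart the scan; stop on a terminating rule or when no rule matches); the single toy rule below is invented for the example.

# Minimal sketch of a Markov algorithm: an ordered list of rewriting rules
# applied repeatedly to a string until a terminating rule fires or no rule matches.
def run_markov_algorithm(rules, text, max_steps=1000):
    """rules is an ordered list of (pattern, replacement, is_terminating) triples."""
    for _ in range(max_steps):
        for pattern, replacement, terminating in rules:
            if pattern in text:
                # Rewrite only the leftmost occurrence of the pattern.
                text = text.replace(pattern, replacement, 1)
                if terminating:
                    return text
                break  # restart the scan from the first rule
        else:
            return text  # no rule applies: the algorithm halts
    return text

# Toy rule set: "ab" -> "ba" moves every 'b' in front of every 'a'.
rules = [("ab", "ba", False)]
print(run_markov_algorithm(rules, "aabab"))  # prints "bbaaa"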