Definitions from Wiktionary (stationary distribution)
▸ noun: (mathematics, stochastic processes, of a Markov chain) a row vector π, with entries summing to 1, that satisfies the equation πP = π, where P is the transition matrix of the Markov chain.
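The defining equation πP = π (together with the normalization Σπᵢ = 1) can be solved directly as a linear system. A minimal sketch, using an illustrative two-state transition matrix chosen here as an example:

```python
import numpy as np

# Example row-stochastic transition matrix P (each row sums to 1).
P = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
])

n = P.shape[0]
# pi P = pi  <=>  (P^T - I) pi^T = 0.
# Stack the normalization constraint sum(pi) = 1 as an extra equation
# and solve the overdetermined system by least squares.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)                       # stationary distribution, here [5/6, 1/6]
print(np.allclose(pi @ P, pi))  # verifies pi P = pi
```

For this P, the chain leaves state 0 with probability 0.1 and returns with probability 0.5, so the stationary vector weights state 0 five times as heavily: π = [5/6, 1/6].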
Similar:
right stochastic matrix,
Markov chain,
transition matrix,
stationarity,
stochastic matrix,
stochastic process,
Markov process,
Markov partition,
Markov jump process,
ensemble,