Definitions from Wiktionary (mutual information)
▸ noun: (information theory) A measure of the entropic (informational) correlation between two random variables.
Phrases:
Pointwise mutual information,
Conditional mutual information,
Adjusted mutual information,
Quantum mutual information
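The definition above can be made concrete for discrete variables, where mutual information is I(X;Y) = Σ p(x,y) log( p(x,y) / (p(x)p(y)) ). A minimal sketch in Python (the function name and joint-table layout are illustrative choices, not from the entry):

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table.

    joint[i][j] = P(X = i, Y = j); all entries must sum to 1.
    """
    # Marginal distributions p(x) and p(y) from the joint table.
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # terms with p(x,y) = 0 contribute nothing
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent fair bits share no information: I(X;Y) = 0 bits.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
# Perfectly correlated fair bits: I(X;Y) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # → 1.0
```

The two checks illustrate the "entropic correlation" reading of the definition: independent variables give zero, while a one-to-one dependence between two fair bits gives the full one bit of shared information.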
Similar:
conditional entropy,
information theory,
min-entropy,
Rényi entropy,
joint entropy,
infomax,
intervariance,
correntropy,
Shannon entropy,
perplexity