Concept cluster: Math and astronomy > Stochastic processes
n
(statistics mnemonic) The rule stating that a normal distribution will have 68% of its observations within one standard deviation of the mean, 95% within two, and 99.7% within three.
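As a quick illustration (not part of the entry above), the three percentages can be recovered from the standard normal cumulative distribution function; a minimal Python sketch using only the standard library:

    import math

    def normal_within(k):
        # P(|X - mu| < k*sigma) for a normal distribution,
        # via the error function: CDF(k) - CDF(-k) = erf(k / sqrt(2)).
        return math.erf(k / math.sqrt(2))

    for k in (1, 2, 3):
        print(f"within {k} sd: {normal_within(k):.4f}")
    # Prints roughly 0.6827, 0.9545, 0.9973 -- the 68/95/99.7 of the mnemonic.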
adj
(mathematics) As measured using an absolute value.
adj
(mathematics) Describing a stochastic process that has a negative correlation between its increments.
adj
(mathematics) That becomes increasingly unstable
adj
(mathematics) Any, out of all that are possible.
n
(mathematics) An autoregressive process that is used to model many types of natural behaviour
n
(statistics) A function on a stochastic process whose value on any random variable is the preceding random variable in the process.
n
(probability theory) A theorem expressed as an equation that describes the conditional probability of an event or state given prior knowledge of another event.
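For instance, Bayes' theorem, P(A|B) = P(B|A) P(A) / P(B), underlies the classic screening-test calculation; a small Python sketch with made-up numbers chosen purely for illustration:

    # Hypothetical numbers: 1% prevalence, 95% sensitivity, 90% specificity.
    p_disease = 0.01
    p_pos_given_disease = 0.95
    p_pos_given_healthy = 0.10

    # Total probability of a positive test, then Bayes' theorem.
    p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(round(p_disease_given_pos, 3))  # ~0.088: a positive test still leaves under 9% probability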
n
(statistics) A series of independent trials, each having the same two possible outcomes, analogous to repeatedly flipping a coin to test whether it is fair.
n
A pseudorandom number sampling method for generating pairs of independent, standard, normally distributed (zero expectation, unit variance) random numbers, given a source of uniformly distributed random numbers.
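A minimal Python sketch of one common form of the transform described above (illustrative only, not taken from the entry):

    import math, random

    def box_muller():
        # Two independent uniforms on (0, 1] ...
        u1 = 1.0 - random.random()   # avoid log(0)
        u2 = random.random()
        # ... become two independent standard normal deviates.
        r = math.sqrt(-2.0 * math.log(u1))
        return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

    z1, z2 = box_muller()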
n
(statistics and mathematics, singular only) The theorem stating that the sum of a large number of independent, identically distributed random variables with finite variance is approximately normally distributed.
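The effect is easy to observe numerically; a rough Python sketch, with the uniform distribution and sample sizes chosen arbitrarily for illustration:

    import random, statistics

    # Sum many i.i.d. uniform(0, 1) variables; by the central limit theorem the
    # standardized sums should look approximately standard normal.
    n, trials = 100, 10_000
    sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]
    mean, sd = n * 0.5, (n / 12) ** 0.5          # exact mean and sd of the sum
    z = [(s - mean) / sd for s in sums]
    print(statistics.mean(z), statistics.pstdev(z))  # close to 0 and 1
    print(sum(abs(v) < 1 for v in z) / trials)       # close to 0.68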
adj
(mathematics) (of two or more numbers) Divisible by the same number.
adj
(mathematics) Preserving the sign.
n
(statistics) A range defined such that there is a fixed probability that the value of the measured parameter will fall within the range.
n
The property of being coprime.
n
(mathematics, statistics) A probability distribution; the set of relative likelihoods that a variable will have a value in a given interval.
adj
(statistics) Free of specific assumptions about the shape of the probability distribution generating the sample.
adj
(mathematics) Of two or more series: such that their difference is summable.
n
(mathematics) A function of a random sample of a population used to estimate some parameter of the whole population
adv
(mathematics, of a sequence) For some tail; for all terms beyond some term; with only finitely many exceptions.
n
(mathematics) The probability that a certain outcome will occur, as determined through experiment.
n
(statistics) A Markov chain Monte Carlo algorithm for obtaining a sequence of observations which are approximated from a specified multivariate probability distribution, when direct sampling is difficult.
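As a concrete illustration (not drawn from the entry), a Gibbs sampler for a standard bivariate normal distribution with correlation rho alternates draws from the two one-dimensional conditional distributions; a minimal Python sketch:

    import random

    def gibbs_bivariate_normal(rho, n_samples, burn_in=500):
        # Target: standard bivariate normal with correlation rho.
        # Each conditional X|Y=y is Normal(rho*y, 1 - rho**2), and symmetrically for Y|X.
        x, y = 0.0, 0.0
        cond_sd = (1 - rho * rho) ** 0.5
        samples = []
        for i in range(n_samples + burn_in):
            x = random.gauss(rho * y, cond_sd)
            y = random.gauss(rho * x, cond_sd)
            if i >= burn_in:
                samples.append((x, y))
        return samples

    pairs = gibbs_bivariate_normal(rho=0.8, n_samples=5000)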
n
(probability, statistics) A measure used to quantify the similarity between two probability distributions.
adj
(mathematics) Of or pertaining to an infimum.
n
(mathematics) The characteristics shared by all integers.
n
(mathematical analysis) The quality of being integrable (having an antidifference or antiderivative).
adj
(mathematics) Of, pertaining to, or being an integer.
n
An invariant quantity, function etc.
adj
Describing points on a graph that represent the same probability of some event
n
(probability theory) A formula used to determine the optimal size of a series of bets in order to maximize the logarithm of wealth.
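For a single repeated bet, the Kelly formula reduces to f* = p - (1 - p)/b, where p is the probability of winning and b is the net odds received on a win; a tiny illustrative Python helper:

    def kelly_fraction(p, b):
        # p: probability of winning; b: net odds (profit per unit staked on a win).
        # Fraction of wealth to stake to maximize expected log-wealth; 0 if the edge is negative.
        return max(0.0, p - (1.0 - p) / b)

    print(kelly_fraction(p=0.55, b=1.0))  # even-money bet with a 55% win probability -> stake 10% of wealth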
n
The infimum of the set of accumulation points of a given sequence or set.
n
The projection of a joint probability distribution onto one of its component random variables.
n
(probability theory) A discrete-time stochastic process with the Markov property.
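A minimal Python sketch of such a chain, using a hypothetical two-state weather model chosen purely for illustration:

    import random

    # Transition probabilities: P[current][next]; each row sums to 1.
    P = {
        "sunny": {"sunny": 0.9, "rainy": 0.1},
        "rainy": {"sunny": 0.5, "rainy": 0.5},
    }

    def simulate(start, steps):
        state, path = start, [start]
        for _ in range(steps):
            state = random.choices(list(P[state]), weights=list(P[state].values()))[0]
            path.append(state)
        return path

    print(simulate("sunny", 10))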
n
(mathematics) A time-dependent variable that starts in an initial state, stays in that state for a random length of time, then makes a transition to another random state, and so on.
n
A stochastic model for randomly changing systems, where it is assumed that future states depend only on the current state and not those that preceded it.
n
(probability theory) Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
n
(probability theory) The memoryless property of Markov models, according to which the future states depend only on the current state and not those that preceded it.
adj
(statistics, of a process) Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states.
n
(mathematics) A stochastic process for which the conditional expectation of future values given the sequence of all prior values is equal to the current value.
adj
(mathematics) Describing a certain class of nonnegative functions of countable disjoint sets
adj
(statistics) Markov chain Monte Carlo
adj
(probability theory) Of a probability distribution, such that any derived probability from a set of random samples is distinct and has no information (i.e. "memory") of earlier samples.
adj
(mathematics) minimum universal
adj
(mathematics) Of or related to a minor, a determinant obtained by deleting one or more rows and columns from a matrix.
n
(mathematics) A function, all of whose values are not greater than the corresponding values of another function
adj
(statistics) Of a stochastic process: asymptotically independent (in a precise mathematical sense).
n
(mathematics, statistics) Any of a class of techniques for estimating the solution of a numerical mathematical problem by means of a random artificial sampling experiment that simulates the problem.
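The textbook illustration is estimating pi by sampling random points in the unit square; a minimal Python sketch (one of many possible Monte Carlo examples):

    import random

    def estimate_pi(n):
        # The fraction of uniform points in the unit square that land inside the
        # quarter circle of radius 1 approaches pi/4 as n grows.
        inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
        return 4.0 * inside / n

    print(estimate_pi(1_000_000))  # close to 3.1416, with error shrinking like 1/sqrt(n)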
n
(game theory) The set of choices of players' strategies for which no player can benefit by changing his or her strategy assuming that the other players keep theirs unchanged.
n
(statistics) neutrosophic probability or statistics
n
(mathematics) A sequence of zero elements
adj
(psychology, of a number) Having a sharp or precise numeric value; having not been rounded.
n
(statistics) Any of a family of continuous probability distributions such that the probability density function is the Gaussian function
n
(probability theory, statistics) A random variable whose probability distribution is a normal distribution.
n
(countable, mathematics, statistics) A measure of how well an observed distribution approximates a normal distribution.
adj
(mathematics, stochastic processes, of a state) Persistent with infinite expected return time.
adj
(mathematics, stochastic processes, of a state) Recurrent with infinite expected return time.
n
(stochastics) A realization of a random variable.
n
(mathematics, diminutive) An odd number.
n
(statistics) A probability distribution such that, for a random variable X with that distribution, the probability that X is greater than some number x is given by Pr(X > x) = (x/x_m)^(-k) for all x ≥ x_m, where x_m is the (necessarily positive) minimum possible value of X and k is a positive parameter.
adj
(computer science) Describing a property that holds only when an algorithm terminates.
adj
(mathematics, stochastic processes, of a state) For which any return to it must occur in multiples of k time steps, for some k>1.
n
(mathematics) The number of times an operation can be iteratively applied to a number before it reaches a permanently constant state.
adj
(mathematics) Describing a fractal process that has a positive Brown function
n
(statistics) Any of a class of discrete probability distributions that express the probability of a given number of events occurring in a fixed time interval, where the events occur independently and at a constant average rate; describable as a limit case of either binomial or negative binomial distributions.
n
(mathematics, statistics, probability theory) A stochastic process in which events occur continually and independently of one another.
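Since inter-arrival times in a homogeneous Poisson process are independent exponential random variables, one standard way to simulate such a process is to accumulate exponential gaps; an illustrative Python sketch:

    import random

    def poisson_process(rate, t_max):
        # Event times of a homogeneous Poisson process with the given rate on [0, t_max]:
        # successive gaps are independent Exponential(rate) variables.
        times, t = [], 0.0
        while True:
            t += random.expovariate(rate)
            if t > t_max:
                return times
            times.append(t)

    events = poisson_process(rate=2.0, t_max=10.0)
    print(len(events))  # on average rate * t_max = 20 events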
n
(mathematics) A branch of statistics that deals with probabilities
n
(mathematics) A number, between 0 and 1, expressing the precise likelihood of an event happening.
n
(mathematics) The mathematical study of probability (the likelihood of occurrence of random events in order to predict the behavior of defined systems).
adj
(mathematics) Being strictly part of some other thing (not necessarily explicitly mentioned, but of definitional importance), and not being the thing itself.
adj
Of a sequence of numbers, such that it has all the properties of a random sequence following some probability distribution (except true randomness), but is actually generated using a deterministic algorithm.
adj
(mathematics) Of or relating to a probability distribution.
n
(mathematics, computing) A function whose value is chosen at random from its domain
n
(statistics, broadly) A quantity whose value is random and to which a probability distribution is assigned, such as the possible outcome of a roll of a die.
adj
(mathematics, stochastic processes, of a state) Non-transient.
n
(computational statistics, numerical analysis) A technique used to generate observations from a distribution by repeatedly sampling an approximation function that bounds the distribution everywhere from above and rejecting the samples based on the quotient of the two functions at the sample's location.
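A minimal Python sketch of the idea for a density on [0, 1] with a constant envelope (a deliberately simple illustrative choice; the target density here is assumed, not taken from the entry):

    import random

    def rejection_sample(pdf, upper_bound, n):
        # Propose x uniformly on [0, 1] and accept with probability pdf(x) / upper_bound,
        # where upper_bound >= pdf(x) everywhere on [0, 1].
        samples = []
        while len(samples) < n:
            x = random.random()
            if random.random() <= pdf(x) / upper_bound:
                samples.append(x)
        return samples

    # Example target: Beta(2, 2) density, whose maximum on [0, 1] is 1.5.
    beta22 = lambda x: 6.0 * x * (1.0 - x)
    draws = rejection_sample(beta22, upper_bound=1.5, n=1000)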
n
A branch of probability theory that generalizes the compound Poisson process for arbitrary holding times.
n
A formula used to estimate underlying probabilities when there are few observations, or for events that have not been observed to occur at all in (finite) sample data.
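With s observed successes in n trials, Laplace's rule of succession gives the estimate (s + 1)/(n + 2) for the probability of success on the next trial; a one-function illustrative Python sketch:

    def rule_of_succession(successes, trials):
        # Laplace's estimate of the probability of success on the next trial.
        return (successes + 1) / (trials + 2)

    print(rule_of_succession(0, 10))   # ~0.083: nonzero even though no success was observed
    print(rule_of_succession(10, 10))  # ~0.917: not quite 1 despite a perfect record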
n
(statistics) Any set of possible values to which the appropriate random variables determined by a stochastic process might map a given point in the sample space, taken over all values of the index space (often regarded as time).
n
(probability theory) The set of all possible outcomes of a game, experiment or other situation.
adj
(mathematics) Having either an upper or a lower bound, but not both
n
(mathematics) The condition of being a semiprime
n
(statistics) A measure of how much a sample mean is expected to differ from the true population mean; it is the standard deviation of the sampling distribution of the sample mean, estimated as the sample standard deviation divided by the square root of the sample size.
n
(statistics) The normal distribution with a mean of zero and a variance of one.
n
(probability theory, statistics) A random variable whose probability distribution is a standard normal distribution.
n
(mathematics, stochastic processes) An element of the range of the random variables that define a random process.
n
(mathematics, stochastic processes, of a Markov chain) A row vector π whose entries sum to 1 and that satisfies the equation πP = π, where P is the transition matrix of the Markov chain.
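Because the stationary distribution satisfies πP = π, it can be approximated numerically by repeatedly multiplying a probability vector by the transition matrix (power iteration); a small Python sketch for a hypothetical two-state chain:

    # Transition matrix of a hypothetical two-state Markov chain (rows sum to 1).
    P = [[0.9, 0.1],
         [0.5, 0.5]]

    def stationary(P, iterations=1000):
        # Power iteration: repeatedly apply pi <- pi P until it stops changing.
        n = len(P)
        pi = [1.0 / n] * n
        for _ in range(iterations):
            pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        return pi

    print(stationary(P))  # ~[0.833, 0.167], which indeed satisfies pi P = pi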
adj
(mathematics) Of an equation: for which certain numerical solving methods are numerically unstable, unless the step size is taken to be extremely small.
n
(mathematics, probability theory) A function that maps elements of an index set to elements of a collection of random variables; historically, a collection of random variables indexed by a set of numbers usually regarded as points in time.
n
(mathematics) The branch of statistics that deals with stochastic systems
n
(mathematics) A random variable representing the time required for a stochastic process to satisfy a certain condition, which must be determined using only information available up to such time and must occur almost surely in finite time.
adv
(mathematics) In a manner that applies to every member of a set or every interval of a function
adj
(mathematics) Numerically less than a maximum or minimum.
n
(mathematics) A stochastic process for which the conditional expectation of future values given the sequence of all prior values is bounded above by the current value.
n
(statistics) Ellipsis of Student's t distribution. [(statistics) A distribution that arises when the population standard deviation is unknown and has to be estimated from the data.]
n
(statistics) Ellipsis of Student's t-distribution.
n
(probability) An event associated with a given infinite sequence Y = {y_k | k > k_0} that is determined by any subsequence of the form Y_n = {y_k | k > n}, n > k_0 (a "tail" of that sequence).
n
(ergodic theory) The average (if it exists) of a function over iterations of T, a measure-preserving transformation on a measure space, starting from some initial point x.
adj
(mathematics, stochastic processes, of a state) Having a positive probability of being left and never being visited again.
adj
(statistics) Having a covariance of zero
n
(biology) The presence of a roughly equal distance between specimens of a particular species within their normal species range.
n
(mathematics) A quantity that may assume any one of a set of values.
n
Any of various probability distributions used to model the amount of time for which something can be used until it ceases to be operable.
adj
(mathematics, not comparable) Having a well-order.
adj
(mathematics) Having a unique solution whose value changes only slightly if initial conditions change slightly

Note: Concept clusters like the one above are an experimental OneLook feature. We've grouped words and phrases into thousands of clusters based on a statistical analysis of how they are used in writing. Some of the words and concepts may be vulgar or offensive. The names of the clusters were written automatically and may not precisely describe every word within the cluster; furthermore, the clusters may be missing some entries that you'd normally associate with their names.