Probability distribution
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment.[1][2] It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events (subsets of the sample space).[3]
For instance, if X is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of X would take the value 0.5 (1 in 2 or 1/2) for X = heads, and 0.5 for X = tails (assuming that the coin is fair). More commonly, probability distributions are used to compare the relative occurrence of many different random values.
Probability distributions can be defined in different ways and for discrete or for continuous variables. Distributions with special properties or for especially important applications are given specific names.
Introduction
A probability distribution is a mathematical description of the probabilities of events, subsets of the sample space. The sample space, often represented in notation by Ω, is the set of all possible outcomes of the random phenomenon being observed. The sample space may be any set: a set of real numbers, a set of descriptive labels, a set of vectors, a set of arbitrary non-numerical values, and so on. For example, the sample space of a coin flip could be Ω = {"heads", "tails"}.
To define probability distributions for the specific case of random variables (so the sample space can be seen as a numeric set), it is common to distinguish between discrete and absolutely continuous random variables. In the discrete case, it is sufficient to specify a probability mass function p assigning a probability to each possible outcome: for example, when throwing a fair die, each of the six values 1 to 6 has probability 1/6.
In contrast, when a random variable takes values from a continuum then typically, any individual outcome has probability zero and only events that include infinitely many outcomes, such as intervals, can have positive probability. For example, consider measuring the weight of a piece of ham in the supermarket, and assume the scale has many digits of precision. The probability that it weighs exactly 500 g is zero, as it will most likely have some non-zero decimal digits. Nevertheless, one might demand, in quality control, that a package of "500 g" of ham must weigh between 490 g and 510 g with at least 98% probability, and this demand is less sensitive to the accuracy of measurement instruments.
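As an illustrative sketch of such an interval probability, suppose the package weight is modeled as a normal random variable with mean 500 g and standard deviation 4 g (both values assumed here purely for illustration); the probability of the interval can then be computed from the cumulative distribution function:

```python
# A minimal sketch: if the filling error of a "500 g" ham package is modeled
# as normal with mean 500 g and standard deviation 4 g (assumed values),
# the probability of the interval [490, 510] follows from the cdf.
from scipy.stats import norm

mean, sd = 500.0, 4.0  # assumed model parameters
p = norm.cdf(510, mean, sd) - norm.cdf(490, mean, sd)
print(f"P(490 <= X <= 510) = {p:.4f}")  # ~0.9876, meeting the 98% demand
```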
Absolutely continuous probability distributions can be described in several ways. The probability density function describes the infinitesimal probability of any given value, and the probability that the outcome lies in a given interval can be computed by integrating the probability density function over that interval. An alternative description of the distribution is by means of the cumulative distribution function, which describes the probability that the random variable is no larger than a given value (i.e., P(X ≤ x) for some x).
General probability definition
A probability distribution can be described in various forms, such as by a probability mass function or a cumulative distribution function. One of the most general descriptions, which applies for absolutely continuous and discrete variables, is by means of a probability function whose input space is a σ-algebra, and gives a real number probability as its output, particularly, a number in [0, 1].
The probability function can take as argument subsets of the sample space itself, as in the coin toss example, where the function was defined so that P(heads) = 0.5 and P(tails) = 0.5. However, because of the widespread use of random variables, which transform the sample space into a set of numbers (e.g., ℝ, ℕ), it is more common to study probability distributions whose arguments are subsets of these particular kinds of sets (number sets), and all probability distributions discussed in this article are of this type. It is common to denote as P(X ∈ E) the probability that a certain random variable X takes its value in the set E.
The above probability function only characterizes a probability distribution if it satisfies all the Kolmogorov axioms, that is:
- P(X ∈ E) ≥ 0, so the probability is non-negative;
- P(X ∈ E) ≤ 1, so no probability exceeds 1; and
- P(X ∈ ⋃_i E_i) = ∑_i P(X ∈ E_i) for any countable disjoint family of sets {E_i}
The concept of probability function is made more rigorous by defining it as the element of a probability space (Ω, 𝒜, P), where Ω is the set of possible outcomes, 𝒜 is the set of all subsets E ⊂ Ω whose probability can be measured, and P is the probability function, or probability measure, that assigns a probability to each of these measurable subsets E ∈ 𝒜.[9]
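As a concrete sketch, the following Python snippet checks the Kolmogorov conditions above for the fair-coin distribution, where the sample space is finite and countable additivity reduces to finite additivity:

```python
# A minimal sketch of checking the Kolmogorov axioms for a probability
# function on a finite sample space (the events here are all subsets).
from itertools import chain, combinations

pmf = {"heads": 0.5, "tails": 0.5}  # assumed example distribution

def prob(event):
    """P(E) for an event E, given as a collection of outcomes."""
    return sum(pmf[outcome] for outcome in event)

outcomes = list(pmf)
events = chain.from_iterable(
    combinations(outcomes, r) for r in range(len(outcomes) + 1))
for e in events:
    assert 0.0 <= prob(e) <= 1.0          # non-negativity and boundedness
assert prob(outcomes) == 1.0              # total probability is 1
# countable additivity reduces to finite additivity on a finite space:
assert prob(["heads", "tails"]) == prob(["heads"]) + prob(["tails"])
```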
Probability distributions usually belong to one of two classes. A discrete probability distribution is applicable to the scenarios where the set of possible outcomes is discrete (e.g. a coin toss, a roll of a die) and the probabilities are encoded by a discrete list of the probabilities of the outcomes; in this case the probabilities are given by a probability mass function. In contrast, an absolutely continuous probability distribution is applicable to scenarios where the set of possible outcomes can take on values in a continuous range (e.g. real numbers), such as the temperature on a given day; in this case probabilities are typically described by a probability density function.
A probability distribution whose sample space is one-dimensional (for example real numbers, list of labels, ordered labels or binary) is called univariate, while a distribution whose sample space is a vector space of dimension 2 or more is called multivariate. A univariate distribution gives the probabilities of a single random variable taking on various different values; a multivariate distribution (a joint probability distribution) gives the probabilities of a random vector – a list of two or more random variables – taking on various combinations of values.
Besides the probability function, the cumulative distribution function, the probability mass function and the probability density function, the moment generating function and the characteristic function also serve to identify a probability distribution, as they uniquely determine an underlying cumulative distribution function.
Terminology
Some key concepts and terms, widely used in the literature on the topic of probability distributions, are listed below.[1]
Basic terms
- Random variable: takes values from a sample space; probabilities describe which values and sets of values are more likely to occur.
- Event: set of possible values (outcomes) of a random variable that occurs with a certain probability.
- Probability function or probability measure: describes the probability that the event occurs.[11]
- Cumulative distribution function: function evaluating the probability that X will take a value less than or equal to x for a random variable X (only for real-valued random variables).
- Quantile function: the inverse of the cumulative distribution function. Gives x such that, with probability q, X will not exceed x.
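As a brief illustration of the last two terms, the following sketch (using SciPy's standard normal, where the quantile function is named ppf, for "percent point function") shows that the cumulative distribution function and the quantile function are inverses of one another:

```python
# A minimal sketch of the cdf/quantile relationship for the standard normal.
from scipy.stats import norm

q = 0.975
x = norm.ppf(q)  # quantile: the x with P(X <= x) = 0.975 (about 1.96)
assert abs(norm.cdf(x) - q) < 1e-12  # the cdf undoes the quantile function
```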
Discrete probability distributions
- Discrete probability distribution: for many random variables with finitely or countably infinitely many values.
- Probability mass function (pmf): function that gives the probability that a discrete random variable is equal to some value.
- Frequency distribution: a table that displays the frequency of various outcomes in a sample.
- Relative frequency distribution: a frequency distribution where each value has been divided (normalized) by the total number of outcomes in a sample (i.e. sample size).
- Categorical distribution: for discrete random variables with a finite set of values.
Absolutely continuous probability distributions
- Absolutely continuous probability distribution: for many random variables with uncountably many values.
- Probability density function (pdf) or probability density: function whose value at any given sample (or point) in the sample space (the set of possible values taken by the random variable) can be interpreted as providing a relative likelihood that the value of the random variable would equal that sample.
Related terms
- Support: set of values that can be assumed with non-zero probability (or probability density in the case of a continuous distribution) by the random variable. For a random variable X, it is sometimes denoted as R_X.
- Tail:[12] the regions close to the bounds of the random variable, if the pmf or pdf are relatively low therein. Usually has the form X > a, X < b, or a union thereof.
- Head:[12] the region where the pmf or pdf is relatively high. Usually has the form a < X < b.
- Expected value or mean: the weighted average of the possible values, using their probabilities as their weights; or the continuous analog thereof.
- Median: the value such that the set of values less than the median, and the set greater than the median, each have probabilities no greater than one-half.
- Mode: for a discrete random variable, the value with highest probability; for an absolutely continuous random variable, a location at which the probability density function has a local peak.
- Quantile: the q-quantile is the value x such that P(X ≤ x) = q.
- Variance: the second moment of the pmf or pdf about the mean; an important measure of the dispersion of the distribution.
- Standard deviation: the square root of the variance, and hence another measure of dispersion.
- Symmetry: a property of some distributions in which the portion of the distribution to the left of a specific value (usually the median) is a mirror image of the portion to its right.
- Skewness: a measure of the extent to which a pmf or pdf "leans" to one side of its mean. The third standardized moment of the distribution.
- Kurtosis: a measure of the "fatness" of the tails of a pmf or pdf. The fourth standardized moment of the distribution.
Cumulative distribution function
In the special case of a real-valued random variable, the probability distribution can equivalently be represented by a cumulative distribution function instead of a probability measure. The cumulative distribution function of a random variable X with regard to a probability distribution is defined as F(x) = P(X ≤ x).
The cumulative distribution function of any real-valued random variable has the properties:
- F is non-decreasing;
- F is right-continuous;
- 0 ≤ F(x) ≤ 1;
- lim_{x → −∞} F(x) = 0 and lim_{x → ∞} F(x) = 1; and
- P(a < X ≤ b) = F(b) − F(a).
Conversely, any function that satisfies the first four of the properties above is the cumulative distribution function of some probability distribution on the real numbers.[13]
Any probability distribution can be decomposed as the mixture of a discrete, an absolutely continuous and a singular continuous distribution,[14] and thus any cumulative distribution function admits a decomposition as the convex sum of the three according cumulative distribution functions.
Discrete probability distribution
A discrete probability distribution is the probability distribution of a random variable that can take on only a countable number of values[15] (almost surely)[16] which means that the probability of any event E can be expressed as a (finite or countably infinite) sum: P(X ∈ E) = ∑_{ω ∈ A ∩ E} P(X = ω), where A is a countable set with P(X ∈ A) = 1.
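As a minimal sketch of this sum, assuming a fair six-sided die as the example distribution:

```python
# For a discrete distribution, the probability of an event E is the sum
# of the pmf over the outcomes in E (fair six-sided die assumed).
pmf = {k: 1 / 6 for k in range(1, 7)}

def prob(event):
    return sum(p for outcome, p in pmf.items() if outcome in event)

print(prob({2, 4, 6}))  # P(even) ≈ 0.5
```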
Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, the negative binomial distribution and the categorical distribution. When a sample (a set of observations) is drawn from a larger population, the sample points have an empirical distribution that is discrete, and which provides information about the population distribution. Additionally, the discrete uniform distribution is commonly used in computer programs that make equal-probability random selections between a number of choices.
Cumulative distribution function
A real-valued discrete random variable can equivalently be defined as a random variable whose cumulative distribution function increases only by jump discontinuities – that is, its cdf increases only where it "jumps" to a higher value, and is constant between jumps.
The points where the cdf jumps always form a countable set; this may be any countable set and thus may even be dense in the real numbers.
Dirac delta representation
A discrete probability distribution is often represented with Dirac measures, the probability distributions of deterministic random variables. For any outcome ω, let δ_ω be the Dirac measure concentrated at ω. Given a discrete probability distribution, there is a countable set A with P(X ∈ A) = 1 and a probability mass function p. If E is any event, then P(X ∈ E) = ∑_{ω ∈ A} p(ω) δ_ω(E), or in short, P_X = ∑_{ω ∈ A} p(ω) δ_ω.
Similarly, discrete distributions can be represented with the Dirac delta function as a generalized probability density function f, where f(x) = ∑_{ω ∈ A} p(ω) δ(x − ω), which means P(X ∈ E) = ∫_E f(x) dx for any event E.
Indicator-function representation
For a discrete random variable X, let u_0, u_1, ... be the values it can take with non-zero probability. Denote Ω_i = X^{−1}(u_i) = {ω : X(ω) = u_i}, for i = 0, 1, 2, ...
These are disjoint sets, and for such sets P(⋃_i Ω_i) = ∑_i P(Ω_i) = ∑_i P(X = u_i) = 1.
It follows that the probability that X takes any value except for u_0, u_1, ... is zero, and thus one can write X as X(ω) = ∑_i u_i 1_{Ω_i}(ω)
except on a set of probability zero, where 1_A is the indicator function of A. This may serve as an alternative definition of discrete random variables.
One-point distribution
A special case is the discrete distribution of a random variable that can take on only one fixed value; in other words, it is a deterministic distribution. Expressed formally, the random variable X has a one-point distribution if it has a possible outcome x such that P(X = x) = 1.[18] All other possible outcomes then have probability 0. Its cumulative distribution function jumps immediately from 0 to 1.
Absolutely continuous probability distribution
An absolutely continuous probability distribution is a probability distribution on the real numbers with uncountably many possible values, such as a whole interval in the real line, and where the probability of any event can be expressed as an integral.[19] More precisely, a real random variable X has an absolutely continuous probability distribution if there is a function f : ℝ → [0, ∞) such that for each interval [a, b] ⊂ ℝ the probability of X belonging to [a, b] is given by the integral of f over [a, b]: P(a ≤ X ≤ b) = ∫_a^b f(x) dx. This is the definition of a probability density function, so that absolutely continuous probability distributions are exactly those with a probability density function.
An absolutely continuous random variable is a random variable whose probability distribution is absolutely continuous.
There are many examples of absolutely continuous probability distributions: normal, uniform, chi-squared, and others.
Cumulative distribution function
Absolutely continuous probability distributions as defined above are precisely those with an absolutely continuous cumulative distribution function. In this case, the cumulative distribution function F has the form F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt, where f is a density of the probability distribution.
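A minimal numerical sketch of this relationship, integrating SciPy's standard normal density explicitly and comparing against the closed-form cdf (the tolerance is arbitrary):

```python
# Recovering F(x) = ∫_{-inf}^{x} f(t) dt by numerical integration of the
# standard normal density, checked against SciPy's normal cdf.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

x = 1.0
integral, _ = quad(norm.pdf, -np.inf, x)  # numerical integral of the density
assert abs(integral - norm.cdf(x)) < 1e-7
```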
Note on terminology: absolutely continuous distributions ought to be distinguished from continuous distributions, which are those having a continuous cumulative distribution function. Every absolutely continuous distribution is a continuous distribution but the converse is not true: there exist singular distributions, which are neither absolutely continuous nor discrete nor a mixture of those, and do not have a density; an example is the Cantor distribution.
For a more general definition of density functions and the equivalent absolutely continuous measures see absolutely continuous measure and Radon–Nikodym theorem.
Kolmogorov definition
In the measure-theoretic formalization of probability theory, a random variable is defined as a measurable function X from a probability space (Ω, 𝒜, P) to a measurable space (𝒳, 𝒜′). Given that probabilities of events of the form {ω ∈ Ω : X(ω) ∈ A} satisfy Kolmogorov's probability axioms, the probability distribution of X is the image measure X_*P of X, which is a probability measure on (𝒳, 𝒜′) satisfying X_*P = P ∘ X^{−1}.
Other kinds of distributions
Absolutely continuous and discrete distributions with support on ℝ^k or ℕ^k are extremely useful to model a myriad of phenomena, since most practical distributions are supported on relatively simple subsets, such as hypercubes or balls. However, this is not always the case, and there exist phenomena with supports that are actually complicated curves within some space ℝ^n or similar.
One example is the evolution of a system of differential equations known as the Rabinovich–Fabrikant equations, which can be used to model the behaviour of Langmuir waves in plasma: the observed states of such a system concentrate on a complicated curve in state space rather than on a simple region.
This kind of complicated support appears quite frequently in dynamical systems. It is not simple to establish that the system has a probability measure at all: if O is a subset of the support, the relative frequency of observing states inside O may fail to converge as the system is observed into the infinite future. The measure exists only if that limit of relative frequencies converges; the branch of dynamical systems that studies the existence of such a probability measure is ergodic theory.
Note that even in these cases, the probability distribution, if it exists, might still be termed "absolutely continuous" or "discrete" depending on whether the support is uncountable or countable, respectively.
Random number generation
A frequent problem in statistical simulations (the Monte Carlo method) is the generation of pseudo-random numbers that are distributed in a given way. Most algorithms are based on a pseudorandom number generator that produces numbers that are uniformly distributed in the half-open interval [0, 1). These random variates are then transformed via some algorithm to create a new random variate having the required probability distribution.
For example, suppose U has a uniform distribution between 0 and 1. To construct a random Bernoulli variable X for some 0 < p < 1, we define X = 1 if U < p and X = 0 if U ≥ p.
This random variable X has a Bernoulli distribution with parameter p.[29] This is an example of constructing a discrete random variable by transforming a uniform one.
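A minimal Python sketch of this construction (the parameter value 0.3 is arbitrary):

```python
# A Bernoulli(p) variable obtained by thresholding a uniform variate.
import random

def bernoulli(p):
    u = random.random()        # U uniform on [0, 1)
    return 1 if u < p else 0   # X = 1 with probability p

samples = [bernoulli(0.3) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 0.3
```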
For a distribution function F of an absolutely continuous random variable, an absolutely continuous random variable must be constructed. F_inv, an inverse function of F, relates to the uniform variable U by U = F(X) ⟺ X = F_inv(U), so that X = F_inv(U) has distribution function F.
For example, suppose a random variable with an exponential distribution F(x) = 1 − e^{−λx} must be constructed. Solving u = F(x) gives F_inv(u) = −ln(1 − u)/λ, so X = −ln(1 − U)/λ has an exponential distribution with rate λ (and, since 1 − U is itself uniform on (0, 1], X = −ln(U)/λ works as well).
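A minimal Python sketch of this inverse-transform construction (the rate λ = 2 is arbitrary):

```python
# Inverse transform sampling for the exponential distribution:
# F(x) = 1 - exp(-lam*x), so F_inv(u) = -ln(1 - u)/lam.
import math
import random

def exponential(lam):
    u = random.random()              # U uniform on [0, 1)
    return -math.log(1.0 - u) / lam  # X = F_inv(U)

samples = [exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to the mean 1/lam = 0.5
```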
Common probability distributions and their applications
The concept of the probability distribution and the random variables which they describe underlies the mathematical discipline of probability theory, and the science of statistics. There is spread or variability in almost any value that can be measured in a population (e.g. height of people, durability of a metal, sales growth, traffic flow, etc.); almost all measurements are made with some intrinsic error; in physics, many processes are described probabilistically, from the kinetic properties of gases to the quantum mechanical description of fundamental particles. For these and many other reasons, simple numbers are often inadequate for describing a quantity, while probability distributions are often more appropriate.
The following is a list of some of the most common probability distributions, grouped by the type of process that they are related to. For a more complete list, see list of probability distributions, which groups by the nature of the outcome being considered (discrete, absolutely continuous, multivariate, etc.)
All of the univariate distributions below are singly peaked; that is, it is assumed that the values cluster around a single point. In practice, actually observed quantities may cluster around multiple values. Such quantities can be modeled using a mixture distribution.
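As a brief sketch of the latter, a two-component Gaussian mixture can be sampled by first choosing a component according to its weight and then sampling from that component (all weights and parameters below are assumed purely for illustration):

```python
# Sampling from a two-component Gaussian mixture: a component is chosen
# according to its weight, then a value is drawn from it.
import random

def mixture_sample():
    if random.random() < 0.7:          # component 1 chosen with weight 0.7
        return random.gauss(0.0, 1.0)  # N(mean 0, sd 1)
    return random.gauss(5.0, 0.5)      # component 2: N(mean 5, sd 0.5)

samples = [mixture_sample() for _ in range(5)]
print(samples)
```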
Linear growth (e.g. errors, offsets)
- Normal distribution (Gaussian distribution), for a single such quantity; the most commonly used absolutely continuous distribution
Exponential growth (e.g. prices, incomes, populations)
- Log-normal distribution, for a single such quantity whose log is normally distributed
- Pareto distribution, for a single such quantity whose log is exponentially distributed; the prototypical power law distribution
Uniformly distributed quantities
- Discrete uniform distribution, for a finite set of values (e.g. the outcome of a fair die)
- Continuous uniform distribution, for absolutely continuously distributed values
Bernoulli trials (yes/no events, with a given probability)
- Basic distributions:
- Bernoulli distribution, for the outcome of a single Bernoulli trial (e.g. success/failure, yes/no)
- Binomial distribution, for the number of "positive occurrences" (e.g. successes, yes votes, etc.) given a fixed total number of independent occurrences
- Negative binomial distribution, for binomial-type observations but where the quantity of interest is the number of failures before a given number of successes occurs
- Geometric distribution, for binomial-type observations but where the quantity of interest is the number of failures before the first success; a special case of the negative binomial distribution
- Related to sampling schemes over a finite population:
- Hypergeometric distribution, for the number of "positive occurrences" (e.g. successes, yes votes, etc.) given a fixed number of total occurrences, using sampling without replacement
- Beta-binomial distribution, for the number of "positive occurrences" given a fixed number of total occurrences, sampling using a Pólya urn model (in some sense, the "opposite" of sampling without replacement)
Categorical outcomes (events with K possible outcomes)
- Categorical distribution, for a single categorical outcome (e.g. yes/no/maybe in a survey); a generalization of the Bernoulli distribution
- Multinomial distribution, for the number of each type of categorical outcome, given a fixed number of total outcomes; a generalization of the binomial distribution
- Multivariate hypergeometric distribution, similar to the multinomial distribution, but using sampling without replacement; a generalization of the hypergeometric distribution
Poisson process (events that occur independently with a given rate)
- Poisson distribution, for the number of occurrences of a Poisson-type event in a given period of time
- Exponential distribution, for the time before the next Poisson-type event occurs
- Gamma distribution, for the time before the next k Poisson-type events occur
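The connection between these distributions can be illustrated in a short simulation sketch: drawing exponential inter-arrival times yields a Poisson process, whose long-run event rate matches the assumed rate parameter (λ = 4 and the horizon are arbitrary):

```python
# Simulating a Poisson process with rate lam via exponential inter-arrival
# times, then measuring the average number of events per unit time.
import random

lam, horizon = 4.0, 10_000.0
t, count = 0.0, 0
while True:
    t += random.expovariate(lam)  # exponential time to the next event
    if t > horizon:
        break
    count += 1
print(count / horizon)            # average events per unit time, close to lam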
Absolute values of vectors with normally distributed components
- Rayleigh distribution, for the distribution of vector magnitudes with Gaussian distributed orthogonal components. Rayleigh distributions are found in RF signals with Gaussian real and imaginary components.
- Rice distribution, a generalization of the Rayleigh distributions for where there is a stationary background signal component. Found in Rician fading of radio signals due to multipath propagation and in MR images with noise corruption on non-zero NMR signals.
Normally distributed quantities operated with sum of squares
- Chi-squared distribution, the distribution of a sum of squared standard normal variables; useful e.g. for inference regarding the sample variance of normally distributed samples (see chi-squared test)
- Student's t distribution, the distribution of the ratio of a standard normal variable and the square root of a scaled chi squared variable; useful for inference regarding the mean of normally distributed samples with unknown variance (see Student's t-test)
- F-distribution, the distribution of the ratio of two scaled chi squared variables; useful e.g. for inferences that involve comparing variances or involving R-squared (the squared correlation coefficient)
As conjugate prior distributions in Bayesian inference
- Beta distribution, for a single probability (real number between 0 and 1); conjugate to the Bernoulli distribution and binomial distribution
- Gamma distribution, for a non-negative scaling parameter; conjugate to the rate parameter of a Poisson distribution or exponential distribution, the precision (inverse variance) of a normal distribution, etc.
- Dirichlet distribution, for a vector of probabilities that must sum to 1; conjugate to the categorical distribution and multinomial distribution; generalization of the beta distribution
- Wishart distribution, for a symmetric non-negative definite matrix; conjugate to the inverse of the covariance matrix of a multivariate normal distribution; generalization of the gamma distribution[30]
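As a minimal sketch of conjugacy for the first case in this list, a Beta prior combined with Bernoulli observations yields a Beta posterior by a simple parameter update (the prior and data values below are assumed):

```python
# Beta-Bernoulli conjugacy: a Beta(a, b) prior on the success probability,
# after observing s successes and f failures, becomes Beta(a + s, b + f).
a, b = 2.0, 2.0             # assumed prior pseudo-counts
successes, failures = 7, 3  # assumed observed data

a_post, b_post = a + successes, b + failures
posterior_mean = a_post / (a_post + b_post)
print(posterior_mean)       # 9/14 ≈ 0.643
```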
Some specialized applications of probability distributions
- The statistical language models used in natural language processing to assign probabilities to the occurrence of particular words and word sequences do so by means of probability distributions.
- In quantum mechanics, the probability density of finding the particle at a given point is proportional to the square of the magnitude of the particle's wavefunction at that point (see Born rule). Therefore, the probability distribution function of the position of a particle is described by P(a ≤ x ≤ b) = ∫_a^b |ψ(x)|² dx, the probability that the particle's position x will be in the interval a ≤ x ≤ b in dimension one, and a similar triple integral in dimension three. This is a key principle of quantum mechanics.[31]
- Probabilistic load flow in power-flow study treats the uncertainties of input variables as probability distributions and provides the power-flow calculation also in terms of probability distributions.[32]
- Prediction of natural phenomena occurrences based on previous frequency distributions such as tropical cyclones, hail, time in between events, etc.[33]
Fitting
Probability distribution fitting or simply distribution fitting is the fitting of a probability distribution to a series of data concerning the repeated measurement of a variable phenomenon. The aim of distribution fitting is to predict the probability or to forecast the frequency of occurrence of the magnitude of the phenomenon in a certain interval.
There are many probability distributions (see list of probability distributions) of which some can be fitted more closely to the observed frequency of the data than others, depending on the characteristics of the phenomenon and of the distribution. The distribution giving a close fit is supposed to lead to good predictions.
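For instance, a minimal sketch of fitting by maximum likelihood, assuming a normal model and synthetic data (SciPy's generic fit method returns the estimated location and scale):

```python
# Drawing data from an assumed normal model and recovering its parameters
# by maximum likelihood with SciPy's generic `fit` method.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=2.0, size=1_000)  # assumed observations

mu_hat, sigma_hat = norm.fit(data)  # MLE estimates of mean and std dev
print(mu_hat, sigma_hat)            # close to 10.0 and 2.0
```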
In distribution fitting, therefore, one needs to select a distribution that suits the data well.
See also
- Conditional probability distribution
- Empirical probability distribution
- Histogram
- Joint probability distribution
- Probability measure
- Quasiprobability distribution
- Riemann–Stieltjes integral application to probability theory
Lists
- List of probability distributions
- List of statistical topics
References
Citations
- OCLC 161828328.
- OCLC 190785258.
- OCLC 473463742.
- "1.3.6.1. What is a Probability Distribution". www.itl.nist.gov. Retrieved 2020-09-10.
- Walpole, R.E.; Myers, R.H.; Myers, S.L.; Ye, K. (1999). Probability and Statistics for Engineers. Prentice Hall.
- Ross, Sheldon M. (2010). A First Course in Probability. Pearson.
- DeGroot, Morris H.; Schervish, Mark J. (2002). Probability and Statistics. Addison-Wesley.
- ISBN 9780471804789.
- S2CID 14668369.
- Chapters 1 and 2 of Vapnik (1998).
- Long-tailed distribution; fat-tailed distribution.
- ISBN 9780387878584.
- See Lebesgue's decomposition theorem.
- OCLC 710149819.
- Cohn, Donald L. (1993). Measure Theory. Birkhäuser.
- S2CID 122501973.
- ISBN 0-471-26250-1.
- Rosenthal, Jeffrey Seth (2000). A First Look at Rigorous Probability Theory. World Scientific.
- Chapter 3.2 of DeGroot & Schervish (2002).
- Bourne, Murray. "11. Probability Distributions - Concepts". www.intmath.com. Retrieved 2020-09-10.
- Kolmogorov, Andrey (1950) [1933]. Foundations of the Theory of Probability. New York: Chelsea Publishing Company. pp. 21–24.
- Joyce, David (2014). "Axioms of Probability" (PDF). Clark University. Retrieved December 5, 2019.
- Alligood, K.T.; Sauer, T.D.; Yorke, J.A. (1996). Chaos: An Introduction to Dynamical Systems. Springer.
- Bibcode:1979JETP...50..311R.
- Section 1.9 of Ross, S.M.; Peköz, E.A. (2007). A Second Course in Probability (PDF).
- Walters, Peter (2000). An Introduction to Ergodic Theory. Springer.
- ISBN 978-1-85233-896-1.
- OCLC 71008143.
- S2CID 18669309.
- OCLC 1038418263.
Sources
- den Dekker, A. J.; Sijbers, J. (2014). "Data distributions in magnetic resonance images: A review". PMID 25059432.
- Vapnik, Vladimir Naumovich (1998). Statistical Learning Theory. John Wiley and Sons.
External links
- "Probability distribution", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- Field Guide to Continuous Probability Distributions, Gavin E. Crooks.
- Distinguishing probability measure, function and distribution, Math Stack Exchange