Statistical distance

Source: Wikipedia, the free encyclopedia.

In statistics, probability theory, and information theory, a statistical distance quantifies the distance between two statistical objects, which can be two random variables, or two probability distributions or samples, or the distance can be between an individual sample point and a population or a wider sample of points.

A distance between populations can be interpreted as measuring the distance between two probability distributions, and hence such distances are essentially measures of distances between probability measures. Where statistical distance measures relate to the differences between random variables, these may have statistical dependence,[1] and hence these distances are not directly related to measures of distances between probability measures. Again, a measure of distance between random variables may relate to the extent of dependence between them, rather than to their individual values.

Many statistical distance measures are not metrics, and some are not symmetric. Some types of distance measures, which generalize squared distance, are referred to as (statistical) divergences.

Terminology

Many terms are used to refer to various notions of distance; these are often confusingly similar, and may be used inconsistently between authors and over time, either loosely or with precise technical meaning. In addition to "distance", similar terms include deviance, deviation, discrepancy, discrimination, and divergence, as well as others such as contrast function and metric. Terms from information theory include cross entropy, relative entropy, discrimination information, and information gain.

Distances as metrics

Metrics

A metric on a set X is a function (called the distance function or simply distance) d : X × X → R+ (where R+ is the set of non-negative real numbers). For all x, y, z in X, this function is required to satisfy the following conditions:

  1. d(x, y) ≥ 0     (non-negativity)
  2. d(x, y) = 0   if and only if   x = y     (identity of indiscernibles; note that conditions 1 and 2 together produce positive definiteness)
  3. d(x, y) = d(y, x)     (symmetry)
  4. d(x, z) ≤ d(x, y) + d(y, z)     (subadditivity / triangle inequality).
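The four conditions can be checked numerically for a concrete distance. The following is a minimal sketch, assuming the total variation distance on finite probability vectors as the example metric; the function name `total_variation` and the sample distributions are illustrative, not taken from the text:

```python
import itertools

def total_variation(p, q):
    """Total variation distance between two finite distributions
    given as equal-length probability vectors."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Three example distributions over a 3-element domain (illustrative values).
dists = [(0.5, 0.3, 0.2), (0.1, 0.6, 0.3), (1.0, 0.0, 0.0)]

for x, y, z in itertools.permutations(dists, 3):
    d_xy = total_variation(x, y)
    assert d_xy >= 0                                 # 1. non-negativity
    assert (d_xy == 0) == (x == y)                   # 2. identity of indiscernibles
    assert d_xy == total_variation(y, x)             # 3. symmetry
    # 4. triangle inequality (small tolerance for floating point)
    assert total_variation(x, z) <= d_xy + total_variation(y, z) + 1e-12
```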

Generalized metrics

Many statistical distances are not metrics, because they lack one or more of the properties of proper metrics. For example, pseudometrics violate property (2), identity of indiscernibles; quasimetrics violate property (3), symmetry; and semimetrics violate property (4), the triangle inequality. Statistical distances that satisfy (1) and (2) are referred to as divergences.
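A divergence can thus satisfy conditions (1) and (2) while failing symmetry. As an illustration, the Kullback–Leibler divergence is a standard example of such a divergence, though it is not singled out in the text above; the helper `kl` and the example distributions below are assumptions for demonstration:

```python
from math import log

def kl(p, q):
    """Kullback–Leibler divergence D(p || q) between two strictly
    positive finite distributions; a divergence, not a metric."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q))

p = (0.5, 0.5)
q = (0.9, 0.1)

assert kl(p, p) == 0.0                # satisfies (1) and (2) ...
assert kl(p, q) > 0 and kl(q, p) > 0
assert kl(p, q) != kl(q, p)           # ... but violates (3), symmetry
```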

Statistically close

The total variation distance of two distributions X and Y over a finite domain D (often referred to as statistical difference[2] or statistical distance[3] in cryptography) is defined as

Δ(X, Y) = (1/2) Σ_{α ∈ D} | Pr[X = α] − Pr[Y = α] |.

We say that two probability ensembles {X_k} and {Y_k} are statistically close if Δ(X_k, Y_k) is a negligible function in k.
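This definition can be made concrete with a small sketch, assuming two hypothetical ensembles over the domain {0, 1}: the functions `X` and `Y` and the 2^(-k) bias are invented for illustration. Here Δ(X_k, Y_k) = 2^(-(k+1)), which shrinks faster than 1/k^c for every constant c, i.e. it is negligible:

```python
def total_variation(p, q):
    """Total variation distance for distributions given as dicts
    mapping each domain element to its probability."""
    return 0.5 * sum(abs(p[a] - q[a]) for a in p)

def X(k):
    # Fair coin, independent of k.
    return {0: 0.5, 1: 0.5}

def Y(k):
    # Coin whose bias toward 1 decays as 2^(-k).
    eps = 2.0 ** -k
    return {0: 0.5 - eps / 2, 1: 0.5 + eps / 2}

# Delta(X_k, Y_k) = 2^(-(k+1)): vanishing, hence statistically close.
gaps = [total_variation(X(k), Y(k)) for k in range(1, 11)]
assert all(g == 2.0 ** -(k + 1) for k, g in zip(range(1, 11), gaps))
```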

Examples

Metrics

  • Total variation distance (sometimes just called "the" statistical distance)
  • Hellinger distance
  • Lévy–Prokhorov metric
  • Wasserstein metric
  • Mahalanobis distance

Divergences

  • Kullback–Leibler divergence
  • Rényi divergence
  • Jensen–Shannon divergence
  • Bhattacharyya distance
  • f-divergence

See also

Notes

  1. ^ Dodge, Y. (2003), entry for distance
  2. ^ .
  3. ^ Reyzin, Leo. (Lecture Notes) Extractors and the Leftover Hash Lemma

References

  • Dodge, Y. (2003) Oxford Dictionary of Statistical Terms, OUP.