Heinz mean

Source: Wikipedia, the free encyclopedia.

In mathematics, the Heinz mean (named after E. Heinz[1]) of two non-negative real numbers A and B was defined by Bhatia[2] as

H_x(A, B) = (A^x B^(1−x) + A^(1−x) B^x) / 2,

with 0 ≤ x ≤ 1/2.

For different values of x, the Heinz mean interpolates between the arithmetic (x = 0) and geometric (x = 1/2) means, such that for 0 < x < 1/2:

√(AB) < H_x(A, B) < (A + B) / 2.

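As a quick numerical illustration, here is a minimal Python sketch (heinz_mean is a hypothetical helper written for this example, not a standard library function): the endpoints x = 0 and x = 1/2 recover the arithmetic and geometric means, and intermediate values of x lie strictly between them.

```python
def heinz_mean(a, b, x):
    """Heinz mean of non-negative reals a and b, for 0 <= x <= 1/2."""
    return (a**x * b**(1 - x) + a**(1 - x) * b**x) / 2

a, b = 4.0, 9.0
print(heinz_mean(a, b, 0.0))   # arithmetic mean: (4 + 9) / 2 = 6.5
print(heinz_mean(a, b, 0.5))   # geometric mean: sqrt(4 * 9) = 6.0
print(heinz_mean(a, b, 0.25))  # strictly between 6.0 and 6.5
```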
The Heinz means appear naturally when symmetrizing α-divergences.[3]

It may also be defined in the same way for positive semidefinite matrices, and satisfies a similar interpolation formula.[4][5]
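For positive semidefinite matrices the same formula applies with fractional matrix powers. A minimal NumPy sketch follows (psd_power and heinz_mean_psd are helpers assumed for this illustration, not a library API); for commuting matrices, such as the diagonal ones below, it reduces to the scalar Heinz mean applied entrywise.

```python
import numpy as np

def psd_power(M, t):
    """Fractional power M**t of a symmetric PSD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0.0, None)**t) @ V.T

def heinz_mean_psd(A, B, x):
    """Matrix Heinz mean (A^x B^(1-x) + A^(1-x) B^x) / 2 for PSD A, B."""
    return (psd_power(A, x) @ psd_power(B, 1 - x)
            + psd_power(A, 1 - x) @ psd_power(B, x)) / 2

A = np.diag([4.0, 1.0])
B = np.diag([9.0, 1.0])
print(heinz_mean_psd(A, B, 0.0))  # arithmetic mean (A + B) / 2 = diag(6.5, 1)
print(heinz_mean_psd(A, B, 0.5))  # diag(6.0, 1), entrywise geometric mean here
```

Note that for non-commuting matrices the x = 1/2 case is the symmetrized product (A^(1/2) B^(1/2) + B^(1/2) A^(1/2)) / 2, which in general differs from the Riemannian geometric matrix mean.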

References

  1. ^ E. Heinz (1951), "Beiträge zur Störungstheorie der Spektralzerlegung", Math. Ann., 123, pp. 415–438.
  2. .
  3. ^ Nielsen, Frank; Nock, Richard; Amari, Shun-ichi (2014), "On Clustering Histograms with k-Means by Using Mixed α-Divergences", Entropy, 16 (6): 3273–3301.
  4. .
  5. .