Hankel matrix

In linear algebra, a Hankel matrix (or catalecticant matrix), named after Hermann Hankel, is a square matrix in which each ascending skew-diagonal from left to right is constant. For example,

$$\begin{bmatrix}
a & b & c & d & e \\
b & c & d & e & f \\
c & d & e & f & g \\
d & e & f & g & h \\
e & f & g & h & i
\end{bmatrix}.$$

More generally, a Hankel matrix is any matrix of the form

$$A = \begin{bmatrix}
a_0 & a_1 & a_2 & \cdots & a_{n-1} \\
a_1 & a_2 &     &        & \vdots \\
a_2 &     &     &        & a_{2n-4} \\
\vdots &  &     & a_{2n-4} & a_{2n-3} \\
a_{n-1} & \cdots & a_{2n-4} & a_{2n-3} & a_{2n-2}
\end{bmatrix}.$$

In terms of the components, if the $i,j$ element of $A$ is denoted with $A_{ij}$, and assuming $i \le j$, then we have $A_{i,j} = A_{i+k,j-k}$ for all $k = 0, 1, \ldots, j - i$.
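
For illustration, the following Python sketch (using NumPy and SciPy's scipy.linalg.hankel; the sequence chosen is arbitrary) constructs such a matrix and checks that each entry depends only on $i + j$:

    import numpy as np
    from scipy.linalg import hankel

    # An arbitrary sequence a_0, ..., a_{2n-2} generating an n x n Hankel matrix.
    n = 5
    a = np.arange(2 * n - 1)
    A = hankel(a[:n], a[n - 1:])   # first column a_0..a_{n-1}, last row a_{n-1}..a_{2n-2}

    # Each entry depends only on i + j, so A[i, j] == A[i+k, j-k] on every skew-diagonal.
    assert all(A[i, j] == a[i + j] for i in range(n) for j in range(n))
    print(A)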

Properties

  • Any Hankel matrix is symmetric.
  • Let $J_n$ be the $n \times n$ exchange matrix. If $H$ is an $m \times n$ Hankel matrix, then $H = T J_n$, where $T$ is an $m \times n$ Toeplitz matrix; a numerical check is sketched after this list.
    • If $T$ is real symmetric, then $H = T J_n$ will have the same eigenvalues as $T$ up to sign.[1]
  • The Hilbert matrix is an example of a Hankel matrix.
  • The determinant of a Hankel matrix is called a catalecticant.
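
The relation $H = T J_n$ and the eigenvalue property above can be checked numerically; the following Python sketch uses an arbitrary real symmetric Toeplitz matrix for illustration:

    import numpy as np
    from scipy.linalg import toeplitz

    n = 4
    c = np.array([4.0, 1.0, 0.5, 0.25])   # first column of a real symmetric Toeplitz matrix
    T = toeplitz(c)                        # T[i, j] depends only on i - j
    J = np.fliplr(np.eye(n))               # exchange matrix J_n

    H = T @ J                              # H is Hankel: H[i, j] depends only on i + j
    assert all(np.isclose(H[i, j], H[i + 1, j - 1])
               for i in range(n - 1) for j in range(1, n))

    # Eigenvalues of H agree with those of T up to sign.
    print(np.sort(np.abs(np.linalg.eigvalsh(T))))
    print(np.sort(np.abs(np.linalg.eigvalsh(H))))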

Hankel operator

Given a formal Laurent series

$$f(z) = \sum_{n=-\infty}^{N} a_n z^n,$$

the corresponding Hankel operator is defined as[2]

$$H_f : \mathbb{C}[z] \to z^{-1}\mathbb{C}[[z^{-1}]].$$

This takes a polynomial $g \in \mathbb{C}[z]$ and sends it to the product $fg$, but discards all powers of $z$ with a non-negative exponent, so as to give an element in $z^{-1}\mathbb{C}[[z^{-1}]]$, the formal power series with strictly negative exponents. The map $H_f$ is in a natural way $\mathbb{C}[z]$-linear, and its matrix with respect to the elements $1, z, z^2, \ldots \in \mathbb{C}[z]$ and $z^{-1}, z^{-2}, z^{-3}, \ldots \in z^{-1}\mathbb{C}[[z^{-1}]]$ is the Hankel matrix

$$\begin{bmatrix}
a_{-1} & a_{-2} & a_{-3} & \cdots \\
a_{-2} & a_{-3} & a_{-4} & \cdots \\
a_{-3} & a_{-4} & a_{-5} & \cdots \\
\vdots & \vdots & \vdots & \ddots
\end{bmatrix}.$$

Any Hankel matrix arises in this way. A theorem due to Kronecker says that the rank of this matrix is finite precisely if $f$ is a rational function; that is, a fraction of two polynomials $f(z) = \frac{p(z)}{q(z)}$.
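
For illustration, the following Python sketch builds a finite corner of the Hankel matrix of a hypothetical, arbitrarily chosen rational function with two poles and confirms that its numerical rank equals the number of poles, in line with Kronecker's theorem:

    import numpy as np
    from scipy.linalg import hankel

    # Hypothetical rational function f(z) = 1/(z - 1/2) + 1/(z - 1/3); its expansion
    # in powers of z^{-1} has coefficients a_{-k} = (1/2)**(k-1) + (1/3)**(k-1), k >= 1.
    N = 8
    k = np.arange(1, 2 * N)
    coeff = 0.5 ** (k - 1) + (1.0 / 3.0) ** (k - 1)

    # N x N corner of the (infinite) Hankel matrix of H_f.
    H = hankel(coeff[:N], coeff[N - 1:])

    # Kronecker's theorem: the rank is finite for rational f -- here 2, the number of poles.
    print(np.linalg.matrix_rank(H, tol=1e-10))   # -> 2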

Approximations

We are often interested in approximations of the Hankel operators, possibly by low-order operators. In order to approximate the output of the operator, we can use the spectral norm (operator 2-norm) to measure the error of our approximation. This suggests singular value decomposition as a possible technique to approximate the action of the operator.

Note that the matrix $A$ does not have to be finite. If it is infinite, traditional methods of computing individual singular vectors will not work directly. We also require that the approximation itself be a Hankel matrix; that such a structured approximation can be achieved is shown by AAK theory.
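
The following Python sketch illustrates the finite-dimensional case with an arbitrarily chosen sequence: the truncated SVD gives the best rank-$r$ approximation in spectral norm, with error equal to the next singular value, although the truncated matrix is in general no longer Hankel:

    import numpy as np
    from scipy.linalg import hankel

    # A finite Hankel matrix built from an arbitrary decaying sequence.
    N = 10
    a = 1.0 / np.arange(1, 2 * N) ** 2
    H = hankel(a[:N], a[N - 1:])

    # Truncated SVD: keep the r largest singular values.
    U, s, Vt = np.linalg.svd(H)
    r = 3
    H_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

    # Best rank-r approximation in spectral norm; the error equals s[r].
    print(np.linalg.norm(H - H_r, ord=2), s[r])

    # H_r is generally not itself a Hankel matrix; preserving Hankel structure
    # in the approximation is where AAK theory enters.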

Hankel matrix transform

The Hankel matrix transform, or simply Hankel transform, of a sequence $b_k$ is the sequence of the determinants of the Hankel matrices formed from $b_k$. Given an integer $n > 0$, define the corresponding $(n \times n)$-dimensional Hankel matrix $B_n$ as having the matrix elements $[B_n]_{i,j} = b_{i+j}$. Then the sequence $h_n$ given by

$$h_n = \det B_n$$

is the Hankel transform of the sequence $b_k$. The Hankel transform is invariant under the binomial transform of a sequence. That is, if one writes

$$c_n = \sum_{k=0}^{n} \binom{n}{k} b_k$$

as the binomial transform of the sequence $b_n$, then one has

$$\det B_n = \det C_n,$$

where $C_n$ is the Hankel matrix formed from the sequence $c_k$.
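
As a numerical check (the Catalan numbers are used here purely as an example; their Hankel transform is the all-ones sequence), the following Python sketch computes the Hankel transform of a sequence and of its binomial transform:

    import numpy as np
    from math import comb

    def hankel_transform(b, n_max):
        # h_n = det of the n x n matrix with entries b_{i+j}, for n = 1, ..., n_max
        return [round(np.linalg.det([[b[i + j] for j in range(n)] for i in range(n)]))
                for n in range(1, n_max + 1)]

    # Catalan numbers b_0, b_1, ... as an illustrative sequence.
    b = [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862, 16796, 58786]

    # Binomial transform c_n = sum_k C(n, k) b_k.
    c = [sum(comb(n, k) * b[k] for k in range(n + 1)) for n in range(len(b))]

    print(hankel_transform(b, 5))   # -> [1, 1, 1, 1, 1]
    print(hankel_transform(c, 5))   # same values: invariance under the binomial transform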

Applications of Hankel matrices

Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or hidden Markov model is desired.[3] The singular value decomposition of the Hankel matrix provides a means of computing the A, B, and C matrices which define the state-space realization.[4] The Hankel matrix formed from the signal has been found useful for decomposition of non-stationary signals and time-frequency representation.
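
The following Python sketch outlines this idea in the spirit of the Ho-Kalman / eigensystem realization algorithm; the system used to generate the impulse response is hypothetical, and the code is an illustrative sketch rather than the method of the cited references:

    import numpy as np

    # Hypothetical system used only to generate an impulse response; in practice
    # only the Markov parameters g_k are observed.
    A_true = np.array([[0.9, 0.2], [0.0, 0.5]])
    B_true = np.array([[1.0], [1.0]])
    C_true = np.array([[1.0, 0.0]])

    # Markov parameters g_k = C A^(k-1) B for k = 1, ..., 2N.
    N = 6
    g = [(C_true @ np.linalg.matrix_power(A_true, k - 1) @ B_true).item()
         for k in range(1, 2 * N + 1)]

    # Hankel matrix of the impulse response and its one-step shift.
    H1 = np.array([[g[i + j] for j in range(N)] for i in range(N)])      # g_1, ..., g_{2N-1}
    H2 = np.array([[g[i + j + 1] for j in range(N)] for i in range(N)])  # g_2, ..., g_{2N}

    # The SVD reveals the system order (numerical rank of H1).
    U, s, Vt = np.linalg.svd(H1)
    r = int(np.sum(s > 1e-8 * s[0]))

    # Balanced factors and the realization (A, B, C), up to a change of state basis.
    S_sqrt = np.diag(np.sqrt(s[:r]))
    S_isqrt = np.diag(1.0 / np.sqrt(s[:r]))
    O = U[:, :r] @ S_sqrt            # observability-like factor
    R = S_sqrt @ Vt[:r, :]           # controllability-like factor
    A_hat = S_isqrt @ U[:, :r].T @ H2 @ Vt[:r, :].T @ S_isqrt
    C_hat = O[:1, :]
    B_hat = R[:, :1]

    # The realization reproduces the observed impulse response.
    g_hat = [(C_hat @ np.linalg.matrix_power(A_hat, k - 1) @ B_hat).item()
             for k in range(1, 2 * N + 1)]
    print(np.allclose(g, g_hat))     # -> True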

Method of moments for polynomial distributions

The method of moments applied to polynomial distributions results in a Hankel matrix that needs to be inverted in order to obtain the weight parameters of the polynomial distribution approximation.[5]
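
As a rough illustration (not necessarily the exact procedure of the cited paper), suppose the density on $[0, 1]$ is approximated by a polynomial $f(x) = \sum_i a_i x^i$. Its moments $m_k = \int_0^1 x^k f(x)\,dx = \sum_i a_i/(i + k + 1)$ are linear in the coefficients, and the coefficient matrix $[1/(i + k + 1)]$ is a Hankel matrix (of Hilbert type) whose inversion recovers the weights:

    import numpy as np

    # Hypothetical polynomial density on [0, 1]: f(x) = a_0 + a_1 x + a_2 x^2.
    a_true = np.array([0.6, 0.3, 0.75])            # chosen so that f integrates to 1
    d = len(a_true)

    # Moments m_k = integral_0^1 x^k f(x) dx = sum_i a_i / (i + k + 1),
    # i.e. m = M a with the Hankel matrix M[k, i] = 1 / (i + k + 1).
    M = np.array([[1.0 / (i + k + 1) for i in range(d)] for k in range(d)])
    m = M @ a_true                                  # in practice: estimated from data

    # Inverting the Hankel matrix recovers the weight parameters.
    a_hat = np.linalg.solve(M, m)
    print(np.allclose(a_hat, a_true))               # -> True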

Positive Hankel matrices and the Hamburger moment problems
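
The Hamburger moment problem asks when a real sequence is the moment sequence of a positive measure on the real line; a classical criterion is that every Hankel matrix formed from the sequence be positive semidefinite. The following Python sketch checks this condition for the moments of the standard normal distribution, used here only as an example of a valid moment sequence:

    import numpy as np

    # Moments of the standard normal distribution: m_k = 0 for odd k and
    # m_k = (k - 1)!! for even k -- a valid Hamburger moment sequence.
    m = [1, 0, 1, 0, 3, 0, 15, 0, 105]

    # Criterion: every Hankel matrix [m_{i+j}] must be positive semidefinite.
    for n in range(1, len(m) // 2 + 2):
        H = np.array([[m[i + j] for j in range(n)] for i in range(n)], dtype=float)
        print(n, np.linalg.eigvalsh(H).min() >= -1e-12)   # True for every n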

Notes

  1. .
  2. Fuhrmann 2012, §8.3.
  3. .
  4. .
  5. Munkhammar, J.; Mattsson, L.; Rydén, J. (2017). "Polynomial probability distribution estimation using the method of moments". PLoS ONE 12(4): e0174573. https://doi.org/10.1371/journal.pone.0174573

References

  • Brent, R. P. (1999). "Stability of fast algorithms for structured linear systems". In T. Kailath and A. H. Sayed (eds.), Fast Reliable Algorithms for Matrices with Structure, ch. 4. SIAM.
  • Fuhrmann, Paul A. (2012). A polynomial approach to linear algebra. Universitext (2nd ed.). New York, NY: Springer.