Autocorrelation
Autocorrelation, sometimes known as serial correlation in the discrete time case, is the correlation of a signal with a delayed copy of itself as a function of delay. Informally, it is the similarity between observations of a random variable as a function of the time lag between them. The analysis of autocorrelation is a mathematical tool for finding repeating patterns, such as the presence of a periodic signal obscured by noise, or identifying the missing fundamental frequency in a signal implied by its harmonic frequencies. It is often used in signal processing for analyzing functions or series of values, such as time domain signals.
Different fields of study define autocorrelation differently, and not all of these definitions are equivalent. In some fields, the term is used interchangeably with autocovariance.
Auto-correlation of stochastic processes
In statistics, the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of the time lag. Let $\{X_t\}$ be a random process, and $t$ be any point in time ($t$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_t$ is the value (or realization) produced by a given run of the process at time $t$. Suppose that the process has mean $\mu_t$ and variance $\sigma_t^2$ at time $t$, for each $t$. Then the definition of the auto-correlation function between times $t_1$ and $t_2$ is

$$\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X_{t_2}}\right] \qquad \text{(Eq.1)}$$

where $\operatorname{E}$ is the expected value operator and the bar represents complex conjugation. Note that the expectation may not be well defined.

Subtracting the mean before multiplication yields the auto-covariance function between times $t_1$ and $t_2$:[1]: p.392 [2]: p.168

$$\operatorname{K}_{XX}(t_1, t_2) = \operatorname{E}\left[\left(X_{t_1} - \mu_{t_1}\right)\overline{\left(X_{t_2} - \mu_{t_2}\right)}\right] = \operatorname{E}\left[X_{t_1}\overline{X_{t_2}}\right] - \mu_{t_1}\overline{\mu_{t_2}} \qquad \text{(Eq.2)}$$
Note that this expression is not well defined for all time series or processes, because the mean may not exist, or the variance may be zero (for a constant process) or infinite (for processes with distribution lacking well-behaved moments, such as certain types of power law).
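A minimal NumPy sketch of Eq.1 and Eq.2, estimating the expectations by averaging over many independent realizations; the process $X_t = tZ + W_t$ and all parameter choices here are illustrative assumptions, not part of the definitions above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_times = 100_000, 10

# Illustrative nonstationary process: X_t = t*Z + W_t, with Z and W_t standard normal.
z = rng.standard_normal((n_runs, 1))
w = rng.standard_normal((n_runs, n_times))
t = np.arange(n_times)
X = t * z + w                                     # shape (n_runs, n_times)

t1, t2 = 3, 7
R = np.mean(X[:, t1] * X[:, t2])                  # Eq.1 (real case): E[X_{t1} X_{t2}]
mu1, mu2 = X[:, t1].mean(), X[:, t2].mean()
K = np.mean((X[:, t1] - mu1) * (X[:, t2] - mu2))  # Eq.2: auto-covariance

# For this process E[X_{t1} X_{t2}] = t1*t2 when t1 != t2, so both print ~21.
print(R, K)
```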
Definition for wide-sense stationary stochastic process
If $\{X_t\}$ is a wide-sense stationary process then the mean $\mu$ and the variance $\sigma^2$ are time-independent, and further the auto-covariance function depends only on the lag between $t_1$ and $t_2$: the auto-covariance depends only on the time-distance between the pair of values but not on their position in time. This further implies that the auto-covariance and auto-correlation can be expressed as a function of the time-lag $\tau = t_2 - t_1$, and that this would be an even function of the lag. This gives the more familiar form of the auto-correlation function

$$\operatorname{R}_{XX}(\tau) = \operatorname{E}\left[X_{t+\tau}\overline{X_t}\right] \qquad \text{(Eq.3)}$$

and the auto-covariance function:

$$\operatorname{K}_{XX}(\tau) = \operatorname{E}\left[\left(X_{t+\tau} - \mu\right)\overline{\left(X_t - \mu\right)}\right] = \operatorname{E}\left[X_{t+\tau}\overline{X_t}\right] - \mu\overline{\mu} \qquad \text{(Eq.4)}$$

In particular, note that $\operatorname{K}_{XX}(0) = \sigma^2$.
Normalization
It is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the auto-covariance function to get a time-dependent Pearson correlation coefficient. However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "autocorrelation" and "autocovariance" are used interchangeably.

The definition of the auto-correlation coefficient of a stochastic process is[2]: p.169

$$\rho_{XX}(t_1, t_2) = \frac{\operatorname{K}_{XX}(t_1, t_2)}{\sigma_{t_1}\sigma_{t_2}} = \frac{\operatorname{E}\left[\left(X_{t_1} - \mu_{t_1}\right)\overline{\left(X_{t_2} - \mu_{t_2}\right)}\right]}{\sigma_{t_1}\sigma_{t_2}}.$$

If the function $\rho_{XX}$ is well defined, its value must lie in the range $[-1, 1]$, with 1 indicating perfect correlation and −1 indicating perfect anti-correlation.

For a wide-sense stationary (WSS) process, the definition is

$$\rho_{XX}(\tau) = \frac{\operatorname{K}_{XX}(\tau)}{\sigma^2} = \frac{\operatorname{E}\left[\left(X_{t+\tau} - \mu\right)\overline{\left(X_t - \mu\right)}\right]}{\sigma^2}$$

where $\operatorname{K}_{XX}(0) = \sigma^2$.

The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
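As a sketch of the WSS normalization, the following Python snippet (NumPy assumed; the AR(1) process and its coefficient are assumptions chosen for the example) estimates $\rho_{XX}(\tau)$ from a long simulated series and compares it with the known AR(1) value $\varphi^{\tau}$:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, n = 0.8, 200_000

# Simulate the AR(1) process X_t = phi*X_{t-1} + eps_t (wide-sense stationary).
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

xc = x - x.mean()
var = np.mean(xc ** 2)
for tau in range(5):
    rho = np.mean(xc[: n - tau] * xc[tau:]) / var    # sample rho(tau)
    print(tau, round(rho, 3), round(phi ** tau, 3))  # theory: rho(tau) = phi**tau
```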
Properties
Symmetry property
The fact that the auto-correlation function $\operatorname{R}_{XX}$ is an even function can be stated as $\operatorname{R}_{XX}(t_1, t_2) = \overline{\operatorname{R}_{XX}(t_2, t_1)}$; respectively, for a WSS process, $\operatorname{R}_{XX}(\tau) = \overline{\operatorname{R}_{XX}(-\tau)}$.[2]: p.169
Maximum at zero
For a WSS process:[2]: p.174

$$\left|\operatorname{R}_{XX}(\tau)\right| \leq \operatorname{R}_{XX}(0)$$

Notice that $\operatorname{R}_{XX}(0)$ is always real.
Cauchy–Schwarz inequality
The Cauchy–Schwarz inequality for stochastic processes states:[1]: p.392

$$\left|\operatorname{R}_{XX}(t_1, t_2)\right|^2 \leq \operatorname{E}\left[|X_{t_1}|^2\right]\operatorname{E}\left[|X_{t_2}|^2\right]$$
Autocorrelation of white noise
The autocorrelation of a continuous-time white noise signal will have a strong peak (represented by a Dirac delta function) at $\tau = 0$ and will be exactly $0$ for all other $\tau$.
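A discrete-time analogue can be checked numerically; in this minimal sketch (NumPy assumed, sample size arbitrary) the sample autocorrelation coefficient of white noise is approximately 1 at lag 0 and approximately 0 at every other lag:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(100_000)        # discrete white noise

xc = x - x.mean()
var = np.mean(xc ** 2)
for tau in (0, 1, 2, 10, 100):
    rho = np.mean(xc[: len(x) - tau] * xc[tau:]) / var
    print(tau, round(rho, 4))           # ~1.0 at tau = 0, ~0.0 otherwise
```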
Wiener–Khinchin theorem
The Wiener–Khinchin theorem relates the autocorrelation function $\operatorname{R}_{XX}$ to the power spectral density $S_{XX}$ via the Fourier transform:

$$\operatorname{R}_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f)\, e^{i 2\pi f \tau} \, df$$

$$S_{XX}(f) = \int_{-\infty}^{\infty} \operatorname{R}_{XX}(\tau)\, e^{-i 2\pi f \tau} \, d\tau$$

For real-valued functions, the symmetric autocorrelation function has a real symmetric transform, so the Wiener–Khinchin theorem can be re-expressed in terms of real cosines only:

$$\operatorname{R}_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f)\, \cos(2\pi f \tau) \, df$$

$$S_{XX}(f) = \int_{-\infty}^{\infty} \operatorname{R}_{XX}(\tau)\, \cos(2\pi f \tau) \, d\tau$$
Auto-correlation of random vectors
The (potentially time-dependent) auto-correlation matrix (also called second moment) of a (potentially time-dependent) random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\rm T}$ is an $n \times n$ matrix containing as elements the autocorrelations of all pairs of elements of the random vector $\mathbf{X}$. The autocorrelation matrix is used in various digital signal processing algorithms.

For a random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\rm T}$ containing random elements whose expected value and variance exist, the auto-correlation matrix is defined by

$$\operatorname{R}_{\mathbf{X}\mathbf{X}} \triangleq \operatorname{E}\left[\mathbf{X}\mathbf{X}^{\rm T}\right] \qquad \text{(Eq.5)}$$

where $^{\rm T}$ denotes transposition, giving a matrix of dimensions $n \times n$.

Written component-wise:

$$\operatorname{R}_{\mathbf{X}\mathbf{X}} = \begin{bmatrix} \operatorname{E}[X_1 X_1] & \operatorname{E}[X_1 X_2] & \cdots & \operatorname{E}[X_1 X_n] \\ \operatorname{E}[X_2 X_1] & \operatorname{E}[X_2 X_2] & \cdots & \operatorname{E}[X_2 X_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_n X_1] & \operatorname{E}[X_n X_2] & \cdots & \operatorname{E}[X_n X_n] \end{bmatrix}$$

If $\mathbf{Z}$ is a complex random vector, the autocorrelation matrix is instead defined by

$$\operatorname{R}_{\mathbf{Z}\mathbf{Z}} \triangleq \operatorname{E}\left[\mathbf{Z}\mathbf{Z}^{\rm H}\right].$$

Here $^{\rm H}$ denotes Hermitian transposition.

For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\rm T}$ is a random vector, then $\operatorname{R}_{\mathbf{X}\mathbf{X}}$ is a $3 \times 3$ matrix whose $(i,j)$-th entry is $\operatorname{E}[X_i X_j]$.
Properties of the autocorrelation matrix
- The autocorrelation matrix is a Hermitian matrix for complex random vectors and a symmetric matrix for real random vectors.[3]: p.190
- The autocorrelation matrix is a positive semidefinite matrix,[3]: p.190 i.e. $\mathbf{a}^{\rm T} \operatorname{R}_{\mathbf{X}\mathbf{X}} \mathbf{a} \geq 0$ for all $\mathbf{a} \in \mathbb{R}^n$ for a real random vector, and respectively $\mathbf{a}^{\rm H} \operatorname{R}_{\mathbf{Z}\mathbf{Z}} \mathbf{a} \geq 0$ for all $\mathbf{a} \in \mathbb{C}^n$ in case of a complex random vector.
- All eigenvalues of the autocorrelation matrix are real and non-negative.
- The auto-covariance matrix is related to the autocorrelation matrix as follows:
$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{E}\left[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\rm T}\right] = \operatorname{R}_{\mathbf{X}\mathbf{X}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\rm T}$$
Respectively for complex random vectors:
$$\operatorname{K}_{\mathbf{Z}\mathbf{Z}} = \operatorname{E}\left[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^{\rm H}\right] = \operatorname{R}_{\mathbf{Z}\mathbf{Z}} - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{Z}]^{\rm H}$$
These relations are illustrated in the numerical sketch below.
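A minimal NumPy sketch of these matrix definitions and properties; the mixing matrix and sample size are arbitrary assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n = 50_000, 3
A = rng.standard_normal((n, n))                       # illustrative mixing matrix
X = rng.standard_normal((n_samples, n)) @ A.T + 1.0   # correlated, nonzero-mean samples

R = (X[:, :, None] * X[:, None, :]).mean(axis=0)      # R = E[X X^T], estimated
m = X.mean(axis=0)
K = R - np.outer(m, m)                                # K = R - E[X]E[X]^T

print(np.allclose(R, R.T))                   # symmetric (real case)
print(np.linalg.eigvalsh(R).min() >= -1e-9)  # eigenvalues real and non-negative
```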
Auto-correlation of deterministic signals
In signal processing, the above definition is often used without the normalization, that is, without subtracting the mean and dividing by the variance. When the autocorrelation function is normalized by mean and variance, it is sometimes referred to as the autocorrelation coefficient[4] or autocovariance function.
Auto-correlation of continuous-time signal
Given a signal $f(t)$, the continuous autocorrelation $R_{ff}(\tau)$ is most often defined as the continuous cross-correlation integral of $f(t)$ with itself, at lag $\tau$:

$$R_{ff}(\tau) = \int_{-\infty}^{\infty} f(t+\tau)\overline{f(t)} \, dt \qquad \text{(Eq.6)}$$

where $\overline{f(t)}$ represents the complex conjugate of $f(t)$. Note that the parameter $t$ in the integral is a dummy variable and is only necessary to calculate the integral. It has no specific meaning.
Auto-correlation of discrete-time signal
The discrete autocorrelation $R$ at lag $\ell$ for a discrete-time signal $y(n)$ is

$$R_{yy}(\ell) = \sum_{n \in \mathbb{Z}} y(n)\,\overline{y(n-\ell)} \qquad \text{(Eq.7)}$$
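A minimal Python implementation of Eq.7 for a finite-support sequence (NumPy assumed; the helper name and the test sequence are arbitrary), cross-checked against `numpy.correlate`:

```python
import numpy as np

def autocorr(y: np.ndarray, ell: int) -> complex:
    """R_yy(ell) per Eq.7, treating y as zero outside its support."""
    n = len(y)
    total = 0j
    for i in range(n):
        j = i - ell
        if 0 <= j < n:
            total += y[i] * np.conj(y[j])
    return total

y = np.array([2.0, 3.0, -1.0])
print([autocorr(y, ell).real for ell in range(-2, 3)])  # [-2.0, 3.0, 14.0, 3.0, -2.0]
print(np.correlate(y, y, mode="full"))                  # same values
```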
The above definitions work for signals that are square integrable, or square summable, that is, of finite energy. Signals that "last forever" are treated instead as random processes, in which case different definitions are needed, based on expected values. For wide-sense stationary random processes, the autocorrelations are defined as

$$R_{ff}(\tau) = \operatorname{E}\left[f(t+\tau)\overline{f(t)}\right]$$
$$R_{yy}(\ell) = \operatorname{E}\left[y(n)\,\overline{y(n-\ell)}\right]$$

For processes that are not stationary, these will also be functions of $t$, or $n$.

For processes that are also ergodic, the expectation can be replaced by the limit of a time average. The autocorrelation of an ergodic process is sometimes defined as or equated to[4]

$$R_{ff}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T f(t+\tau)\overline{f(t)} \, dt$$
$$R_{yy}(\ell) = \lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} y(n)\,\overline{y(n-\ell)}$$
These definitions have the advantage that they give sensible well-defined single-parameter results for periodic functions, even when those functions are not the output of stationary ergodic processes.
Alternatively, signals that last forever can be treated by a short-time autocorrelation function analysis, using finite time integrals. (See short-time Fourier transform for a related process.)
Definition for periodic signals
If $f$ is a continuous periodic function of period $T$, the integration from $-\infty$ to $\infty$ is replaced by integration over any interval $[t_0, t_0+T]$ of length $T$:

$$R_{ff}(\tau) \triangleq \int_{t_0}^{t_0+T} f(t+\tau)\overline{f(t)} \, dt$$

which is equivalent to

$$R_{ff}(\tau) \triangleq \int_{t_0}^{t_0+T} f(t)\overline{f(t-\tau)} \, dt$$
Properties
In the following, we will describe properties of one-dimensional autocorrelations only, since most properties are easily transferred from the one-dimensional case to the multi-dimensional cases. These properties hold for wide-sense stationary processes.[5]
- A fundamental property of the autocorrelation is symmetry, $R_{ff}(\tau) = R_{ff}(-\tau)$, which is easy to prove from the definition. In the continuous case,
  - the autocorrelation is an even function $R_{ff}(-\tau) = R_{ff}(\tau)$ when $f$ is a real function, and
  - the autocorrelation is a Hermitian function $R_{ff}(-\tau) = R_{ff}^*(\tau)$ when $f$ is a complex function.
- The continuous autocorrelation function reaches its peak at the origin, where it takes a real value, i.e. for any delay $\tau$, $|R_{ff}(\tau)| \leq R_{ff}(0)$.[1]: p.410 This is a consequence of the rearrangement inequality. The same result holds in the discrete case.
- The autocorrelation of a periodic function is, itself, periodic with the same period.
- The autocorrelation of the sum of two completely uncorrelated functions (the cross-correlation is zero for all $\tau$) is the sum of the autocorrelations of each function separately.
- Since autocorrelation is a specific type of cross-correlation, it maintains all the properties of cross-correlation.
- By using the symbol $*$ to represent convolution and letting $g_{-1}$ be a function which manipulates the function $f$ and is defined as $g_{-1}(f)(t) = f(-t)$, the definition for $R_{ff}(\tau)$ may be written as:
$$R_{ff}(\tau) = \left(f * g_{-1}\left(\overline{f}\right)\right)(\tau)$$
Multi-dimensional autocorrelation
Multi-dimensional autocorrelation is defined similarly. For example, in three dimensions the autocorrelation of a square-summable discrete signal would be

$$R(j,k,\ell) = \sum_{n,q,r} x_{n,q,r}\,x_{n-j,q-k,r-\ell}.$$
When mean values are subtracted from signals before computing an autocorrelation function, the resulting function is usually called an auto-covariance function.
Efficient computation
For data expressed as a discrete sequence, it is frequently necessary to compute the autocorrelation with high computational efficiency. A brute force method based on the signal processing definition $R_{xx}(j) = \sum_n x_n\,\overline{x}_{n-j}$ can be used when the signal size is small. For example, to calculate the autocorrelation of the real signal sequence $x = (2, 3, -1)$ (i.e. $x_0 = 2$, $x_1 = 3$, $x_2 = -1$, and $x_i = 0$ for all other values of $i$) by hand, we first recognize that the definition just given is the same as the "usual" multiplication, but with right shifts, where each vertical addition gives the autocorrelation for a particular lag value.

Thus the required autocorrelation sequence is $R_{xx} = (R_{xx}(-2), R_{xx}(-1), R_{xx}(0), R_{xx}(1), R_{xx}(2)) = (-2, 3, 14, 3, -2)$, where $R_{xx}(0) = 14$, $R_{xx}(-1) = R_{xx}(1) = 3$, and $R_{xx}(-2) = R_{xx}(2) = -2$, the autocorrelation for other lag values being zero. In this calculation we do not perform the carry-over operation during addition as is usual in normal multiplication. Note that we can halve the number of operations required by exploiting the inherent symmetry of the autocorrelation. If the signal happens to be periodic, i.e. $x = (\ldots, 2, 3, -1, 2, 3, -1, \ldots)$, then we get a circular autocorrelation (similar to circular convolution) where the left and right tails of the previous autocorrelation sequence will overlap and give $R_{xx} = (\ldots, 14, 1, 1, 14, 1, 1, \ldots)$, which has the same period as the signal sequence. The procedure can be regarded as an application of the convolution property of the Z-transform of a discrete signal.
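Because this "multiplication without carries" is exactly polynomial multiplication of the sequence by its reverse, the example can be verified in a few lines of NumPy (a sketch; `np.convolve` here plays the role of the Z-transform convolution property):

```python
import numpy as np

x = np.array([2, 3, -1])
linear = np.convolve(x, x[::-1])   # autocorrelation as convolution with the reversed signal
print(linear)                      # [-2  3 14  3 -2]

# Periodic (circular) case: the tails of the linear result fold back onto the
# main lags, reproducing the circular autocorrelation (14, 1, 1) noted above.
circular = np.array([linear[2], linear[3] + linear[0], linear[4] + linear[1]])
print(circular)                    # [14  1  1]
```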
While the brute force algorithm is order $n^2$, several efficient algorithms exist which can compute the autocorrelation in order $n \log(n)$. For example, the Wiener–Khinchin theorem allows computing the autocorrelation from the raw data $X(t)$ with two fast Fourier transforms (FFT):[6][page needed]

$$F_R(f) = \operatorname{FFT}[X(t)]$$
$$S(f) = F_R(f) F_R^*(f)$$
$$R(\tau) = \operatorname{IFFT}[S(f)]$$

where IFFT denotes the inverse fast Fourier transform. The asterisk denotes complex conjugate.
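A minimal sketch of this two-FFT method (NumPy assumed); zero-padding to at least $2n-1$ points prevents the circular wrap-around, so the result matches the linear autocorrelation:

```python
import numpy as np

def autocorr_fft(x: np.ndarray) -> np.ndarray:
    """Linear autocorrelation at lags 0..n-1 via the Wiener-Khinchin route."""
    n = len(x)
    F = np.fft.fft(x, 2 * n)    # FFT of the zero-padded data
    S = F * np.conj(F)          # power spectrum of the raw data
    R = np.fft.ifft(S)[:n]      # inverse FFT; keep the non-negative lags
    return R.real if np.isrealobj(x) else R

x = np.array([2.0, 3.0, -1.0])
print(autocorr_fft(x))                      # [14.  3. -2.]
print(np.correlate(x, x, mode="full")[2:])  # brute force, same lags
```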
Alternatively, a multiple τ correlation can be performed by using brute force calculation for low τ values, and then progressively binning the X(t) data with a logarithmic density to compute higher values, resulting in the same n log(n) efficiency, but with lower memory requirements.[7][8]
Estimation
For a discrete process with known mean $\mu$ and variance $\sigma^2$ for which we observe $n$ observations $\{X_1, X_2, \ldots, X_n\}$, an estimate of the autocorrelation coefficient may be obtained as

$$\hat{R}(k) = \frac{1}{(n-k)\sigma^2} \sum_{t=1}^{n-k} (X_t - \mu)(X_{t+k} - \mu)$$

for any positive integer $k < n$. When the true mean $\mu$ and variance $\sigma^2$ are known, this estimate is unbiased. If the true mean and variance of the process are not known, there are several possibilities:
- If $\mu$ and $\sigma^2$ are replaced by the standard formulae for sample mean and sample variance, then this is a biased estimate.
- A periodogram-based estimate replaces $n-k$ in the above formula with $n$. This estimate is always biased; however, it usually has a smaller mean squared error.[9][10]
- Other possibilities derive from treating the two portions of data $\{X_1, X_2, \ldots, X_{n-k}\}$ and $\{X_{k+1}, X_{k+2}, \ldots, X_n\}$ separately and calculating separate sample means and/or sample variances for use in defining the estimate.[citation needed]

The advantage of estimates of the last type is that the set of estimated autocorrelations, as a function of $k$, then form a function which is a valid autocorrelation in the sense that it is possible to define a theoretical process having exactly that autocorrelation. Other estimates can suffer from the problem that, if they are used to calculate the variance of a linear combination of the $X$'s, the variance calculated may turn out to be negative.[11]
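A minimal sketch of the first two estimators above (NumPy assumed; white noise is used so the true autocorrelation at nonzero lags is 0):

```python
import numpy as np

def acf_known(x, k, mu, var):
    """Estimate with known mean and variance: unbiased, (n-k) normalization."""
    n = len(x)
    return np.sum((x[: n - k] - mu) * (x[k:] - mu)) / ((n - k) * var)

def acf_sample(x, k, periodogram=False):
    """Sample-moment version (biased); periodogram=True divides by n instead of n-k."""
    n = len(x)
    mu, var = x.mean(), x.var()
    denom = n if periodogram else n - k
    return np.sum((x[: n - k] - mu) * (x[k:] - mu)) / (denom * var)

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000)      # true mu = 0, sigma^2 = 1, R(k) = 0 for k > 0
for k in (1, 5):
    print(k, acf_known(x, k, 0.0, 1.0), acf_sample(x, k), acf_sample(x, k, True))
```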
Regression analysis
In regression analysis using time series data, autocorrelation in a variable of interest is typically modeled either with an autoregressive model (AR), a moving average model (MA), their combination as an autoregressive-moving-average model (ARMA), or an extension of the latter called an autoregressive integrated moving average model (ARIMA). With multiple interrelated data series, vector autoregression (VAR) or its extensions are used.
In ordinary least squares (OLS), the adequacy of a model specification can be checked in part by establishing whether there is autocorrelation of the regression residuals. Problematic autocorrelation of the errors, which themselves are unobserved, can generally be detected because it produces autocorrelation in the observable residuals. (Errors are also known as "error terms" in econometrics.) Autocorrelation of the errors violates the ordinary least squares assumption that the error terms are uncorrelated, meaning that the Gauss–Markov theorem does not apply, and that OLS estimators are no longer the Best Linear Unbiased Estimators (BLUE). While it does not bias the OLS coefficient estimates, the standard errors tend to be underestimated (and the t-scores overestimated) when the autocorrelations of the errors at low lags are positive.
The traditional test for the presence of first-order autocorrelation is the Durbin–Watson statistic or, if the explanatory variables include a lagged dependent variable, Durbin's h statistic. The Durbin–Watson statistic can, however, be linearly mapped to the Pearson correlation between values and their lags.[12] A more flexible test, covering autocorrelation of higher orders and applicable whether or not the regressors include lags of the dependent variable, is the Breusch–Godfrey test. This involves an auxiliary regression, wherein the residuals obtained from estimating the model of interest are regressed on (a) the original regressors and (b) $k$ lags of the residuals, where $k$ is the order of the test. The simplest version of the test statistic from this auxiliary regression is $TR^2$, where $T$ is the sample size and $R^2$ is the coefficient of determination. Under the null hypothesis of no autocorrelation, this statistic is asymptotically distributed as $\chi^2$ with $k$ degrees of freedom.
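Both statistics can be computed directly; the following sketch (NumPy assumed; the simulated regression with AR(1) errors is an arbitrary example, and comparison against $\chi^2$ critical values is left to a table or library of the reader's choice) implements the Durbin–Watson statistic and the $TR^2$ form of the Breusch–Godfrey test:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 500
x = rng.standard_normal(T)

# Regression y = 1 + 2x + u with AR(1) errors u_t = 0.5*u_{t-1} + eps_t.
u = np.empty(T)
u[0] = rng.standard_normal()
for t in range(1, T):
    u[t] = 0.5 * u[t - 1] + rng.standard_normal()
y = 1.0 + 2.0 * x + u

X = np.column_stack([np.ones(T), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta                                 # OLS residuals

dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)    # Durbin-Watson; ~2 under no autocorrelation

k = 2                                            # order of the Breusch-Godfrey test
lags = np.column_stack([np.concatenate([np.zeros(i), e[: T - i]]) for i in range(1, k + 1)])
Z = np.column_stack([X, lags])                   # original regressors + k lagged residuals
g, *_ = np.linalg.lstsq(Z, e, rcond=None)
r2 = 1 - np.sum((e - Z @ g) ** 2) / np.sum((e - e.mean()) ** 2)
print(dw, T * r2)                                # TR^2 is asymptotically chi^2(k) under the null
```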
Responses to nonzero autocorrelation include generalized least squares and the Newey–West HAC estimator (heteroskedasticity and autocorrelation consistent).[13]
In the estimation of a moving average model (MA), the autocorrelation function is used to determine the appropriate number of lagged error terms to be included. This is based on the fact that for an MA process of order $q$, we have $R(\tau) \neq 0$ for $\tau = 0, 1, \ldots, q$, and $R(\tau) = 0$ for $\tau > q$.
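This cutoff is easy to see in simulation; a minimal sketch (NumPy assumed, MA(2) coefficients arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
n, theta = 200_000, (0.6, 0.3)

# MA(2) process: x_t = eps_t + 0.6*eps_{t-1} + 0.3*eps_{t-2}
eps = rng.standard_normal(n + 2)
x = eps[2:] + theta[0] * eps[1:-1] + theta[1] * eps[:-2]

xc = x - x.mean()
var = np.mean(xc ** 2)
for tau in range(6):
    rho = np.mean(xc[: n - tau] * xc[tau:]) / var
    print(tau, round(rho, 3))   # clearly nonzero for tau <= 2, ~0 for tau > 2
```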
Applications
Autocorrelation's ability to find repeating patterns in data yields many applications, including:
- Autocorrelation analysis is used heavily in fluorescence correlation spectroscopy[14] to provide quantitative insight into molecular-level diffusion and chemical reactions.[15]
- Another application of autocorrelation is the measurement of optical spectra and the measurement of very-short-duration light pulses produced by lasers, both using optical autocorrelators.
- Autocorrelation is used to analyze dynamic light scattering data, which notably enables determination of the particle size distributions of nanometer-sized particles or micelles suspended in a fluid. A laser shining into the fluid produces a speckle pattern that results from the motion of the particles. Autocorrelation of the signal can be analyzed in terms of the diffusion of the particles. From this, knowing the viscosity of the fluid, the sizes of the particles can be calculated.
- Utilized in GPS to correct for the propagation delay between satellite and receiver: the receiver generates a replica of the satellite ranging code and shifts it, accommodating the Doppler shift in the incoming satellite signal, until the receiver replica signal and the satellite signal codes match up.[16]
- The small-angle X-ray scattering intensity of a nanostructured system is the Fourier transform of the spatial autocorrelation function of the electron density.
- In surface science and scanning probe microscopy, autocorrelation is used to establish a link between surface morphology and functional characteristics.[17]
- In optics, normalized autocorrelations and cross-correlations give the degree of coherence of an electromagnetic field.
- In astronomy, autocorrelation can determine the frequency of pulsars.
- In music, autocorrelation (when applied at time scales smaller than a second) is used as a pitch detection algorithm for both instrument tuners and "Auto Tune" (used as a distortion effect or to fix intonation).[18] When applied at time scales larger than a second, autocorrelation can identify the musical beat, for example to determine tempo.
- Autocorrelation in space rather than time, via the Patterson function, is used by X-ray diffractionists to help recover the "Fourier phase information" on atom positions not available through diffraction alone.
- In statistics, spatial autocorrelation between sample locations also helps one estimate mean value uncertainties when sampling a heterogeneous population.
- The SEQUEST algorithm for analyzing mass spectra makes use of autocorrelation in conjunction with cross-correlation to score the similarity of an observed spectrum to an idealized spectrum representing a peptide.
- In astrophysics, autocorrelation is used to study and characterize the spatial distribution of galaxies in the universe and in multi-wavelength observations of low mass X-ray binaries.
- In panel data, spatial autocorrelation refers to correlation of a variable with itself through space.
- In analysis of Markov chain Monte Carlo data, autocorrelation must be taken into account for correct error determination.
- In geosciences (specifically in geophysics) it can be used to compute an autocorrelation seismic attribute, out of a 3D seismic survey of the underground.
- In medical ultrasound imaging, autocorrelation is used to visualize blood flow.
- In intertemporal portfolio choice, the presence or absence of autocorrelation in an asset's rate of return can affect the optimal portion of the portfolio to hold in that asset.
- In numerical relays, autocorrelation has been used to accurately measure power system frequency.[19]
Serial dependence
Serial dependence is closely linked to the notion of autocorrelation, but represents a distinct concept (see Correlation and dependence). In particular, it is possible to have serial dependence but no (linear) correlation. In some fields however, the two terms are used as synonyms.

A time series of a random variable has serial dependence if the value at some time $t$ in the series is statistically dependent on the value at another time $s$. A series is serially independent if there is no dependence between any pair.

If a time series $\{X_t\}$ is stationary, then statistical dependence between the pair $(X_t, X_s)$ would imply that there is statistical dependence between all pairs of values at the same lag $\tau = s - t$.
See also
- Autocorrelation matrix
- Autocorrelation of a formal word
- Autocorrelation technique
- Autocorrelator
- Cochrane–Orcutt estimation (transformation for autocorrelated error terms)
- Correlation function
- Correlogram
- Cross-correlation
- CUSUM
- Fluorescence correlation spectroscopy
- Optical autocorrelation
- Partial autocorrelation function
- Phylogenetic autocorrelation (Galton's problem)
- Pitch detection algorithm
- Prais–Winsten transformation
- Scaled correlation
- Triple correlation
- Unbiased estimation of standard deviation
References
- ^ Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
- ^ Kun Il Park (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
- ^ a b c Papoulis, Athanasios (1991). Probability, Random Variables and Stochastic Processes. McGraw-Hill.
- ^ Dunn, Patrick F. (2005). Measurement and Data Analysis for Engineering and Science. New York: McGraw-Hill. ISBN 978-0-07-282538-1.
- Proakis, John G.; Salehi, Masoud (2001). Communication Systems Engineering (2nd ed.). Pearson. ISBN 978-0130617934.
- Box, G. E. P.; Jenkins, G. M.; Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control (3rd ed.). Prentice-Hall. ISBN 978-0130607744.
- Frenkel, D.; Smit, B. (2002). Understanding Molecular Simulation (2nd ed.). Academic Press. ISBN 978-0122673511.
- S2CID 7173093.
- Priestley, M. B. (1982). Spectral Analysis and Time Series. Academic Press. ISBN 978-0125649018.
- Percival, Donald B.; Walden, Andrew T. (1993). Spectral Analysis for Physical Applications: Multitaper and Conventional Univariate Techniques. Cambridge University Press. ISBN 978-0-521-43541-3.
- Percival, Donald B. (1993). "Three Curious Properties of the Sample Variance and Autocovariance for Stationary Processes with Unknown Mean". The American Statistician. 47 (4): 274–276.
- ^ "Serial correlation techniques". Statistical Ideas. 26 May 2014.
- Baum, Christopher F. (2006). An Introduction to Modern Econometrics Using Stata. Stata Press. ISBN 978-1-59718-013-9.
- PMID 22208184.
- PMID 28106203.
- Van Sickle, Jan (2008). GPS for Land Surveyors (3rd ed.). CRC Press. ISBN 978-0-8493-9195-8.
- S2CID 198468676.
- ^ Tyrangiel, Josh (2009-02-05). "Auto-Tune: Why Pop Music Sounds Perfect". Time. Archived from the original on February 10, 2009.
- ^ Kasztenny, Bogdan (March 2016). "A New Method for Fast Frequency Measurement for Protection Applications" (PDF). Schweitzer Engineering Laboratories. Archived (PDF) from the original on 2022-10-09. Retrieved 28 May 2022.
Further reading
- Kmenta, Jan (1986). Elements of Econometrics (2nd ed.). New York: Macmillan. ISBN 978-0-02-365070-3.
- Verbeek, Marno (2017). A Guide to Modern Econometrics (5th ed.). Wiley. ISBN 978-1-119-40110-0.
- Mojtaba Soltanalian, and Petre Stoica. "Computational design of sequences with good correlation properties." IEEE Transactions on Signal Processing, 60.5 (2012): 2180–2193.
- Solomon W. Golomb, and Guang Gong. Signal design for good correlation: for wireless communication, cryptography, and radar. Cambridge University Press, 2005.
- Klapetek, Petr (2018). Quantitative Data Processing in Scanning Probe Microscopy: SPM Applications for Nanometrology (Second ed.). Elsevier. pp. 108–112. ISBN 9780128133477.
- Weisstein, Eric W. "Autocorrelation". MathWorld.