In mathematics, Bussgang's theorem is a theorem of stochastic analysis. The theorem states that the cross-correlation between a Gaussian signal before and after it has passed through a nonlinear operation is equal to the signal's autocorrelation up to a constant. It was first published by J. J. Bussgang in 1952.[1]
Statement
Let $\left\{X(t)\right\}$ be a zero-mean stationary Gaussian random process and $\left\{Y(t)\right\} = g(X(t))$, where $g(\cdot)$ is a nonlinear amplitude distortion. If $R_X(\tau)$ is the autocorrelation function of $\left\{X(t)\right\}$, then the cross-correlation function of $\left\{X(t)\right\}$ and $\left\{Y(t)\right\}$ is

$$R_{XY}(\tau) = C R_X(\tau),$$

where $C$ is a constant that depends only on $g(\cdot)$. It can be further shown that

$$C = \frac{1}{\sigma^3 \sqrt{2\pi}} \int_{-\infty}^{\infty} u\, g(u)\, e^{-\frac{u^2}{2\sigma^2}}\, du.$$
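The statement can be checked numerically. The sketch below is a Monte Carlo check, with a hard-clipping distortion $g$ and all parameter values chosen purely for illustration (none come from the source): it samples pairs of the process at one lag, applies $g$, and compares the empirical ratio $R_{XY}(\tau)/R_X(\tau)$ with the constant $C$ computed from the integral above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distortion for illustration: hard clipping at +/-1.
def g(u):
    return np.clip(u, -1.0, 1.0)

sigma = 2.0   # standard deviation of the Gaussian process X
rho = 0.6     # normalized autocorrelation R_X(tau) / sigma^2 at some lag tau

# Bussgang constant from the stated integral:
# C = (1 / (sigma^3 sqrt(2 pi))) * int u g(u) exp(-u^2 / (2 sigma^2)) du
u = np.linspace(-8 * sigma, 8 * sigma, 160001)
du = u[1] - u[0]
C = np.sum(u * g(u) * np.exp(-(u**2) / (2 * sigma**2))) * du / (sigma**3 * np.sqrt(2 * np.pi))

# Monte Carlo: sample pairs (X(t), X(t + tau)) with covariance rho * sigma^2.
n = 2_000_000
x1 = sigma * rng.standard_normal(n)
x2 = rho * x1 + sigma * np.sqrt(1 - rho**2) * rng.standard_normal(n)

r_xy = np.mean(x1 * g(x2))   # empirical cross-correlation R_XY(tau)
r_x = rho * sigma**2         # exact autocorrelation R_X(tau)

print(r_xy / r_x, C)         # the two numbers should agree
```

Note that the empirical ratio does not depend on which lag $\tau$ was chosen, only on $g$ and $\sigma$, exactly as the theorem asserts.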
Derivation for One-bit Quantization
It is a property of the two-dimensional normal distribution that the joint density of $y_1$ and $y_2$ depends only on their covariance and is given explicitly by the expression

$$p(y_1, y_2) = \frac{1}{2\pi\sqrt{1-\rho^2}}\, e^{-\frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}},$$

where $y_1$ and $y_2$ are standard Gaussian random variables with correlation $\rho$.
Assume that $r_2 = g(y_2)$; the correlation between $y_1$ and $r_2$ is

$$\phi = E(y_1 r_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y_1\, g(y_2)\, p(y_1, y_2)\, dy_1 dy_2.$$

Since

$$\int_{-\infty}^{\infty} y_1\, p(y_1, y_2)\, dy_1 = \rho\, y_2\, \frac{1}{\sqrt{2\pi}} e^{-\frac{y_2^2}{2}},$$

the correlation $\phi$ may be simplified as

$$\phi = \rho \int_{-\infty}^{\infty} y_2\, g(y_2)\, \frac{1}{\sqrt{2\pi}} e^{-\frac{y_2^2}{2}}\, dy_2.$$

The integral above is seen to depend only on the distortion characteristic $g(\cdot)$ and is independent of $\rho$. Remembering that $\rho = R_X(\tau)$, we observe that for a given distortion characteristic $g(\cdot)$, the ratio $\phi / \rho$ is

$$K_g = \int_{-\infty}^{\infty} y\, g(y)\, \frac{1}{\sqrt{2\pi}} e^{-\frac{y^2}{2}}\, dy.$$

Therefore, the correlation can be rewritten in the form

$$\phi = K_g R_X(\tau).$$

The above equation is the mathematical expression of the stated "Bussgang's theorem". If $g(y) = \operatorname{sign}(y)$, called one-bit quantization, then

$$K_g = 2\int_0^{\infty} \frac{y}{\sqrt{2\pi}} e^{-\frac{y^2}{2}}\, dy = \sqrt{\frac{2}{\pi}}.$$

[2][3][1][4]
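This one-bit constant is easy to verify by simulation. The following minimal sketch (illustrative parameters, not from the source) correlates a standard Gaussian variable with the sign of a correlated copy and compares against $\sqrt{2/\pi}\,\rho$.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 1_000_000
rho = 0.3  # correlation of the two standard Gaussian variables

y1 = rng.standard_normal(n)
y2 = rho * y1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Cross-correlation of the Gaussian input with its one-bit quantized copy.
phi = np.mean(y1 * np.sign(y2))

# Bussgang's theorem for g = sign predicts phi = sqrt(2 / pi) * rho.
print(phi, np.sqrt(2 / np.pi) * rho)
```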
Arcsine law
If the two random variables are both distorted, i.e., $r_1 = g(y_1)$, $r_2 = g(y_2)$, the correlation of $r_1$ and $r_2$ is

$$\phi_{r_1 r_2} = E\left(g(y_1)\, g(y_2)\right) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1)\, g(y_2)\, p(y_1, y_2)\, dy_1 dy_2.$$

When $g(y) = \operatorname{sign}(y)$, the expression becomes

$$\phi_{r_1 r_2} = \frac{1}{2\pi\sqrt{1-\rho^2}} \left[ \int_0^{\infty}\!\!\int_0^{\infty} e^{-\alpha}\, dy_1 dy_2 + \int_{-\infty}^0\!\!\int_{-\infty}^0 e^{-\alpha}\, dy_1 dy_2 - \int_0^{\infty}\!\!\int_{-\infty}^0 e^{-\alpha}\, dy_1 dy_2 - \int_{-\infty}^0\!\!\int_0^{\infty} e^{-\alpha}\, dy_1 dy_2 \right],$$

where $\alpha = \frac{y_1^2 + y_2^2 - 2\rho y_1 y_2}{2(1-\rho^2)}$.
Noticing that

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} p(y_1, y_2)\, dy_1 dy_2 = \frac{1}{2\pi\sqrt{1-\rho^2}} \left[ \int_0^{\infty}\!\!\int_0^{\infty} + \int_{-\infty}^0\!\!\int_{-\infty}^0 + \int_0^{\infty}\!\!\int_{-\infty}^0 + \int_{-\infty}^0\!\!\int_0^{\infty} \right] e^{-\alpha}\, dy_1 dy_2 = 1,$$

and, by the symmetry of $\alpha$,

$$\int_0^{\infty}\!\!\int_0^{\infty} e^{-\alpha}\, dy_1 dy_2 = \int_{-\infty}^0\!\!\int_{-\infty}^0 e^{-\alpha}\, dy_1 dy_2, \qquad \int_0^{\infty}\!\!\int_{-\infty}^0 e^{-\alpha}\, dy_1 dy_2 = \int_{-\infty}^0\!\!\int_0^{\infty} e^{-\alpha}\, dy_1 dy_2,$$

we can simplify the expression of $\phi_{r_1 r_2}$ as

$$\phi_{r_1 r_2} = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_0^{\infty}\!\!\int_0^{\infty} e^{-\alpha}\, dy_1 dy_2 - 1.$$

Also, it is convenient to introduce the polar coordinates $y_1 = R\cos\theta$, $y_2 = R\sin\theta$. It is thus found that

$$\phi_{r_1 r_2} = \frac{4}{2\pi\sqrt{1-\rho^2}} \int_0^{\pi/2}\!\!\int_0^{\infty} e^{-\frac{R^2(1 - \rho\sin 2\theta)}{2(1-\rho^2)}}\, R\, dR\, d\theta - 1 = \frac{2\sqrt{1-\rho^2}}{\pi} \int_0^{\pi/2} \frac{d\theta}{1 - \rho\sin 2\theta} - 1.$$

Integration gives

$$\phi_{r_1 r_2} = \frac{2}{\pi}\arcsin\rho.$$
This is called the "arcsine law", which was first found by J. H. Van Vleck in 1943 and republished in 1966.[2][3] The arcsine law can also be proved in a simpler way by applying Price's theorem.[4][5]
The function $\frac{2}{\pi}\arcsin\rho$ can be approximated as $\frac{2}{\pi}\rho$ when $\rho$ is small.
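The arcsine law itself is also easy to confirm numerically. This sketch (parameters chosen for illustration) compares the empirical correlation of two hard limiters with $\frac{2}{\pi}\arcsin\rho$ over several values of $\rho$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Compare E[sign(y1) sign(y2)] with (2 / pi) * arcsin(rho) for several rho.
n = 1_000_000
for rho in (0.1, 0.5, 0.9):
    y1 = rng.standard_normal(n)
    y2 = rho * y1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    phi = np.mean(np.sign(y1) * np.sign(y2))
    print(rho, phi, 2 / np.pi * np.arcsin(rho))
```

For the small value $\rho = 0.1$ the output also illustrates the linear approximation $\frac{2}{\pi}\rho$ mentioned above.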
Price's Theorem
Given two jointly normal random variables $y_1$ and $y_2$ with joint probability function $p(y_1, y_2)$, we form the mean

$$I(\rho) = E\left(g(y_1, y_2)\right) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} g(y_1, y_2)\, p(y_1, y_2)\, dy_1 dy_2$$

of some function $g(y_1, y_2)$ of $(y_1, y_2)$. If $g(y_1, y_2)\, p(y_1, y_2) \to 0$ as $(y_1, y_2) \to \infty$, then

$$\frac{\partial^n I(\rho)}{\partial \rho^n} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, p(y_1, y_2)\, dy_1 dy_2 = E\left( \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n} \right).$$
Proof. The joint characteristic function of the random variables $y_1$ and $y_2$ is by definition the integral

$$\Phi(\omega_1, \omega_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} p(y_1, y_2)\, e^{j(\omega_1 y_1 + \omega_2 y_2)}\, dy_1 dy_2 = \exp\left( -\frac{\omega_1^2 + \omega_2^2 + 2\rho\omega_1\omega_2}{2} \right).$$

From the two-dimensional inversion formula of the Fourier transform, it follows that

$$p(y_1, y_2) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Phi(\omega_1, \omega_2)\, e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1 d\omega_2.$$

Therefore, plugging the expression of $p(y_1, y_2)$ into $I(\rho)$ and differentiating with respect to $\rho$, we obtain

$$\begin{aligned}
\frac{\partial^n I(\rho)}{\partial \rho^n}
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^n p(y_1, y_2)}{\partial \rho^n}\, dy_1 dy_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2) \left( \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{\partial^n \Phi(\omega_1, \omega_2)}{\partial \rho^n}\, e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2) \left( \frac{(-1)^n}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \omega_1^n \omega_2^n\, \Phi(\omega_1, \omega_2)\, e^{-j(\omega_1 y_1 + \omega_2 y_2)}\, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2) \left( \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Phi(\omega_1, \omega_2)\, \frac{\partial^{2n} e^{-j(\omega_1 y_1 + \omega_2 y_2)}}{\partial y_1^n\, \partial y_2^n}\, d\omega_1 d\omega_2 \right) dy_1 dy_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^{2n} p(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, dy_1 dy_2
\end{aligned}$$
After repeated integration by parts and using the condition at $\infty$, we obtain Price's theorem:

$$\begin{aligned}
\frac{\partial^n I(\rho)}{\partial \rho^n}
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(y_1, y_2)\, \frac{\partial^{2n} p(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, dy_1 dy_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{\partial^2 g(y_1, y_2)}{\partial y_1\, \partial y_2}\, \frac{\partial^{2n-2} p(y_1, y_2)}{\partial y_1^{n-1}\, \partial y_2^{n-1}}\, dy_1 dy_2 \\
&= \cdots \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \frac{\partial^{2n} g(y_1, y_2)}{\partial y_1^n\, \partial y_2^n}\, p(y_1, y_2)\, dy_1 dy_2
\end{aligned}$$

[4][5]
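Price's theorem can be sanity-checked numerically for $n = 1$. The sketch below uses the smooth test function $g(y_1, y_2) = y_1^2 y_2^2$, chosen here purely for illustration: since $I(\rho) = E[y_1^2 y_2^2] = 1 + 2\rho^2$, the theorem's claim $\partial I / \partial \rho = E[4 y_1 y_2] = 4\rho$ can be compared against a finite-difference estimate.

```python
import numpy as np

rng = np.random.default_rng(3)

# Check Price's theorem for n = 1 with the test function
# g(y1, y2) = y1^2 * y2^2, whose mixed derivative is d^2g/(dy1 dy2) = 4 y1 y2.
# For standard jointly normal y1, y2 with correlation rho,
# I(rho) = E[y1^2 y2^2] = 1 + 2 rho^2, so dI/drho = 4 rho.
n = 2_000_000
rho = 0.4
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
y1 = z1
y2 = rho * z1 + np.sqrt(1 - rho**2) * z2

# Left side: finite-difference estimate of dI/drho (common random numbers).
eps = 1e-3
y2_hi = (rho + eps) * z1 + np.sqrt(1 - (rho + eps) ** 2) * z2
y2_lo = (rho - eps) * z1 + np.sqrt(1 - (rho - eps) ** 2) * z2
dI_drho = (np.mean(y1**2 * y2_hi**2) - np.mean(y1**2 * y2_lo**2)) / (2 * eps)

# Right side of Price's theorem: E[d^2 g / (dy1 dy2)] = E[4 y1 y2].
rhs = np.mean(4 * y1 * y2)

print(dI_drho, rhs, 4 * rho)  # all three should be close
```

Reusing the same underlying noise $z_1, z_2$ for both perturbed values of $\rho$ keeps the finite-difference estimate from being swamped by Monte Carlo variance.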
Proof of Arcsine law by Price's Theorem
If $g(y_1, y_2) = \operatorname{sign}(y_1)\operatorname{sign}(y_2)$, then

$$\frac{\partial^2 g(y_1, y_2)}{\partial y_1\, \partial y_2} = 4\,\delta(y_1)\,\delta(y_2),$$

where $\delta(\cdot)$ is the Dirac delta function. Substituting into Price's theorem, we obtain

$$\frac{\partial I(\rho)}{\partial \rho} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} 4\,\delta(y_1)\,\delta(y_2)\, p(y_1, y_2)\, dy_1 dy_2 = 4\, p(0, 0) = \frac{2}{\pi\sqrt{1-\rho^2}}.$$

When $\rho = 0$, $I(\rho) = 0$. Thus

$$I(\rho) = \int_0^{\rho} \frac{2}{\pi\sqrt{1-{\rho'}^2}}\, d\rho' = \frac{2}{\pi}\arcsin\rho,$$

which is Van Vleck's well-known result of the arcsine law.[2][3]
Application
This theorem implies that a simplified correlator can be designed: instead of having to multiply two analog signals, the cross-correlation problem reduces to gating one signal with the sign of the other, and the arcsine law relates the result back to the underlying correlation.
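A minimal sketch of such a one-bit ("polarity coincidence") correlator, under an illustrative signal model of our choosing: only the signs of the two signals are compared, and the arcsine law is inverted to recover the correlation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative signal model: two jointly Gaussian signals with known correlation.
n = 500_000
rho_true = 0.7
x = rng.standard_normal(n)
y = rho_true * x + np.sqrt(1 - rho_true**2) * rng.standard_normal(n)

# Only the signs are compared, so no analog multiplier is needed.
phi = np.mean(np.sign(x) * np.sign(y))

# Invert phi = (2 / pi) * arcsin(rho) to recover the correlation.
rho_hat = np.sin(np.pi / 2 * phi)

print(rho_true, rho_hat)
```

The estimate pays a variance penalty relative to a full analog correlator, but each sample requires only a sign comparison, which is what makes the hardware simplification attractive.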
References
- ^ a b J. J. Bussgang, "Cross-correlation function of amplitude-distorted Gaussian signals", Res. Lab. Elec., Mass. Inst. Technol., Cambridge, MA, Tech. Rep. 216, March 1952.
- ^ a b c J. H. Van Vleck, "The Spectrum of Clipped Noise", Radio Research Laboratory Report of Harvard University, no. 51, 1943.
- ^ J. H. Van Vleck and D. Middleton, "The spectrum of clipped noise", Proceedings of the IEEE, vol. 54, no. 1, pp. 2–19, 1966.
- ^ R. Price, "A useful theorem for nonlinear devices having Gaussian inputs", IRE Transactions on Information Theory, vol. 4, no. 2, pp. 69–72, 1958.
- ^ A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 2002.