In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables. Unlike the classical CLT, which requires that the random variables in question be independent and identically distributed, Lindeberg's CLT only requires that they be independent with finite variance and satisfy Lindeberg's condition. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.
Statement
Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, and $X_k : \Omega \to \mathbb{R}$, $k \in \mathbb{N}$, be independent random variables defined on that space. Assume the expected values $\mathbb{E}[X_k] = \mu_k$ and variances $\mathrm{Var}[X_k] = \sigma_k^2$ exist and are finite. Also let
$$s_n^2 := \sum_{k=1}^{n} \sigma_k^2.$$
If this sequence of independent random variables $X_k$ satisfies Lindeberg's condition:
$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \mathbb{E}\left[ (X_k - \mu_k)^2 \cdot \mathbf{1}\{ |X_k - \mu_k| > \varepsilon s_n \} \right] = 0$$
for all $\varepsilon > 0$, where $\mathbf{1}\{\ldots\}$ is the indicator function, then the central limit theorem holds, i.e. the random variables
$$Z_n := \frac{\sum_{k=1}^{n} (X_k - \mu_k)}{s_n}$$
converge in distribution to a standard normal random variable as $n \to \infty.$
Lindeberg's condition is sufficient, but not in general necessary (i.e. the converse implication does not hold in general).
However, if the sequence of independent random variables in question satisfies
$$\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0 \quad \text{as } n \to \infty,$$
then Lindeberg's condition is both sufficient and necessary, i.e. it holds if and only if the result of the central limit theorem holds.
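The statement above can be checked numerically. The following sketch uses a hypothetical setup not taken from the source: $X_k$ uniform on $[-k^{1/4}, k^{1/4}]$, so that $\mu_k = 0$ and $\sigma_k^2 = k^{1/2}/3$. It estimates the Lindeberg sum by Monte Carlo and shows it shrinking to 0 as $n$ grows (for large $n$ the truncation threshold $\varepsilon s_n$ exceeds every $|X_k|$, so the sum is exactly 0):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X_k uniform on [-k**0.25, k**0.25], so mu_k = 0
# and sigma_k^2 = k**0.5 / 3 (the variance of U[-a, a] is a^2 / 3).
# Monte Carlo estimate of the Lindeberg sum
#   L_n(eps) = (1/s_n^2) * sum_k E[(X_k - mu_k)^2 * 1{|X_k - mu_k| > eps*s_n}].
def lindeberg_sum(n, eps, n_samples=20_000):
    half_widths = np.arange(1, n + 1) ** 0.25
    sigma2 = half_widths**2 / 3.0
    s_n = np.sqrt(sigma2.sum())
    total = 0.0
    for a in half_widths:
        x = rng.uniform(-a, a, size=n_samples)
        total += np.mean(x**2 * (np.abs(x) > eps * s_n))
    return total / s_n**2

for n in (10, 100, 1000):
    print(n, lindeberg_sum(n, eps=0.1))
```

For this family, every $|X_k|$ is bounded by $n^{1/4}$ while $s_n$ grows like $n^{3/4}$, so the indicator eventually vanishes for every fixed $\varepsilon$ and $L_n(\varepsilon) \to 0$.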
Feller's theorem
Feller's theorem can be used as an alternative method to prove that Lindeberg's condition holds.[5] Letting $S_n := \sum_{k=1}^{n} X_k$ and for simplicity $\mathbb{E}[X_k] = 0$, the theorem states
- if $\forall \varepsilon > 0$, $\lim_{n \to \infty} \max_{1 \le k \le n} \mathbb{P}(|X_k| > \varepsilon s_n) = 0$, and $\frac{S_n}{s_n}$ converges weakly to a standard normal distribution as $n \to \infty$, then $X_k$ satisfies Lindeberg's condition.
This theorem can be used to disprove that the central limit theorem holds for $X_k$ by proof by contradiction: the procedure involves showing that Lindeberg's condition fails for $X_k$.
Interpretation
Because the Lindeberg condition implies $\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0$ as $n \to \infty$, it guarantees that the contribution of any individual random variable $X_k$ ($1 \le k \le n$) to the variance $s_n^2$ is arbitrarily small, for sufficiently large values of $n$.
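This consequence is easy to visualize numerically. The sketch below (with two hypothetical variance profiles, chosen only for illustration) computes the maximal variance share $\max_{k \le n} \sigma_k^2 / s_n^2$: for $\sigma_k^2 = k$ no single term dominates and the share vanishes, while for $\sigma_k^2 = 2^k$ the last term always contributes about half of $s_n^2$, so the CLT normalization cannot wash out any individual variable:

```python
import numpy as np

# Running maximal variance share max_{k<=n} sigma_k^2 / s_n^2 for n = 1, 2, ...
def max_variance_share(sigma2):
    s2 = np.cumsum(sigma2)                      # s_n^2 = sum_{k<=n} sigma_k^2
    running_max = np.maximum.accumulate(sigma2)  # max_{k<=n} sigma_k^2
    return running_max / s2

n = 50
share_linear = max_variance_share(np.arange(1.0, n + 1))  # sigma_k^2 = k
share_geom = max_variance_share(2.0 ** np.arange(1, n + 1))  # sigma_k^2 = 2^k

print(share_linear[-1])  # ~ 2/(n+1): vanishes as n grows
print(share_geom[-1])    # stays near 1/2: one variable always dominates
```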
Example
Consider the following informative example which satisfies the Lindeberg condition. Let $\xi_i$ be a sequence of zero mean, variance 1 iid random variables and $a_i$ a non-random sequence satisfying:
$$\max_{i \le n} \frac{a_i^2}{\|a\|_n^2} \to 0, \quad \text{where } \|a\|_n^2 := \sum_{i=1}^{n} a_i^2.$$
Now, define the normalized elements of the linear combination:
$$X_{n,i} = \frac{a_i \xi_i}{\|a\|_n},$$
which satisfies the Lindeberg condition:
$$\sum_{i=1}^{n} \mathbb{E}\left[ |X_{n,i}|^2 \, \mathbf{1}\{ |X_{n,i}| > \varepsilon \} \right] \le \sum_{i=1}^{n} \frac{a_i^2}{\|a\|_n^2} \, \mathbb{E}\left[ \xi_i^2 \, \mathbf{1}\left\{ |\xi_i| > \varepsilon \frac{\|a\|_n}{\max_{i \le n} |a_i|} \right\} \right] = \mathbb{E}\left[ \xi_1^2 \, \mathbf{1}\left\{ |\xi_1| > \varepsilon \frac{\|a\|_n}{\max_{i \le n} |a_i|} \right\} \right],$$
but $\mathbb{E}[\xi_1^2]$ is finite, so by the dominated convergence theorem (DCT) and the condition on the $a_i$ (which forces $\|a\|_n / \max_{i \le n} |a_i| \to \infty$), this goes to 0 for every $\varepsilon > 0$.
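The example can be simulated directly. The sketch below makes illustrative choices not fixed by the source: weights $a_i = \sqrt{i}$ (so $\max_{i \le n} a_i^2 / \|a\|_n^2 = 2/(n+1) \to 0$) and Rademacher variables $\xi_i$ (mean 0, variance 1). The normalized linear combination $\sum_i a_i \xi_i / \|a\|_n$ should then be approximately $N(0,1)$ for large $n$:

```python
import numpy as np

rng = np.random.default_rng(1)

n, n_trials = 200, 20_000
a = np.sqrt(np.arange(1, n + 1))   # illustrative weights a_i = sqrt(i)
norm_a = np.linalg.norm(a)         # ||a||_n

# Rademacher xi_i: zero mean, unit variance
xi = rng.choice([-1.0, 1.0], size=(n_trials, n))
z = (xi @ a) / norm_a              # samples of sum_i a_i xi_i / ||a||_n

print(z.mean(), z.var())           # sample moments close to 0 and 1
```

Only the first two moments are checked here; a normality test (e.g. comparing the empirical CDF to the standard normal CDF) would make the convergence in distribution more visible.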
See also
References