# All Science Fair Projects

## Science Fair Project Encyclopedia for Schools!


# Law of large numbers

In a statistical context, laws of large numbers imply that the average of a random sample from a large population is likely to be close to the mean of the whole population.

In probability theory, several laws of large numbers say that the average of a sequence of random variables with a common distribution converges (in the senses given below) to their common expectation, in the limit as the size of the sequence goes to infinity. Various formulations of the law of large numbers, and their associated conditions, specify convergence in different ways.

When the random variables have a finite variance, the central limit theorem extends our understanding of the convergence of their average by describing the distribution of the standardised difference between the sum of the random variables and the expectation of this sum. Regardless of the underlying distribution of the random variables, this standardised difference converges in distribution to a standard normal random variable.
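As an illustrative sketch (not from the source), the following simulation standardises sums of uniform(0,1) draws, whose mean is 0.5 and variance is 1/12, and checks that the standardised differences have mean near 0 and standard deviation near 1, as the central limit theorem predicts:

```python
import random
import statistics

random.seed(0)
n, trials = 1000, 5000
mu, sigma2 = 0.5, 1 / 12  # mean and variance of uniform(0,1)

zs = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    # standardised difference between the sum and its expectation
    zs.append((s - n * mu) / (n * sigma2) ** 0.5)

print(round(statistics.mean(zs), 2))   # should be near 0
print(round(statistics.stdev(zs), 2))  # should be near 1
```

The particular choice of the uniform distribution is arbitrary; any distribution with finite variance would give the same limiting behaviour.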

The phrase "law of large numbers" is also sometimes used to refer to the principle that unlikely outcomes become likely when an event is repeated a large number of times. For example, the odds that you will win the lottery are very low; however, the odds that someone will win the lottery are quite good, provided that a large enough number of people purchased lottery tickets.
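A rough calculation makes this concrete. Taking the real single-ticket odds of a 6-of-49 lottery (1 in C(49,6) = 13,983,816) and a hypothetical, made-up figure for tickets sold, the chance that *someone* wins is 1 − (1 − p)^N:

```python
p = 1 / 13_983_816    # odds of one ticket winning a 6-of-49 lottery
tickets = 20_000_000  # hypothetical number of independent tickets sold

# probability that at least one ticket wins
p_someone_wins = 1 - (1 - p) ** tickets
print(f"{p_someone_wins:.3f}")
```

With these assumed figures the chance that somebody wins is roughly three in four, even though each individual ticket is almost certain to lose.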


## The weak law

The weak law of large numbers states that if X1, X2, X3, ... is an infinite sequence of random variables, all of which have the same expected value μ and are uncorrelated (i.e., the correlation between any two of them is zero), then the sample average

$\overline{X}_n=(X_1+\cdots+X_n)/n$

converges in probability to μ. Somewhat less tersely: For any positive number ε, no matter how small, we have

$\lim_{n\rightarrow\infty}\operatorname{P}\left(\left|\overline{X}_n-\mu\right|<\varepsilon\right)=1.$

Chebyshev's inequality is used to prove this result.

A consequence of the weak law of large numbers is the asymptotic equipartition property.
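The weak law can be seen numerically with a quick sketch (the choice of fair-die rolls is illustrative): estimate P(|X̄n − μ| ≥ ε) for increasing n and compare it with the Chebyshev bound σ²/(nε²) used in the proof.

```python
import random

random.seed(1)
mu, var, eps = 3.5, 35 / 12, 0.5  # mean and variance of a fair die roll

def tail_prob(n, trials=2000):
    """Estimate P(|sample average of n die rolls - mu| >= eps)."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.randint(1, 6) for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(n, tail_prob(n), "Chebyshev bound:", round(var / (n * eps**2), 4))
```

The estimated tail probability shrinks toward zero as n grows, and stays below the Chebyshev bound, which itself tends to zero, which is exactly the statement of convergence in probability.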

## The strong law

The strong law of large numbers states that if X1, X2, X3, ... is an infinite sequence of random variables that are independent and identically distributed with common expected value μ, and if  E(|Xi|) < ∞, then

$\operatorname{P}\left(\lim_{n\rightarrow\infty}\overline{X}_n=\mu\right)=1,$

i.e., the sample average converges almost surely to μ.

If we replace the finite expectation condition with a finite second moment condition,  E(Xi²) < ∞, then we obtain both almost sure convergence and convergence in mean square. In either case, these conditions also imply the consequent of the weak law of large numbers, since almost sure convergence implies convergence in probability (as, indeed, does convergence in mean square).

This law justifies the intuitive interpretation of the expected value of a random variable as the "long-term average when sampling repeatedly".
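This "long-term average" reading can be sketched on a single sample path (an illustrative fair-coin example, heads = 1, so μ = 0.5): the running average of one long sequence of flips settles near μ.

```python
import random

random.seed(2)
flips = [random.randint(0, 1) for _ in range(100_000)]  # one sample path

# running average after each flip
running = []
total = 0
for i, x in enumerate(flips, start=1):
    total += x
    running.append(total / i)

for n in (10, 1000, 100_000):
    print(n, running[n - 1])  # drifts toward 0.5 as n grows
```

Note the distinction from the weak law: the strong law is about the behaviour of a single path like this one, not just the probability of a large deviation at a fixed n.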

## A weaker law and proof

Proofs of the above weak and strong laws of large numbers are rather involved. The consequent of the slightly weaker form below is implied by the weak law above (since convergence in distribution is implied by convergence in probability), but has a simpler proof.

Theorem. Let X1, X2, X3, ... be a sequence of random variables, independent and identically distributed with common finite mean μ, and define the partial sum Sn := X1 + X2 + ... + Xn. Then,  Sn / n converges in distribution to μ.

Proof. (See [1], p. 174) By Taylor's theorem for complex functions, the characteristic function of any random variable, X, with finite mean μ, can be written as

$\varphi(t) = 1 + it\mu + o(t), \quad t \rightarrow 0.$

Then, since the characteristic function of the sum of random variables is the product of their characteristic functions, the characteristic function of  Sn / n  is

$\left[\varphi\left({t \over n}\right)\right]^n = \left[1 + i\mu{t \over n} + o\left({t \over n}\right)\right]^n \, \rightarrow \, e^{it\mu}, \quad \textrm{as} \quad n \rightarrow \infty.$

The limit  $e^{it\mu}$  is the characteristic function of the constant random variable μ, and hence by the Lévy continuity theorem,  Sn / n converges in distribution to μ. Note that the proof of the central limit theorem, which tells us more about the convergence of the average to μ (when the variance σ² is finite), follows a very similar approach.
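The limit above can be checked numerically. As a sketch, take the Exponential(1) distribution, whose characteristic function is φ(t) = 1/(1 − it) and whose mean is μ = 1, and watch [φ(t/n)]^n approach e^{itμ} as n grows:

```python
import cmath

mu, t = 1.0, 0.7  # mean of Exponential(1); an arbitrary fixed t

def phi(t):
    """Characteristic function of the Exponential(1) distribution."""
    return 1 / (1 - 1j * t)

target = cmath.exp(1j * t * mu)  # characteristic function of the constant mu
for n in (10, 100, 10_000):
    approx = phi(t / n) ** n  # characteristic function of S_n / n
    print(n, abs(approx - target))  # error shrinks as n grows
```

The error decays like O(1/n), consistent with the o(t) remainder in the Taylor expansion of φ.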

## References
