Strong Law of Large Numbers
Statement
Let $X_1, X_2, \dots$ be independent and identically distributed (i.i.d.) random variables with $\mathbb{E}|X_1| < \infty$. Then the sample mean converges almost surely to the expectation: $\frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{\text{a.s.}} \mathbb{E}[X_1]$.
"Almost surely" means the event of convergence has probability 1: for $\mathbb{P}$-almost every $\omega$, the sequence of partial means $\frac{1}{n}\sum_{i=1}^{n} X_i(\omega)$ converges.
Visualization
1000 fair coin flips ($\mathbb{P}(X_i = 1) = \mathbb{P}(X_i = 0) = \tfrac{1}{2}$, $\mathbb{E}[X_i] = 0.5$). Partial means converge to 0.5:
Mean
1.0 |*
| **
0.8 | * *
| ** **
0.6 | ****** *
| ****
0.5 |.....................*****............ ← E[X] = 0.5
| ****
0.4 | **
|
0.2 |
    +----+----+----+----+----+----+--→ n
    0   200  400  600  800  1000
Partial means stabilise around 0.5 after ~200 flips.
Numerical trace (first 10 flips: 1 0 1 0 0 1 1 0 1 1):
| n | running sum | partial mean |
|---|-------------|--------------|
| 1 | 1 | 1.000 |
| 2 | 1 | 0.500 |
| 5 | 2 | 0.400 |
| 10 | 6 | 0.600 |
| 50 | 27 | 0.540 |
| 200 | 103 | 0.515 |
| 1000 | 499 | 0.499 |
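The first four rows of the trace can be reproduced directly from the listed flips (a minimal sketch; only the first ten flips are given in the text, so only rows up to $n = 10$ are checked):

```python
# The ten flips listed above.
flips = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]

# Partial means at the table's checkpoints up to n = 10.
for n in (1, 2, 5, 10):
    running_sum = sum(flips[:n])
    mean = running_sum / n
    print(f"n={n:2d}  sum={running_sum}  mean={mean:.3f}")
# n= 1  sum=1  mean=1.000
# n= 2  sum=1  mean=0.500
# n= 5  sum=2  mean=0.400
# n=10  sum=6  mean=0.600
```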
Proof Sketch
- Reduce to the nonnegative case. Write $X_i = X_i^+ - X_i^-$ and handle each part separately.
- Etemadi's truncation. For nonnegative $X_i$, define truncated variables $Y_i = X_i \mathbf{1}_{\{X_i \le i\}}$. The contribution of the tail vanishes by integrability: $\sum_i \mathbb{P}(X_i \ne Y_i) \le \sum_i \mathbb{P}(X_1 > i) \le \mathbb{E}[X_1] < \infty$, so by Borel–Cantelli it suffices to work with the $Y_i$.
- Second moment bound on a subsequence. Apply Chebyshev's inequality to the truncated means along $n_k = \lfloor \alpha^k \rfloor$ for some $\alpha > 1$. The sum of variances converges, giving a.s. convergence along the subsequence.
- Interpolation. For $n_k \le n \le n_{k+1}$, monotonicity of the partial sums $S_n = \sum_{i \le n} Y_i$ gives $\frac{n_k}{n_{k+1}} \cdot \frac{S_{n_k}}{n_k} \le \frac{S_n}{n} \le \frac{n_{k+1}}{n_k} \cdot \frac{S_{n_{k+1}}}{n_{k+1}}$; letting $\alpha \downarrow 1$ squeezes the gap between subsequence points.
- Conclusion. The full sequence inherits the a.s. limit $\mathbb{E}[X_1]$.
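The subsequence step can be seen numerically. A minimal sketch (the choice $\alpha = 1.5$, the seed, and all names are illustrative): evaluating the sample mean only at $n_k = \lfloor \alpha^k \rfloor$ already shows it settling near the expectation, which is exactly what the Chebyshev-plus-Borel–Cantelli argument delivers before interpolation fills in the remaining $n$.

```python
import math
import random

rng = random.Random(42)
alpha = 1.5          # geometric growth rate of the subsequence
N = 20_000
flips = [rng.randint(0, 1) for _ in range(N)]

# Precompute partial sums S_0, S_1, ..., S_N once.
partial = [0]
for x in flips:
    partial.append(partial[-1] + x)

# Sample means read off only along n_k = floor(alpha ** k).
subseq_means = []
k = 1
while True:
    n_k = math.floor(alpha ** k)
    if n_k > N:
        break
    subseq_means.append(partial[n_k] / n_k)
    k += 1

print(subseq_means[-5:])
```

The last few subsequence means cluster around 0.5, even though only a sparse, geometrically spaced set of indices was inspected.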
Connections
The SLLN strengthens the Chebyshev's Inequality-based Weak Law of Large Numbers. Mathlib's proof also draws on the Monotone Convergence Theorem alongside the Borel–Cantelli strategy.
Lean4 Proof
import Mathlib.Probability.StrongLaw
/-- Strong Law of Large Numbers (almost everywhere version).
Mathlib: `ProbabilityTheory.strong_law_ae` in
`Mathlib/Probability/StrongLaw.lean`. -/
theorem slln_alias
{Ω E : Type*} [MeasurableSpace Ω] [NormedAddCommGroup E]
[NormedSpace ℝ E] [CompleteSpace E] [SecondCountableTopology E]
[BorelSpace E]
{μ : MeasureTheory.Measure Ω} [MeasureTheory.IsProbabilityMeasure μ]
(X : ℕ → Ω → E)
(hint : MeasureTheory.Integrable (X 0) μ)
(hindep : ProbabilityTheory.iIndepFun X μ)
(hident : ∀ i, MeasureTheory.IdentDistrib (X i) (X 0) μ μ) :
∀ᵐ ω ∂μ, Filter.Tendsto
(fun n => (n : ℝ)⁻¹ • ∑ i ∈ Finset.range n, X i ω)
Filter.atTop
(nhds (∫ x, X 0 x ∂μ)) :=
ProbabilityTheory.strong_law_ae X hint hindep hident
Referenced by
- Weak Convergence (Distribution) (Probability)
- Kolmogorov 0-1 Law (Probability)
- Central Limit Theorem (Probability)
- Optional Stopping Theorem (Probability)