Characteristic Function


Statement

For a real-valued random variable $X$ on a probability space $(\Omega, \mathcal{F}, P)$, the characteristic function is

$$\phi_X(t) = \mathbb{E}[e^{itX}] = \int_{-\infty}^\infty e^{itx}\,dF_X(x), \qquad t \in \mathbb{R},$$

where $F_X$ is the CDF of $X$. The characteristic function:

  1. Always exists (since $|e^{itx}| = 1$).
  2. Satisfies $\phi_X(0) = 1$ and $|\phi_X(t)| \le 1$.
  3. Is uniformly continuous on $\mathbb{R}$.
  4. Uniquely determines the distribution of $X$ (inversion theorem).
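Properties 1 and 2 are easy to see empirically: the sample average of $e^{itX}$ over any real-valued sample lies in the closed unit disk and equals $1$ at $t = 0$. A quick sketch (the exponential sample and sample size are arbitrary choices; for $\text{Exponential}(1)$ the exact characteristic function is the known $1/(1-it)$):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.exponential(scale=1.0, size=200_000)  # arbitrary real-valued sample

def emp_char_fun(t, xs):
    """Empirical characteristic function: the sample average of e^{itX}."""
    return np.mean(np.exp(1j * t * xs))

# Property 2: phi(0) = 1 exactly, and |phi(t)| <= 1 for every t.
assert abs(emp_char_fun(0.0, sample) - 1.0) < 1e-12
assert all(abs(emp_char_fun(t, sample)) <= 1.0 + 1e-12
           for t in np.linspace(-10.0, 10.0, 41))

# For Exponential(1) the exact characteristic function is 1/(1 - it).
assert abs(emp_char_fun(1.0, sample) - 1 / (1 - 1j)) < 0.01
```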

For a Gaussian $X \sim \mathcal{N}(\mu, \sigma^2)$:

$$\phi_X(t) = e^{i\mu t - \sigma^2 t^2 / 2}.$$

In particular, for $\mathcal{N}(0,1)$: $\phi_X(t) = e^{-t^2/2}$.
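The Gaussian formula can be checked numerically: integrating $e^{itx}$ against the $\mathcal{N}(\mu,\sigma^2)$ density should reproduce $e^{i\mu t - \sigma^2 t^2/2}$. A minimal sketch using trapezoidal quadrature (the parameter values, grid width, and resolution are ad hoc choices):

```python
import numpy as np

MU, SIGMA = 1.0, 2.0  # arbitrary choice of Gaussian parameters

def gaussian_char_fun(t):
    """E[e^{itX}] for X ~ N(MU, SIGMA^2), via trapezoidal quadrature of the density."""
    x = np.linspace(MU - 12 * SIGMA, MU + 12 * SIGMA, 100_001)
    pdf = np.exp(-((x - MU) ** 2) / (2 * SIGMA**2)) / (SIGMA * np.sqrt(2 * np.pi))
    f = np.exp(1j * t * x) * pdf
    dx = x[1] - x[0]
    return (f[0] + f[-1]) / 2 * dx + np.sum(f[1:-1]) * dx  # trapezoid rule

for t in (0.0, 0.5, 1.0, 2.0):
    closed_form = np.exp(1j * MU * t - SIGMA**2 * t**2 / 2)
    assert abs(gaussian_char_fun(t) - closed_form) < 1e-9
```

The quadrature error is negligible here because the integrand decays to machine zero well inside the grid, so the trapezoid rule is effectively spectrally accurate.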

Visualization

$X \sim \text{Bernoulli}(1/2)$: $X = 0$ with probability $1/2$, $X = 1$ with probability $1/2$.

$$\phi_X(t) = \frac{1}{2} e^{i \cdot 0 \cdot t} + \frac{1}{2} e^{i \cdot 1 \cdot t} = \frac{1 + e^{it}}{2}.$$

Numerical table ($|\phi_X(t)|^2 = \cos^2(t/2)$):

| $t$ | $\phi_X(t)$ (exact) | $\lvert\phi_X(t)\rvert$ |
|-----|---------------------|-------------------------|
| $0$ | $1$ | $1.000$ |
| $\pi/4$ | $(1 + e^{i\pi/4})/2$ | $0.924$ |
| $\pi/2$ | $(1 + i)/2$ | $0.707$ |
| $\pi$ | $0$ | $0.000$ |
| $2\pi$ | $1$ | $1.000$ |
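The modulus column can be reproduced directly from $\phi_X(t) = (1+e^{it})/2$ (a quick numerical check of the identity $\lvert\phi_X(t)\rvert^2 = \cos^2(t/2)$ and the tabulated values):

```python
import cmath
import math

def phi_bernoulli(t):
    """phi_X(t) = (1 + e^{it})/2 for X ~ Bernoulli(1/2)."""
    return (1 + cmath.exp(1j * t)) / 2

# |phi(t)|^2 equals cos^2(t/2) at every tabulated t.
for t in (0.0, math.pi / 4, math.pi / 2, math.pi, 2 * math.pi):
    assert abs(abs(phi_bernoulli(t)) ** 2 - math.cos(t / 2) ** 2) < 1e-12

# The rounded moduli match the table.
assert round(abs(phi_bernoulli(math.pi / 4)), 3) == 0.924
assert round(abs(phi_bernoulli(math.pi / 2)), 3) == 0.707
```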

The characteristic function of the sum of two independent $\text{Bernoulli}(1/2)$ random variables $X$ and $Y$ is

$$\phi_{X+Y}(t) = \phi_X(t)^2 = \frac{(1+e^{it})^2}{4}.$$

This equals the characteristic function of $\text{Bin}(2, 1/2)$, recovering the distribution of the sum.
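Since $\phi_{X+Y} = \phi_X \phi_Y$ for independent variables, squaring the Bernoulli characteristic function should match the one computed directly from the $\text{Bin}(2,1/2)$ pmf. A short check:

```python
import cmath
from math import comb

def phi_bernoulli(t):
    """phi_X(t) = (1 + e^{it})/2 for X ~ Bernoulli(1/2)."""
    return (1 + cmath.exp(1j * t)) / 2

def phi_binomial(t, n=2, p=0.5):
    """CF of Bin(n, p) from its pmf: sum_k C(n,k) p^k (1-p)^(n-k) e^{itk}."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) * cmath.exp(1j * t * k)
               for k in range(n + 1))

for t in (0.3, 1.0, 2.5):
    assert abs(phi_bernoulli(t) ** 2 - phi_binomial(t)) < 1e-12
```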

Proof Sketch

Existence and normalization. $|\phi_X(t)| = |\mathbb{E}[e^{itX}]| \le \mathbb{E}[|e^{itX}|] = \mathbb{E}[1] = 1$. At $t=0$: $e^{i \cdot 0 \cdot X} = 1$, so $\phi_X(0) = \mathbb{E}[1] = 1$.

Uniform continuity. For any $t, s$: $|\phi_X(t) - \phi_X(s)| \le \mathbb{E}[|e^{itX} - e^{isX}|] = \mathbb{E}[|e^{i(t-s)X} - 1|]$. Since $|e^{ihX} - 1| \le 2$ and $|e^{ihX} - 1| \to 0$ pointwise as $h \to 0$, the dominated convergence theorem gives $\mathbb{E}[|e^{ihX} - 1|] \to 0$. The bound depends only on the difference $t - s$, so the continuity is uniform.

Inversion (Lévy inversion formula). Given $\phi_X$, one recovers $F_X$ via
$$F_X(b) - F_X(a) = \lim_{T\to\infty} \frac{1}{2\pi} \int_{-T}^T \frac{e^{-ita} - e^{-itb}}{it}\,\phi_X(t)\,dt$$
for all continuity points $a < b$ of $F_X$.
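The inversion formula can be tested numerically for $\mathcal{N}(0,1)$, where $\phi_X(t) = e^{-t^2/2}$: because $\phi_X$ decays rapidly, truncating the integral at a moderate $T$ already recovers $\Phi(b) - \Phi(a)$ to high accuracy. A sketch (the truncation point and grid resolution are ad hoc choices):

```python
import math
import numpy as np

def levy_inversion_normal(a, b, T=50.0, n=500_001):
    """Approximate F(b) - F(a) for N(0,1) via the truncated Levy inversion integral."""
    t = np.linspace(-T, T, n)          # symmetric grid with odd n: includes t = 0 exactly
    phi = np.exp(-t**2 / 2)            # characteristic function of N(0,1)
    with np.errstate(invalid="ignore"):
        integrand = (np.exp(-1j * t * a) - np.exp(-1j * t * b)) / (1j * t) * phi
    integrand[n // 2] = b - a          # removable singularity at t = 0: the limit is b - a
    dt = t[1] - t[0]
    return np.sum(integrand.real) * dt / (2 * math.pi)

def normal_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

approx = levy_inversion_normal(-1.0, 1.0)
assert abs(approx - (normal_cdf(1.0) - normal_cdf(-1.0))) < 1e-6
```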

Connections

The characteristic function is the Fourier transform of the distribution. The Central Limit Theorem, $\frac{\sum_{i=1}^n X_i - n\mu}{\sigma\sqrt{n}} \xrightarrow{d} \mathcal{N}(0,1)$ for standardized sums of i.i.d. finite-variance random variables, is most cleanly proved by showing $\phi_{Z_n}(t) \to e^{-t^2/2}$ pointwise: the characteristic function of $\mathcal{N}(0,1)$. The Lévy continuity theorem converts pointwise convergence of characteristic functions (to a limit continuous at $0$) into weak convergence of distributions, $\mu_n \xrightarrow{w} \mu \iff \int f\,d\mu_n \to \int f\,d\mu$ for all $f \in C_b$.
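This route to the CLT can be observed numerically: for standardized sums of $\text{Bernoulli}(1/2)$ variables, $\phi_{Z_n}(t) = e^{-itn\mu/(\sigma\sqrt{n})}\,\phi_X(t/(\sigma\sqrt{n}))^n$ approaches $e^{-t^2/2}$ as $n$ grows. A sketch (the test points $t$ and the value of $n$ are arbitrary choices):

```python
import cmath
import math

MU, SIGMA = 0.5, 0.5  # mean and standard deviation of Bernoulli(1/2)

def phi_Zn(t, n):
    """CF of Z_n = (X_1 + ... + X_n - n*MU) / (SIGMA * sqrt(n)), X_i ~ Bernoulli(1/2)."""
    s = t / (SIGMA * math.sqrt(n))
    phi_x = (1 + cmath.exp(1j * s)) / 2   # Bernoulli(1/2) characteristic function
    return cmath.exp(-1j * s * n * MU) * phi_x**n

# Pointwise convergence to the N(0,1) characteristic function e^{-t^2/2}.
for t in (0.5, 1.0, 2.0):
    assert abs(phi_Zn(t, 100_000) - cmath.exp(-t**2 / 2)) < 1e-3
```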

Lean4 Proof

```lean
import Mathlib.MeasureTheory.Measure.CharacteristicFunction
import Mathlib.MeasureTheory.Measure.MeasureSpace

open MeasureTheory Complex

/-- The characteristic function of a probability measure equals 1 at t = 0.
    Mathlib: `MeasureTheory.charFun_zero` composed with
    `IsProbabilityMeasure.measure_univ` and `Measure.real`. -/
theorem charFun_at_zero
    {E : Type*} [SeminormedAddCommGroup E] [InnerProductSpace ℝ E]
    (μ : Measure E) [IsProbabilityMeasure μ] :
    charFun μ (0 : E) = 1 := by
  simp [charFun_zero, Measure.real, IsProbabilityMeasure.measure_univ,
        ENNReal.one_toReal]
```