# Channel Capacity

## Statement

The channel capacity of a discrete memoryless channel is

$$C = \max_{p(x)} I(X;Y),$$

where the maximum is over all input distributions $p(x)$ and $I(X;Y)$ is the mutual information. For the binary symmetric channel (BSC) with crossover probability $p$:

$$C = 1 - H_b(p)$$

(measured in bits, with the binary entropy $H_b(p) = -p \log_2 p - (1-p) \log_2 (1-p)$ in bits).

Boundary values: $C(0) = 1$ (perfect channel), $C(1/2) = 0$ (pure noise).
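For example, at $p = 0.1$:

$$H_b(0.1) = -0.1 \log_2 0.1 - 0.9 \log_2 0.9 \approx 0.332 + 0.137 = 0.469, \qquad C \approx 1 - 0.469 = 0.531 \text{ bits},$$

matching the table below.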
## Visualization

BSC capacity as a function of crossover probability $p$:

| $p$ | $H_b(p)$ (bits) | $C = 1 - H_b(p)$ (bits) |
|---|---|---|
| 0.00 | 0.000 | 1.000 |
| 0.05 | 0.286 | 0.714 |
| 0.10 | 0.469 | 0.531 |
| 0.15 | 0.610 | 0.390 |
| 0.25 | 0.811 | 0.189 |
| 0.50 | 1.000 | 0.000 |
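These values are easy to reproduce numerically. A minimal sketch in Lean 4 using `Float` (the helpers `h2` and `bscCapacity` are ours, not from Mathlib):

```lean
-- Binary entropy in bits, with the convention 0 · log 0 = 0.
def h2 (p : Float) : Float :=
  if p == 0.0 || p == 1.0 then 0.0
  else -(p * Float.log p + (1 - p) * Float.log (1 - p)) / Float.log 2.0

-- BSC capacity C(p) = 1 - H_b(p) in bits.
def bscCapacity (p : Float) : Float := 1.0 - h2 p

#eval bscCapacity 0.05  -- ≈ 0.714
#eval bscCapacity 0.25  -- ≈ 0.189
#eval bscCapacity 0.50  -- ≈ 0.0
```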
ASCII sketch of $C(p) = 1 - H_b(p)$:

```
 C
 1 *
   |*
   | *
   |  *
   |   *
   |    *
   |     *
 0 +-------------------> p
   0      1/4      1/2
```
The BSC capacity is symmetric about $p = 1/2$: $C(p) = C(1-p)$. The capacity-achieving input distribution is uniform ($P(X=0) = P(X=1) = 1/2$).
## Proof Sketch

- For the BSC, $Y = X \oplus Z$ where $Z \sim \mathrm{Bernoulli}(p)$ is independent of $X$.
- $I(X;Y) = H(Y) - H(Y \mid X) = H(Y) - H_b(p)$, since $H(Y \mid X) = H(Z) = H_b(p)$ (written out below).
- $H(Y) \le 1$ with equality when $Y$ is uniform, achieved by the uniform input.
- Therefore $C = \max I(X;Y) = 1 - H_b(p)$.
- At $p = 0$: $H_b(0) = 0$, so $C = 1$. At $p = 1/2$: $H_b(1/2) = 1$, so $C = 0$.
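Writing out the conditional-entropy step: given $X = x$, the output $Y$ is flipped with probability $p$ regardless of $x$, so

$$H(Y \mid X) = \sum_x P(X = x)\, H(Y \mid X = x) = \sum_x P(X = x)\, H_b(p) = H_b(p).$$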
## Connections

Channel capacity is defined via Mutual Information and is saturated by the Gaussian-channel variant of the Cauchy–Schwarz Inequality. The noiseless channel ($p = 0$) achieves the limit established by the Source Coding Theorem.
## Lean4 Proof

```lean
import Mathlib.Analysis.SpecialFunctions.BinaryEntropy

open Real

/-- BSC capacity at crossover p=0: no noise means C=1. -/
theorem bsc_capacity_zero : 1 - binEntropy (0 : ℝ) = 1 := by
  simp [binEntropy_zero]

/-- BSC capacity at crossover p=1/2: maximum noise means C=0.
Note: Mathlib's `binEntropy` uses the natural log, so `binEntropy 2⁻¹ = log 2`
(that is, 1 bit; dividing through by `log 2` gives C = 0 in bits). -/
theorem bsc_capacity_half : 1 - binEntropy (2 : ℝ)⁻¹ = 1 - log 2 := by
  simp [binEntropy_two_inv]

/-- BSC capacity at p=1 equals BSC capacity at p=0 by symmetry of binEntropy. -/
theorem bsc_capacity_symm (p : ℝ) : binEntropy p = binEntropy (1 - p) :=
  (binEntropy_one_sub p).symm
```
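Continuing the file above, the symmetry lemma gives the remaining boundary value $C(1) = 1$. A sketch using only the lemmas already cited (tactic details may need adjusting to your Mathlib version):

```lean
/-- BSC capacity at p=1: a channel that always flips is still perfect. -/
theorem bsc_capacity_one : 1 - binEntropy (1 : ℝ) = 1 := by
  have h := binEntropy_one_sub (0 : ℝ)  -- binEntropy (1 - 0) = binEntropy 0
  norm_num [binEntropy_zero] at h       -- h : binEntropy 1 = 0
  rw [h]; norm_num
```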
## Referenced by

- Fano's Inequality (Information Theory)
- Shannon Entropy (Information Theory)
- Mutual Information (Information Theory)