Rank–Nullity Theorem

Tags: lean4-proof, linear-algebra, visualization

$$\operatorname{rank}(f) + \operatorname{nullity}(f) = \dim V$$

Statement

Let $f : V \to W$ be a linear map between finite-dimensional vector spaces over a field $K$. Then:

$$\dim(\operatorname{im} f) + \dim(\ker f) = \dim V$$

Equivalently, $\operatorname{rank}(f) + \operatorname{nullity}(f) = \dim V$.

Visualization

Consider the linear map $f : \mathbb{R}^3 \to \mathbb{R}^3$ given by the matrix:

$$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 0 & 1 & 1 \end{pmatrix}$$

Step 1 — Row reduce to find the image (column space).

$$\begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 0 & 1 & 1 \end{pmatrix} \xrightarrow{R_2 \leftarrow R_2 - 2R_1} \begin{pmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \\ 0 & 1 & 1 \end{pmatrix} \xrightarrow{R_2 \leftrightarrow R_3} \begin{pmatrix} 1 & 2 & 3 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}$$

Two pivot columns $\Rightarrow \operatorname{rank}(A) = 2$.

Image basis: columns 1 and 2 of $A$, i.e., $\{(1,2,0)^T,\; (2,4,1)^T\}$.
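Step 1 can be checked mechanically. The sketch below uses exact rational arithmetic from Python's standard library (the `row_echelon` helper is ours, not a library function) to reproduce the elimination and the pivot count:

```python
from fractions import Fraction

def row_echelon(rows):
    """Gaussian elimination over the rationals; returns (echelon form, rank)."""
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    rank = 0
    for col in range(n_cols):
        # find a pivot at or below the current row
        pivot = next((r for r in range(rank, n_rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(rank + 1, n_rows):
            factor = m[r][col] / m[rank][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return m, rank

A = [[1, 2, 3],
     [2, 4, 6],
     [0, 1, 1]]

echelon, rank = row_echelon(A)
print(rank)  # 2, matching the two pivot rows found by hand
```

Running this yields the same echelon form as the hand computation, with rows $(1,2,3)$, $(0,1,1)$, $(0,0,0)$.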

Step 2 — Find the null space.

Free variable: $x_3 = t$. From the second reduced row, $x_2 + x_3 = 0$, so $x_2 = -t$; from the first, $x_1 = -2x_2 - 3x_3 = 2t - 3t = -t$.

$$\ker A = \operatorname{span}\left\{\begin{pmatrix} -1 \\ -1 \\ 1 \end{pmatrix}\right\}, \quad \operatorname{nullity}(A) = 1$$
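The parametric solution can be verified directly: every vector of the form $(-t, -t, t)$ should be mapped to zero by $A$. A minimal check (the `matvec` helper is ours):

```python
from fractions import Fraction

A = [[1, 2, 3],
     [2, 4, 6],
     [0, 1, 1]]

def matvec(M, v):
    """Matrix-vector product over the rationals."""
    return [sum(Fraction(a) * x for a, x in zip(row, v)) for row in M]

# The parametric kernel vector (-t, -t, t) must be annihilated by A
# for every t; spot-check a few rational values.
for t in (Fraction(1), Fraction(-3), Fraction(5, 2)):
    assert matvec(A, [-t, -t, t]) == [0, 0, 0]
print("A(-t, -t, t) = 0 for all tested t")
```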

Dimension table:

| Quantity | Dimension |
| --- | --- |
| $\dim(\operatorname{im} A)$ | 2 |
| $\dim(\ker A)$ | 1 |
| $\dim(\mathbb{R}^3)$ | 3 |
| rank + nullity | $2 + 1 = 3$ |
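The whole table can be recomputed end to end: row reduce, construct one kernel vector per free column by back-substitution, and compare the counts. A sketch with illustrative helper names (`rref`, `kernel_basis` are ours):

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form over Q; returns (rref, pivot column indices)."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots, r = [], 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        m[r] = [a / m[r][c] for a in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
    return m, pivots

def kernel_basis(rows):
    """One kernel vector per free column, read off the RREF."""
    m, pivots = rref(rows)
    n = len(rows[0])
    basis = []
    for free in (c for c in range(n) if c not in pivots):
        v = [Fraction(0)] * n
        v[free] = Fraction(1)
        for r, p in enumerate(pivots):
            v[p] = -m[r][free]
        basis.append(v)
    return basis

A = [[1, 2, 3], [2, 4, 6], [0, 1, 1]]
_, pivots = rref(A)
ker = kernel_basis(A)
rank, nullity, dim_V = len(pivots), len(ker), len(A[0])

# Each constructed vector really lies in ker A.
for v in ker:
    assert all(sum(Fraction(a) * x for a, x in zip(row, v)) == 0 for row in A)

print(rank, nullity, dim_V)  # 2 1 3
assert rank + nullity == dim_V
```

The single kernel vector it constructs is $(-1, -1, 1)^T$, matching Step 2.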

Proof Sketch

Choose a basis $\{k_1, \ldots, k_m\}$ for $\ker f$ and extend it to a basis $\{k_1, \ldots, k_m, v_1, \ldots, v_r\}$ of $V$. One shows that $\{f(v_1), \ldots, f(v_r)\}$ is a basis for $\operatorname{im} f$: they span $\operatorname{im} f$ (since $f(k_i) = 0$, the image of any vector of $V$ is a combination of the $f(v_j)$ alone) and are linearly independent (any relation $\sum c_j f(v_j) = 0$ implies $\sum c_j v_j \in \ker f$, so $\sum c_j v_j$ is a combination of the $k_i$; linear independence of the full basis then forces all $c_j = 0$). Hence $\dim(\operatorname{im} f) = r$ and $\dim V = m + r$.
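For the worked example the sketch becomes concrete: take $k_1 = (-1,-1,1)^T$ from Step 2 and extend with $v_1 = e_1$, $v_2 = e_2$ (this particular extension is our choice for illustration; any vectors completing a basis would do):

```python
from fractions import Fraction

A = [[1, 2, 3],
     [2, 4, 6],
     [0, 1, 1]]

def matvec(M, v):
    """Matrix-vector product over the rationals."""
    return [sum(Fraction(a) * x for a, x in zip(row, v)) for row in M]

def det3(c0, c1, c2):
    """Determinant of the 3x3 matrix whose columns are c0, c1, c2."""
    (a, d, g), (b, e, h), (c, f, i) = c0, c1, c2
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cross(u, v):
    """Cross product; nonzero iff u and v are linearly independent in R^3."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

k1 = [-1, -1, 1]   # kernel basis vector from Step 2
v1 = [1, 0, 0]     # extension vectors e1, e2 (one possible choice)
v2 = [0, 1, 0]

# {k1, v1, v2} is a basis of R^3: nonzero determinant.
assert det3(k1, v1, v2) != 0

# f(v1), f(v2) are linearly independent, hence a basis of the 2-dim image.
fv1, fv2 = matvec(A, v1), matvec(A, v2)
assert any(c != 0 for c in cross(fv1, fv2))
print([int(x) for x in fv1], [int(x) for x in fv2])  # [1, 2, 0] [2, 4, 1]
```

Note that $f(v_1)$ and $f(v_2)$ come out as the two pivot columns of $A$, consistent with the image basis found in Step 1.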

Connections

  • Cayley–Hamilton Theorem ($p_A(A) = 0$ where $p_A(\lambda) = \det(\lambda I - A)$): every square matrix satisfies its own characteristic polynomial; the minimal polynomial degree and kernel dimensions are intertwined.
  • Spectral Theorem ($A = Q \Lambda Q^T$ in the real symmetric case): every real symmetric matrix is orthogonally diagonalizable; the nullity of $A - \lambda I$ is the geometric multiplicity of eigenvalue $\lambda$.
  • Cramer's Rule ($x_i = \det(A_i)/\det(A)$): an explicit determinant formula for the solution of a linear system; it applies precisely when $\ker A = \{0\}$, i.e., $\operatorname{nullity}(A) = 0$.
  • Determinant Multiplicativity ($\det(AB) = \det(A)\,\det(B)$): the determinant of a product equals the product of determinants; $\det(A) \neq 0$ if and only if $\operatorname{nullity}(A) = 0$ if and only if $\operatorname{rank}(A) = n$.

Lean4 Proof

```lean
/-- Rank–nullity theorem: for a linear map f : V → W between finite-dimensional
    K-vector spaces, the finrank of the image plus the finrank of the kernel
    equals the finrank of the domain. -/
theorem rank_nullity {K V W : Type*}
    [DivisionRing K] [AddCommGroup V] [Module K V]
    [AddCommGroup W] [Module K W]
    [FiniteDimensional K V] (f : V →ₗ[K] W) :
    Module.finrank K (LinearMap.range f) +
    Module.finrank K (LinearMap.ker f) =
    Module.finrank K V :=
  LinearMap.finrank_range_add_finrank_ker f
```