Hadamard's Inequality

Statement

Let $A$ be an $n \times n$ real matrix with columns $A_1, A_2, \ldots, A_n \in \mathbb{R}^n$. Then

$$|\det A| \le \prod_{j=1}^n \|A_j\|,$$

where $\|A_j\| = \sqrt{\sum_i a_{ij}^2}$ is the Euclidean norm of column $j$.

Equality holds if and only if the columns are pairwise orthogonal (or at least one column is zero).

Visualization

Example 1 — identity matrix:

$$A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad \det A = 1, \quad \|A_1\| = 1, \quad \|A_2\| = 1.$$
$$|\det A| = 1 = 1 \cdot 1 = \|A_1\| \cdot \|A_2\|. \quad \text{Equality (orthogonal columns).}$$

Example 2 — non-orthogonal matrix:

$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad \det A = 1, \quad \|A_1\| = 1, \quad \|A_2\| = \sqrt{2}.$$
$$|\det A| = 1 < \sqrt{2} = \|A_1\| \cdot \|A_2\|. \quad \text{Strict inequality.}$$
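
Both examples can be checked in Lean by squaring both sides to avoid square roots. This is a minimal sketch assuming Mathlib is available (for `ℝ` and `norm_num`):

```lean
import Mathlib

-- Example 1: columns (1,0) and (0,1); det² equals the product of squared norms.
example : ((1 : ℝ) * 1 - 0 * 0) ^ 2 = (1 ^ 2 + 0 ^ 2) * (0 ^ 2 + 1 ^ 2) := by
  norm_num

-- Example 2: columns (1,0) and (1,1); det² = 1 is strictly below 1 · 2 = 2.
example : ((1 : ℝ) * 1 - 0 * 1) ^ 2 < (1 ^ 2 + 0 ^ 2) * (1 ^ 2 + 1 ^ 2) := by
  norm_num
```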

Geometric meaning:

| Shape | Volume / Area | Bound |
| --- | --- | --- |
| Parallelepiped spanned by orthonormal vectors | $1$ | $\prod \Vert v_i \Vert = 1$ |
| Parallelepiped with tilted vectors | $< \prod \Vert v_i \Vert$ | always $\le$ product of lengths |

The volume of a parallelepiped is maximized (for fixed side lengths) when sides are orthogonal — exactly the equality case.
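
A quick numerical illustration of the area-versus-lengths gap for the tilted parallelogram of Example 2, using plain Lean 4 `Float` (no Mathlib needed; the helper names are our own):

```lean
-- Product of the Euclidean lengths of the vectors (a,b) and (c,d).
def prodNorms (a b c d : Float) : Float :=
  Float.sqrt (a * a + b * b) * Float.sqrt (c * c + d * d)

-- Absolute value of the 2×2 determinant, i.e. the parallelogram's area.
def absDet (a b c d : Float) : Float :=
  Float.abs (a * d - b * c)

#eval absDet 1 0 1 1      -- area: 1.0
#eval prodNorms 1 0 1 1   -- length bound: √2 ≈ 1.4142
```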

Proof Sketch

  1. Orthonormal case. If the columns of $A$ are orthonormal, then $A^\top A = I$, so $\det A = \pm 1$ and $\|A_j\| = 1$ for all $j$. The inequality holds with equality: $1 \le 1$.
  2. Gram–Schmidt reduction. Apply Gram–Schmidt to the columns: write $A = QR$ (QR decomposition). Then $\det A = \det Q \cdot \det R = \pm \prod_j r_{jj}$. The diagonal entries satisfy $|r_{jj}| = \|a_j^{\perp}\| \le \|A_j\|$, where $a_j^{\perp}$ is the component of $A_j$ orthogonal to the previous columns (projection can only shorten vectors). Hence $|\det A| = \prod_j |r_{jj}| \le \prod_j \|A_j\|$.
  3. Equality. Equality in each step requires $a_j^{\perp} = A_j$, i.e., $A_j$ is already orthogonal to all previous columns.
  4. Cauchy–Schwarz link. Each step uses $\|a_j^{\perp}\| \le \|A_j\|$, a direct consequence of the Cauchy–Schwarz inequality $|\langle u, v \rangle| \le \|u\| \, \|v\|$ (a projection is never longer than the vector itself).
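
The Cauchy–Schwarz step can itself be verified in the same coordinate style as the 2×2 proofs below; a sketch assuming Mathlib's `nlinarith` tactic:

```lean
import Mathlib

-- Cauchy–Schwarz in ℝ², written out in coordinates:
-- ⟨u,v⟩² ≤ ‖u‖² ‖v‖² for u = (a,b), v = (c,d).
-- The slack is exactly (a*d - b*c)², the squared determinant.
theorem cauchy_schwarz_2d (a b c d : ℝ) :
    (a * c + b * d) ^ 2 ≤ (a ^ 2 + b ^ 2) * (c ^ 2 + d ^ 2) := by
  nlinarith [sq_nonneg (a * d - b * c)]
```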

Connections

Hadamard's inequality follows from the QR decomposition $A = QR$ (with $Q^\top Q = I$ and $R$ upper triangular, i.e., Gram–Schmidt) combined with the Cauchy–Schwarz inequality $|\langle u, v \rangle| \le \|u\| \, \|v\|$, and is equivalent to positivity of the Gram matrix $A^\top A$. It is used to bound determinants via the Cauchy–Binet formula $\det(AB) = \sum_{S \subseteq [n],\, |S| = m} \det(A_S) \det(B^S)$ for rectangular matrices, and in numerical analysis to estimate conditioning.

Lean4 Proof

```lean
import Mathlib

-- We prove the 2x2 instance of Hadamard's inequality explicitly.
-- The key algebraic fact: (ad - bc)^2 <= (a^2+b^2)(c^2+d^2),
-- which follows from the Cauchy–Schwarz inequality (a^2+b^2)(c^2+d^2) >= (ac+bd)^2
-- via the Lagrange identity (ad-bc)^2 + (ac+bd)^2 = (a^2+b^2)(c^2+d^2).
theorem hadamard_identity_2x2 (a b c d : ℝ) :
    (a * d - b * c) ^ 2 + (a * c + b * d) ^ 2 = (a ^ 2 + b ^ 2) * (c ^ 2 + d ^ 2) := by
  ring

-- Immediate consequence: the Hadamard bound for 2x2 real matrices.
theorem hadamard_bound_2x2 (a b c d : ℝ) :
    (a * d - b * c) ^ 2 ≤ (a ^ 2 + b ^ 2) * (c ^ 2 + d ^ 2) := by
  nlinarith [sq_nonneg (a * c + b * d), sq_nonneg (a * d - b * c),
             hadamard_identity_2x2 a b c d]
```
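
As a usage sketch, the bound specializes to Example 2's matrix (vectors $(1,0)$ and $(1,1)$) by direct instantiation:

```lean
-- Instantiate hadamard_bound_2x2 at a = 1, b = 0, c = 1, d = 1:
-- det² = 1 ≤ 2 = product of squared norms.
example : ((1 : ℝ) * 1 - 0 * 1) ^ 2 ≤ (1 ^ 2 + 0 ^ 2) * (1 ^ 2 + 1 ^ 2) :=
  hadamard_bound_2x2 1 0 1 1
```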