Cauchy–Schwarz Inequality
Statement
For any two vectors u and v in an inner product space:

|⟨u, v⟩| ≤ ‖u‖·‖v‖

Equality holds if and only if u and v are linearly dependent (one is a scalar multiple of the other).

In ℝⁿ with the dot product, this reads:

(u₁v₁ + ⋯ + uₙvₙ)² ≤ (u₁² + ⋯ + uₙ²)(v₁² + ⋯ + vₙ²)

The geometric consequence: |cos θ| ≤ 1 for the angle θ between two vectors, since cos θ = (u·v)/(‖u‖·‖v‖).
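As a quick numerical sanity check (a minimal sketch using NumPy; the sample vectors are arbitrary illustrative choices), both the inequality and the |cos θ| ≤ 1 consequence can be verified directly:

```python
import numpy as np

# Arbitrary sample vectors -- any choice works.
u = np.array([3.0, -1.0, 2.0])
v = np.array([1.0, 4.0, -2.0])

lhs = abs(np.dot(u, v))                      # |<u, v>|
rhs = np.linalg.norm(u) * np.linalg.norm(v)  # ||u|| * ||v||
assert lhs <= rhs                            # Cauchy-Schwarz

# Geometric consequence: the cosine of the angle lies in [-1, 1].
cos_theta = np.dot(u, v) / rhs
assert -1.0 <= cos_theta <= 1.0
```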
Visualization
Worked example in ℝ³ with u = (1, 2, 2) and v = (2, 1, −2):
u = (1, 2, 2), v = (2, 1, −2)

Dot product:
u·v = 1·2 + 2·1 + 2·(−2) = 2 + 2 − 4 = 0

Norms:
‖u‖ = √(1² + 2² + 2²) = √9 = 3
‖v‖ = √(2² + 1² + (−2)²) = √9 = 3
‖u‖·‖v‖ = 3·3 = 9

|u·v| = 0 ≤ 9 = ‖u‖·‖v‖ ✓ (the vectors are orthogonal)
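The arithmetic above can be reproduced in a few lines (NumPy, as a sanity check):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, -2.0])

dot = np.dot(u, v)          # 1*2 + 2*1 + 2*(-2) = 0
nu = np.linalg.norm(u)      # sqrt(1 + 4 + 4) = 3
nv = np.linalg.norm(v)      # sqrt(4 + 1 + 4) = 3

assert dot == 0.0           # the vectors are orthogonal
assert abs(dot) <= nu * nv  # 0 <= 9: Cauchy-Schwarz holds
```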
Another example: when the vectors are parallel, equality holds. Take v = c·u for a scalar c; then

|u·v| = |c|·‖u‖² = ‖u‖·‖v‖.

For instance, with u = (1, 2) and v = (2, 4) = 2·u, both sides equal 10.
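The equality case for parallel vectors can be checked numerically (v = 2·u here is an arbitrary choice of scalar multiple):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = 2.0 * u                   # v is a scalar multiple of u: parallel

lhs = abs(np.dot(u, v))       # |u.v| = 2 * ||u||^2 = 10
rhs = np.linalg.norm(u) * np.linalg.norm(v)
assert abs(lhs - rhs) < 1e-9  # equality, not just <=
```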
Proof Sketch
Fix u and v with v ≠ 0, and consider the real-valued function f(t) = ‖u + t·v‖²:

f(t) = ‖v‖²·t² + 2⟨u, v⟩·t + ‖u‖² ≥ 0

This quadratic in t is always non-negative, so its discriminant must be ≤ 0:

4⟨u, v⟩² − 4‖u‖²‖v‖² ≤ 0

Therefore ⟨u, v⟩² ≤ ‖u‖²‖v‖², i.e. |⟨u, v⟩| ≤ ‖u‖·‖v‖.
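The discriminant argument can be made concrete for sample vectors (an illustrative numeric sketch, not a proof):

```python
import numpy as np

u = np.array([2.0, -1.0, 3.0])
v = np.array([1.0, 1.0, -1.0])

# f(t) = ||u + t v||^2 = ||v||^2 t^2 + 2<u,v> t + ||u||^2
a = np.dot(v, v)
b = 2.0 * np.dot(u, v)
c = np.dot(u, u)

# f is non-negative for every t, so the discriminant is <= 0 ...
disc = b * b - 4.0 * a * c
assert disc <= 0.0

# ... which is exactly Cauchy-Schwarz: <u,v>^2 <= ||u||^2 ||v||^2
assert np.dot(u, v) ** 2 <= np.dot(u, u) * np.dot(v, v)

# Spot-check non-negativity of f at a few values of t.
for t in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert np.dot(u + t * v, u + t * v) >= 0.0
```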
Connections
Cauchy–Schwarz is a cornerstone of analysis and algebra:
- AM–GM Inequality — a one-dimensional shadow; both follow from convexity / Jensen's inequality
- Quadratic Formula — the discriminant condition in disguise
- Binomial Theorem — the cross term 2⟨u, v⟩ in the expansion of ‖u + v‖² is bounded via Cauchy–Schwarz, yielding the triangle inequality
- Geometric Series — Bessel's inequality for orthonormal expansions uses Cauchy–Schwarz
- Vieta's Formulas — in the Gram matrix approach to polynomial roots, Cauchy–Schwarz controls the error
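To illustrate the AM–GM connection: applying Cauchy–Schwarz to the vectors (√a, √b) and (√b, √a) gives 2√(ab) ≤ a + b, i.e. √(ab) ≤ (a + b)/2. A minimal numeric sketch (the helper name and test values are illustrative choices):

```python
import math

def am_gm_via_cs(a: float, b: float) -> bool:
    """Check 2*sqrt(a*b) <= a + b via Cauchy-Schwarz on
    x = (sqrt(a), sqrt(b)) and y = (sqrt(b), sqrt(a)):
    <x, y> = 2*sqrt(a*b) and ||x||*||y|| = a + b."""
    x = (math.sqrt(a), math.sqrt(b))
    y = (math.sqrt(b), math.sqrt(a))
    dot = x[0] * y[0] + x[1] * y[1]          # 2*sqrt(a*b)
    norms = math.hypot(*x) * math.hypot(*y)  # a + b
    return dot <= norms + 1e-12              # tolerance for float rounding

assert all(am_gm_via_cs(a, b) for a, b in [(1.0, 4.0), (2.0, 2.0), (0.5, 8.0)])
```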
Lean4 Proof
Mathlib provides norm_inner_le_norm in Mathlib.Analysis.InnerProductSpace.Basic, which is exactly
the Cauchy–Schwarz inequality for inner product spaces.
```lean
import Mathlib.Analysis.InnerProductSpace.Basic

/-- Cauchy–Schwarz inequality: the norm of the inner product is bounded by
the product of the norms. This is `norm_inner_le_norm` in Mathlib. -/
theorem cauchy_schwarz
    {E : Type*} [NormedAddCommGroup E] [InnerProductSpace ℝ E]
    (u v : E) : ‖(inner u v : ℝ)‖ ≤ ‖u‖ * ‖v‖ :=
  norm_inner_le_norm u v

/-- Consequence: the cosine of the angle between two vectors has absolute value ≤ 1. -/
theorem cos_angle_le_one
    {E : Type*} [NormedAddCommGroup E] [InnerProductSpace ℝ E]
    (u v : E) (hu : ‖u‖ ≠ 0) (hv : ‖v‖ ≠ 0) :
    ‖(inner u v : ℝ)‖ / (‖u‖ * ‖v‖) ≤ 1 := by
  apply div_le_one_of_le
  · exact norm_inner_le_norm u v
  · positivity

/-- The real dot-product version: (Σ uᵢvᵢ)² ≤ (Σ uᵢ²)(Σ vᵢ²).
Mathlib proves this directly as `Finset.sum_mul_sq_le_sq_mul_sq`
(the exact name and location may differ between Mathlib versions). -/
theorem cauchy_schwarz_sum (n : ℕ) (u v : Fin n → ℝ) :
    (∑ i, u i * v i) ^ 2 ≤ (∑ i, u i ^ 2) * (∑ i, v i ^ 2) :=
  Finset.sum_mul_sq_le_sq_mul_sq Finset.univ u v
```
Referenced by
- Nash Equilibrium Existence (Optimization)
- Convex Function (Optimization)
- Jensen's Inequality (Convex) (Optimization)
- Linear Programming Duality (Optimization)
- Lagrange Multipliers (Optimization)
- KKT Conditions (Optimization)
- Minkowski's Inequality (Probability)
- Hölder's Inequality (Probability)
- Jensen's Inequality (Probability)
- Lagrange's Four-Square Theorem (Number Theory)
- Polar Decomposition (Linear Algebra)
- Hadamard's Inequality (Linear Algebra)
- Channel Capacity (Information Theory)
- Hahn–Banach Theorem (Functional Analysis)
- Stone–Weierstrass Theorem (Functional Analysis)
- Uniform Boundedness Principle (Functional Analysis)
- Riesz Representation Theorem (Functional Analysis)