Optimization
KKT Conditions
Necessary first-order optimality conditions for constrained optimization (under a constraint qualification): stationarity of the Lagrangian, primal and dual feasibility, and complementary slackness
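A minimal numeric check of the KKT conditions on a hand-picked example (the problem, its optimum, and the multiplier below are assumptions chosen for illustration, derived by hand):

```python
# KKT check for: minimize x^2 + y^2  subject to  g(x, y) = 1 - x - y <= 0.
# Hand-derived optimum for this sketch: x* = y* = 0.5, multiplier lam = 1.

def grad_f(x, y):          # gradient of the objective x^2 + y^2
    return (2 * x, 2 * y)

def grad_g(x, y):          # gradient of the constraint 1 - x - y
    return (-1.0, -1.0)

x, y, lam = 0.5, 0.5, 1.0

# Stationarity: grad f + lam * grad g = 0
gf, gg = grad_f(x, y), grad_g(x, y)
stationarity = (gf[0] + lam * gg[0], gf[1] + lam * gg[1])

# Complementary slackness: lam * g(x, y) = 0
g_val = 1 - x - y
comp_slack = lam * g_val

print(stationarity)  # (0.0, 0.0)
print(comp_slack)    # 0.0
```

The constraint is active at the optimum, so the multiplier is positive and slackness holds because g vanishes, not because lam does.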
Convex Function
A function is convex when the chord between any two points on its graph lies on or above the graph
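A quick sketch of the chord condition for the (assumed) example f(x) = x², sampling the interpolation parameter t on a grid:

```python
# Chord test for f(x) = x^2 (convex): for every t in [0, 1],
# f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b).

def f(x):
    return x * x

a, b = -1.0, 3.0   # arbitrary endpoints chosen for the sketch
ok = all(
    f(t * a + (1 - t) * b) <= t * f(a) + (1 - t) * f(b) + 1e-12
    for t in (i / 100 for i in range(101))
)
print(ok)  # True
```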
Jensen's Inequality (Convex)
For a convex function and nonnegative weights summing to 1, the function value at the weighted average is at most the weighted average of the function values
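Jensen's inequality checked numerically for the convex function x² with example points and weights (all values below are assumptions for the sketch):

```python
# Jensen for convex f(x) = x^2: f(sum w_i x_i) <= sum w_i f(x_i),
# with nonnegative weights w_i that sum to 1.

def f(x):
    return x * x

xs = [-2.0, 0.5, 3.0]
ws = [0.2, 0.5, 0.3]          # nonnegative, sum to 1

lhs = f(sum(w * x for w, x in zip(ws, xs)))
rhs = sum(w * f(x) for w, x in zip(ws, xs))
print(lhs <= rhs)  # True
```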
Lagrange Multipliers
At a constrained extremum where the constraint gradient is nonzero, the gradient of the objective is proportional to the gradient of the constraint
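A sketch of the gradient-proportionality condition on an assumed example: maximize x + y on the unit circle, whose maximizer (derived by hand) is x = y = 1/√2:

```python
import math

# Maximize f(x, y) = x + y on the circle g(x, y) = x^2 + y^2 - 1 = 0.
# Hand-derived maximizer for this sketch: x* = y* = 1/sqrt(2).

x = y = 1 / math.sqrt(2)
grad_f = (1.0, 1.0)          # gradient of x + y
grad_g = (2 * x, 2 * y)      # gradient of x^2 + y^2 - 1

# grad f should be a scalar multiple of grad g; the component
# ratios must agree, and that scalar is the Lagrange multiplier.
lam = grad_f[0] / grad_g[0]
print(math.isclose(grad_f[1], lam * grad_g[1]))  # True
print(round(lam, 4))  # 0.7071
```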
Linear Programming Duality
Every feasible, bounded linear program has a dual whose optimal value equals the primal's (strong duality); weak duality — every dual-feasible value bounds every primal-feasible value — holds unconditionally
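A tiny primal/dual pair illustrating strong duality; the problem data and both optimal solutions are assumptions solved by hand (vertex enumeration) for this sketch:

```python
# Hand-solved primal/dual pair:
#   Primal: max 3*x1 + 2*x2   s.t. x1 + x2 <= 4,  x1 <= 2,  x >= 0
#   Dual:   min 4*y1 + 2*y2   s.t. y1 + y2 >= 3,  y1 >= 2,  y >= 0

x = (2.0, 2.0)   # primal-optimal vertex
y = (2.0, 1.0)   # dual-optimal vertex

primal_value = 3 * x[0] + 2 * x[1]
dual_value = 4 * y[0] + 2 * y[1]

# Both points are feasible, and the objective values coincide.
assert x[0] + x[1] <= 4 and x[0] <= 2 and min(x) >= 0
assert y[0] + y[1] >= 3 and y[0] >= 2 and min(y) >= 0
print(primal_value, dual_value)  # 10.0 10.0
```

Equal objective values at a feasible primal/dual pair certify that both points are optimal — that is the practical use of strong duality.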
von Neumann Minimax
For a convex-concave function on compact convex sets, the min-max and max-min values coincide
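The mixed extension of a zero-sum matrix game is bilinear, hence convex-concave on the compact probability simplices, so the minimax theorem applies. A grid-search sketch on matching pennies (game value 0):

```python
# Mixed extension of matching pennies: payoff to the row player is
# p^T A q with A = [[1, -1], [-1, 1]], bilinear in (p, q).

A = [[1, -1], [-1, 1]]

def payoff(p, q):  # row mixes (p, 1-p), column mixes (q, 1-q)
    return sum(A[i][j] * pi * qj
               for i, pi in enumerate((p, 1 - p))
               for j, qj in enumerate((q, 1 - q)))

grid = [i / 200 for i in range(201)]
maxmin = max(min(payoff(p, q) for q in grid) for p in grid)
minmax = min(max(payoff(p, q) for p in grid) for q in grid)
print(abs(maxmin - minmax) < 1e-6)  # True — both sides meet at the value 0
```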
Nash Equilibrium Existence
Every finite strategic-form game has at least one mixed Nash equilibrium
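Matching pennies has no pure equilibrium, which is why mixing matters. A sketch verifying that 50/50 mixing by both players is a Nash equilibrium: against the opponent's mix, each pure strategy earns the same expected payoff, so no unilateral deviation helps:

```python
# Row player's payoffs; the column player's are the negatives (zero-sum).
A = [[1, -1], [-1, 1]]
q = (0.5, 0.5)   # column player's mix
p = (0.5, 0.5)   # row player's mix

# Expected payoff of each pure strategy against the opponent's mix.
row_payoffs = [sum(A[i][j] * q[j] for j in range(2)) for i in range(2)]
col_payoffs = [sum(-A[i][j] * p[i] for i in range(2)) for j in range(2)]
print(row_payoffs, col_payoffs)  # [0.0, 0.0] [0.0, 0.0]
```

Since both pure strategies are equally good replies for each player, every mix — including 50/50 — is a best response, confirming the equilibrium.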
Gradient Descent Convergence
For L-smooth convex functions, gradient descent with step 1/L converges to the minimum value at rate O(1/k); a geometric (linear) rate additionally requires strong convexity
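A sketch of the classic O(1/k) guarantee f(x_k) − f* ≤ L‖x₀ − x*‖²/(2k) on an assumed quadratic example (which is also strongly convex, so convergence here is in fact faster than the bound requires):

```python
# Gradient descent with step 1/L on f(x, y) = 0.5*(x^2 + 10*y^2);
# here L = 10, the minimizer is (0, 0), and f* = 0.

L = 10.0
step = 1 / L

def f(x, y):
    return 0.5 * (x * x + 10 * y * y)

def grad(x, y):
    return (x, 10 * y)

x, y = 3.0, 2.0
r2 = x * x + y * y           # ||x_0 - x*||^2
for k in range(1, 101):
    gx, gy = grad(x, y)
    x, y = x - step * gx, y - step * gy
    # Standard smooth-convex guarantee: f(x_k) - f* <= L*r2 / (2k).
    assert f(x, y) <= L * r2 / (2 * k) + 1e-12

print(f(x, y) < 1e-6)  # True
```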