5.2 Local optima
One variable
Occasionally we are interested only in the local maximizers or minimizers of a function. We may be able to tell whether a stationary point is a local maximizer, a local minimizer, or neither by examining the second derivative of the function at the stationary point.
 Proposition (Second-order conditions for local optimum of function of one variable)

Let f be a twice-differentiable function of a single variable with a continuous second derivative, defined on the interval I. Suppose that f'(x*) = 0 for some x* in the
interior of I (so that x* is a stationary point of f).
 If f"(x*) < 0 then x* is a local maximizer of f.
 If x* is a local maximizer of f then f"(x*) ≤ 0.
 If f"(x*) > 0 then x* is a local minimizer of f.
 If x* is a local minimizer of f then f"(x*) ≥ 0.
 Proof

To prove the first point, note that since f" is the derivative of f', we have
f"(x*) = lim_{h→0} [f'(x* + h) − f'(x*)]/h = lim_{h→0} f'(x* + h)/h,
using f'(x*) = 0. So if f"(x*) < 0 then f'(x* + h)/h < 0 for all h ≠ 0 close enough to 0: f'(x* + h) > 0 for small h < 0 and f'(x* + h) < 0 for small h > 0. That is, f is increasing immediately to the left of x* and decreasing immediately to the right of x*, so that x* is a local maximizer of f. The proof of the third point is similar.
To prove the second point, suppose that x* is a local maximizer of f. If f"(x*) > 0 then by the third point, x* is also a local minimizer of f. But then for some ε > 0, f is constant on the interval (x* − ε, x* + ε), in which case f"(x*) = 0, contradicting f"(x*) > 0. Hence f"(x*) ≤ 0.
The proof of the fourth point is similar.
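As an informal numerical illustration of the proposition (not part of the text), the following Python sketch approximates f' and f" by central differences for the assumed example f(x) = x^3 − 3x, whose stationary points are x = −1 and x = 1:

```python
# Illustrative check of the one-variable second-order test (assumed example).
def f(x):
    return x**3 - 3*x

def dfdx(x, h=1e-5):
    # central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def d2fdx2(x, h=1e-4):
    # central-difference approximation of f"(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

for x_star in (-1.0, 1.0):
    assert abs(dfdx(x_star)) < 1e-6            # stationary point: f'(x*) = 0
    second = d2fdx2(x_star)
    kind = "local maximizer" if second < 0 else "local minimizer"
    print(x_star, kind)
```

For a cubic the central second difference is exact up to rounding error, so x = −1 is reported as a local maximizer (f"(−1) = −6 < 0) and x = 1 as a local minimizer (f"(1) = 6 > 0).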
Many variables
As for a function of a single variable, a stationary point of a function of many variables may be a local maximizer, a local minimizer, or neither, and we may be able to distinguish the cases by examining the second-order derivatives of the function at the stationary point. Let (x_{0}, y_{0}) be a stationary point of the function f of two variables. Suppose it is a local maximizer. Then certainly it must be a maximizer along the two lines through (x_{0}, y_{0}) parallel to the axes. Using the theory for functions of a single variable, we conclude that f"_{11}(x_{0}, y_{0}) ≤ 0 and f"_{22}(x_{0}, y_{0}) ≤ 0.
However, even the variant of this condition in which both inequalities are strict is not sufficient for (x_{0}, y_{0}) to be a local maximizer, as the following example shows.
 Example

Consider the function f(x, y) = 3xy − x^{2} − y^{2}. The first-order conditions are
f'_{1}(x, y) = 3y − 2x = 0
f'_{2}(x, y) = 3x − 2y = 0,
whose only solution is (x, y) = (0, 0). At this point
f"_{11}(0, 0) = −2 ≤ 0 and f"_{22}(0, 0) = −2 ≤ 0.
Nevertheless, the point (0, 0) is a col. If you walk due north or due south, you descend; if you walk due east or due west, you also descend. But if you walk northeast or southwest, you climb. The Col d'Arratille is an example.
The function is plotted in the following figure.
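This behaviour is easy to verify directly; the following Python check (illustrative only, not from the text) evaluates f near (0, 0) along the axes and along the diagonal:

```python
# Numerical check that (0, 0) is a col of f(x, y) = 3xy - x^2 - y^2:
# moving along either axis decreases f, but moving along the diagonal increases it.
def f(x, y):
    return 3*x*y - x**2 - y**2

t = 0.1
assert f(t, 0) < f(0, 0) and f(-t, 0) < f(0, 0)   # due east/west: descend
assert f(0, t) < f(0, 0) and f(0, -t) < f(0, 0)   # due north/south: descend
assert f(t, t) > f(0, 0) and f(-t, -t) > f(0, 0)  # northeast/southwest: climb
```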
The next result gives a condition that involves the definiteness of the Hessian of the function, and thus all the cross-partials. The result assumes that all the second-order partial derivatives f"_{ij} are continuous for all x in some set S, so that by Young's theorem we have f"_{ij}(x) = f"_{ji}(x) for all x ∈ S, and hence the Hessian is symmetric. (The condition on f is satisfied, for example, by any polynomial.)
 Proposition (Second-order conditions for local optimum of function of many variables)

Let f be a twice-differentiable function of n variables with continuous partial derivatives and cross-partial derivatives, defined on the set S. Suppose that
f'_{i}(x*) = 0 for i = 1, ..., n for some x* in the interior of S (so that x* is a stationary point of f). Let H be the
Hessian of f.
 If H(x*) is negative definite then x* is a local maximizer of f.
 If x* is a local maximizer of f then H(x*) is negative semidefinite.
 If H(x*) is positive definite then x* is a local minimizer of f.
 If x* is a local minimizer of f then H(x*) is positive semidefinite.
 Source
 Proofs may be found in Sydsæter (1981) (Theorem 5.11, p. 243) and Simon and Blume (1994) (Theorem 30.10, p. 836).
These conditions may be summarized as follows. At a stationary point x* in the interior of S:
 if H(x*) is negative definite then x* is a local maximizer
 if H(x*) is negative semidefinite, but neither negative definite nor positive semidefinite, then x* is not a local minimizer, but might be a local maximizer
 if H(x*) is positive definite then x* is a local minimizer
 if H(x*) is positive semidefinite, but neither positive definite nor negative semidefinite, then x* is not a local maximizer, but might be a local minimizer
 if H(x*) is neither positive semidefinite nor negative semidefinite then x* is neither a local maximizer nor a local minimizer.

For a function of two variables, denote by |H(x*)| the determinant f"_{11}(x*)f"_{22}(x*) − (f"_{12}(x*))^{2} of the Hessian at x*. A sufficient condition for a stationary point x* of a function of two variables to be a local maximizer is f"_{11}(x*) < 0 and |H(x*)| > 0 (which imply that f"_{22}(x*) < 0).
Similarly, a sufficient condition for a stationary point x* of a function of two variables to be a local minimizer is f"_{11}(x*) > 0 and |H(x*)| > 0 (which imply that f"_{22}(x*) > 0).
In particular, if, for a function of two variables, |H(x*)| < 0, then x* is neither a local maximizer nor a local minimizer. (Note that this condition is sufficient, not necessary.)
A stationary point that is neither a local maximizer nor a local minimizer is called a saddle point. Examples are the point (0, 0) for the function f(x, y) = x^{2} − y^{2} and the point (0, 0) for the function f(x, y) = x^{4} − y^{4}. In both cases, (0, 0) is a maximizer in the y direction given x = 0 and a minimizer in the x direction given y = 0; the graph of each function resembles a saddle for a horse. Note that not all saddle points look like saddles. For example, every point (0, y) is a saddle point of the function f(x, y) = x^{3}. From the results above, a sufficient, though not necessary, condition for a stationary point x* of a function f of two variables to be a saddle point is |H(x*)| < 0.
A saddle point is sometimes defined to be a stationary point at which the Hessian is indefinite. (See, for example, Mathematics for economists by Simon and Blume, p. 399.) Under this definition, (0, 0) is a saddle point of f(x, y) = x^{2} − y^{2}, but is not a saddle point of f(x, y) = x^{4} − y^{4}. The definition I give appears to be more standard.
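The determinant conditions for functions of two variables are easy to mechanize. The following Python sketch (the function name classify is my own, not from the text) applies them to a stationary point, given the three second-order partials at that point:

```python
# Two-variable second-order test at a stationary point, assuming a symmetric
# Hessian with entries f11, f12, f22 (f21 = f12 by Young's theorem).
def classify(f11, f12, f22):
    det = f11 * f22 - f12**2           # the determinant of the Hessian
    if det > 0:
        # det > 0 forces f11 != 0, and f11 and f22 then have the same sign
        return "local maximizer" if f11 < 0 else "local minimizer"
    if det < 0:
        return "saddle point"          # neither a maximizer nor a minimizer
    return "inconclusive"              # det = 0: the test is silent

# f(x, y) = 3xy - x^2 - y^2 at (0, 0): f11 = -2, f12 = 3, f22 = -2
print(classify(-2, 3, -2))             # det = 4 - 9 < 0: a saddle point
```

Note that when the determinant is zero the test says nothing: the point may be a local optimum or a saddle point, as the example f(x, y) = x^{4} − y^{4} shows.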
 Example

Consider the function f(x, y) = x^{3} + y^{3} − 3xy. The first-order conditions for an optimum are
3x^{2} − 3y = 0
3y^{2} − 3x = 0.
The first condition gives y = x^{2}; substituting into the second gives x^{4} = x, so that x = 0 or x = 1. Thus the stationary points are (0, 0) and (1, 1). The Hessian of f at any point (x, y) is
H(x, y) =
( 6x  −3 )
( −3  6y ).
At (0, 0) we have |H(0, 0)| = −9 < 0, so (0, 0) is a saddle point. At (1, 1) we have f"_{11}(1, 1) = 6 > 0 and |H(1, 1)| = 36 − 9 = 27 > 0, so (1, 1) is a local minimizer. The function is plotted in the following figure.
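As a sanity check on this example, one can evaluate the Hessian at each stationary point in Python (illustrative only; the solution of the first-order conditions is derived in the comments):

```python
# f(x, y) = x^3 + y^3 - 3xy. The first-order conditions 3x^2 = 3y and
# 3y^2 = 3x give y = x^2 and x = y^2, hence x = x^4: x = 0 or x = 1.
# So the stationary points are (0, 0) and (1, 1).
def hessian(x, y):
    # entries (f11, f12, f22) of the Hessian; f12 = f21 = -3 everywhere
    return (6*x, -3, 6*y)

for (x, y) in [(0, 0), (1, 1)]:
    f11, f12, f22 = hessian(x, y)
    det = f11 * f22 - f12**2
    print((x, y), det)
# (0, 0): det = -9 < 0, a saddle point.
# (1, 1): det = 27 > 0 with f11 = 6 > 0, a local minimizer.
```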
 Example

Consider the function f(x, y) = 8x^{3} + 2xy − 3x^{2} + y^{2} + 1. We have
f'_{1}(x, y) = 24x^{2} + 2y − 6x
f'_{2}(x, y) = 2x + 2y.
The second condition gives y = −x; substituting into the first gives 24x^{2} − 8x = 0, so that x = 0 or x = 1/3. Thus the stationary points are
(x*, y*) = (0, 0) and (x**, y**) = (1/3, −1/3).
Now look at the second-order conditions. We have
f"_{11}(x, y) = 48x − 6, f"_{22}(x, y) = 2, and f"_{12}(x, y) = 2.
Now look at each stationary point in turn:
 (x*, y*) = (0, 0)
 We have f"_{11}(0, 0) = −6 < 0 and
f"_{11}(0, 0)f"_{22}(0, 0) − (f"_{12}(0, 0))^{2} = (−6)(2) − 2^{2} = −16 < 0. So (x*, y*) = (0, 0) is neither a local maximizer nor a local minimizer (i.e. it is a saddle point).
 (x**, y**) = (1/3, −1/3)
 We have f"_{11}(1/3, −1/3) = 48(1/3) − 6 = 10 > 0 and
f"_{11}(1/3, −1/3)f"_{22}(1/3, −1/3) − (f"_{12}(1/3, −1/3))^{2} = (10)(2) − 2^{2} = 16 > 0. So (x**, y**) = (1/3, −1/3) is a local minimizer. The value of the function at this local minimizer is f(1/3, −1/3) = 23/27.
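The calculations in this example can be verified with a short Python script (an informal check, not part of the text): it confirms that the first-order conditions vanish at both stationary points and recomputes the determinant test at each one.

```python
# f(x, y) = 8x^3 + 2xy - 3x^2 + y^2 + 1 and its gradient.
def f(x, y):
    return 8*x**3 + 2*x*y - 3*x**2 + y**2 + 1

def grad(x, y):
    return (24*x**2 + 2*y - 6*x, 2*x + 2*y)

for (x, y) in [(0.0, 0.0), (1/3, -1/3)]:
    gx, gy = grad(x, y)
    assert abs(gx) < 1e-9 and abs(gy) < 1e-9     # first-order conditions hold
    f11, f12, f22 = 48*x - 6, 2, 2
    print((x, y), f11 * f22 - f12**2)            # negative at (0, 0), positive at (1/3, -1/3)
```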