7.3 Optimization with inequality constraints: the sufficiency of the Kuhn-Tucker conditions
Concave objective function
 Proposition (Sufficiency of Kuhn-Tucker conditions for concave objective function)

Let f and g_{j} for j = 1, ..., m be continuously differentiable functions of many variables and let c_{j} for j = 1, ..., m be constants. Suppose that
 f is concave
 and g_{j} is quasiconvex for j = 1, ..., m.
If x* satisfies the Kuhn-Tucker conditions then x* solves the problem
max_{x}f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m.
 Proof

The fact that (x*, λ) satisfies the Kuhn-Tucker conditions means that

f'_{i}(x*) − ∑_{j=1}^{m} λ_{j}g'_{ji}(x*) = 0 for i = 1, ..., n
λ_{j} ≥ 0, g_{j}(x*) ≤ c_{j}, and λ_{j}[g_{j}(x*) − c_{j}] = 0 for j = 1, ..., m.

Let x be such that g_{j}(x) ≤ c_{j} for all j. We need to show that f(x) ≤ f(x*).
Let j be an index such that g_{j}(x*) = c_{j}. (That is, the jth constraint is binding at x*.) Then g_{j}(x) ≤ g_{j}(x*), so that by a characterization of quasiconvex functions, ∑_{i=1}^{n} g'_{ji}(x*)·(x_{i} − x*_{i}) ≤ 0. Thus given λ_{j} ≥ 0 we have λ_{j}∑_{i=1}^{n} g'_{ji}(x*)·(x_{i} − x*_{i}) ≤ 0.
Now let j be an index such that g_{j}(x*) < c_{j}. Then by the complementary slackness condition for constraint j in the Kuhn-Tucker conditions, λ_{j} = 0.
Thus for every constraint j we have
λ_{j}∑_{i=1}^{n} g'_{ji}(x*)·(x_{i} − x*_{i}) ≤ 0.
Summing over j gives
∑_{j=1}^{m} λ_{j}∑_{i=1}^{n} g'_{ji}(x*)·(x_{i} − x*_{i}) ≤ 0,
or, interchanging the order of summation,
∑_{i=1}^{n}∑_{j=1}^{m} λ_{j}g'_{ji}(x*)·(x_{i} − x*_{i}) ≤ 0.
Now multiply the ith first-order condition by x_{i} − x*_{i} and sum over i to obtain
∑_{i=1}^{n} f'_{i}(x*)·(x_{i} − x*_{i}) − ∑_{i=1}^{n}∑_{j=1}^{m} λ_{j}g'_{ji}(x*)·(x_{i} − x*_{i}) = 0.
Combining the last two displays yields
∑_{i=1}^{n} f'_{i}(x*)·(x_{i} − x*_{i}) ≤ 0.
Finally, the concavity of f implies that f(x) − f(x*) ≤ ∑_{i=1}^{n} f'_{i}(x*)·(x_{i} − x*_{i}), so that f(x) ≤ f(x*).
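The conditions used in this proof are mechanical to verify at a candidate point. The following Python sketch (the function name and argument layout are my own, not part of the text) checks stationarity, feasibility, and complementary slackness numerically:

```python
import numpy as np

def satisfies_kuhn_tucker(grad_f, grad_gs, gs, cs, x, lam, tol=1e-8):
    """Check the Kuhn-Tucker conditions at (x, lam) for
    max_x f(x) subject to g_j(x) <= c_j:
      stationarity:             f'_i(x) - sum_j lam_j g'_ji(x) = 0
      feasibility:              lam_j >= 0 and g_j(x) <= c_j
      complementary slackness:  lam_j [g_j(x) - c_j] = 0
    grad_f and each entry of grad_gs map a point to a gradient vector;
    gs are the constraint functions and cs the constants c_j."""
    x, lam = np.asarray(x, dtype=float), np.asarray(lam, dtype=float)
    stationarity = grad_f(x) - sum(l * gg(x) for l, gg in zip(lam, grad_gs))
    slack = np.array([c - g(x) for g, c in zip(gs, cs)])  # c_j - g_j(x) >= 0
    return bool(np.all(np.abs(stationarity) < tol)
                and np.all(lam >= -tol)
                and np.all(slack >= -tol)
                and np.all(np.abs(lam * slack) < tol))
```

For instance, for f(x) = −(x − 2)^{2} with the single constraint 1 − x ≤ 0, the pair (x, λ) = (2, 0) passes the check, while (1, −2) fails because λ < 0.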
 Corollary (Necessity and sufficiency of Kuhn-Tucker conditions for concave objective function)

Let f and g_{j} for j = 1, ..., m be continuously differentiable functions of many variables and let c_{j} for j = 1, ..., m be constants. Suppose that f is concave and
 either each g_{j} is linear
 or each g_{j} is convex and there exists x such that g_{j}(x) < c_{j} for j = 1, ..., m.
Then a point x* solves the problem
max_{x}f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m
if and only if it satisfies the Kuhn-Tucker conditions.
Quasiconcave objective function
The condition that the objective function be concave is a bit too strong to be useful in some economic applications. For example, the assumption we would like to impose on a consumer's utility function is that it be quasiconcave. The next result is useful in this case.

 Proposition (Sufficiency of Kuhn-Tucker conditions for quasiconcave objective function)

Let f and g_{j} for j = 1, ..., m be continuously differentiable functions of n variables and let c_{j} for j = 1, ..., m be constants. Suppose that
 f is quasiconcave
 and g_{j} is quasiconvex for j = 1, ..., m.
If (x*, λ) satisfies the Kuhn-Tucker conditions and f'_{i}(x*) ≠ 0 for i = 1, ..., n then x* solves the problem
max_{x}f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m.
 Proof

The proof of the previous result uses the concavity of f only in the last sentence. In particular, under the assumption that f is quasiconcave we can conclude that if g_{j}(x) ≤ c_{j} for all j then
∑_{i=1}^{n} f'_{i}(x*)·(x_{i} − x*_{i}) ≤ 0.
Denote the vector of partial derivatives of f by ∇f, so that the conclusion from the proof of the previous result may be written
∇f(x*)·(x − x*) ≤ 0.
Now suppose, contrary to the result, that f(x) > f(x*). Given the continuity of f, the fact that f(x) > f(x*) means that there exists t > 0 (small enough) such that f(x − t∇f(x*)) > f(x*). Then by a characterization of quasiconcave functions,
∇f(x*)·(x − t∇f(x*) − x*) ≥ 0.
But
∇f(x*)·(x − t∇f(x*) − x*) = −t∇f(x*)·∇f(x*) + ∇f(x*)·(x − x*) ≤ −t∑_{i=1}^{n} (f'_{i}(x*))^{2} < 0,
given that f'_{i}(x*) ≠ 0 for i = 1, ..., n, a contradiction. Hence f(x) ≤ f(x*).
 Corollary (Necessity and sufficiency of Kuhn-Tucker conditions for quasiconcave objective function)

Let f and g_{j} for j = 1, ..., m be continuously differentiable functions of n variables and let c_{j} for j = 1, ..., m be constants. Suppose that f is
quasiconcave and each g_{j} is linear.
If x* solves the problem
max_{x}f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m
then there exists a unique vector λ such that (x*, λ) satisfies the Kuhn-Tucker conditions, and if (x*, λ) satisfies the Kuhn-Tucker conditions and f'_{i}(x*) ≠ 0 for i = 1, ..., n then x* solves the problem.
Examples
The next two very simple examples illustrate how to use the Kuhn-Tucker conditions.

 Example

Consider the problem
max_{x}[−(x − 2)^{2}] subject to x ≥ 1, illustrated in the following figure.
Written in the standard format, this problem is
max_{x}[−(x − 2)^{2}] subject to 1 − x ≤ 0.
The objective function is concave and the constraint is linear. Thus the Kuhn-Tucker conditions are both necessary and sufficient: the set of solutions of the problem is the same as the set of solutions of the Kuhn-Tucker conditions. The Kuhn-Tucker conditions are
−2(x − 2) + λ = 0
x − 1 ≥ 0, λ ≥ 0, and λ(1 − x) = 0.
Either the constraint is binding (x = 1) or λ = 0.
 x = 1: the first condition implies 2 + λ = 0, or λ = −2, which violates λ ≥ 0.
 λ = 0: the first condition implies −2(x − 2) = 0; the only solution is x = 2, which satisfies x ≥ 1.
Thus the problem has a unique solution, x = 2.
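As a quick numerical cross-check (using scipy, which is my addition and not part of the text), maximizing −(x − 2)^{2} is the same as minimizing (x − 2)^{2}, and the constraint x ≥ 1 becomes a bound:

```python
from scipy.optimize import minimize

# max_x -(x - 2)^2 subject to x >= 1, recast as min_x (x - 2)^2 with a bound
res = minimize(lambda x: (x[0] - 2.0) ** 2, x0=[1.0], bounds=[(1.0, None)])
print(res.x[0])  # close to 2, the solution found above
```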
 Example

Consider the problem
max_{x}[−(x − 2)^{2}] subject to x ≥ 3, illustrated in the following figure.
Written in the standard format, this problem is
max_{x}[−(x − 2)^{2}] subject to 3 − x ≤ 0.
As in the previous example, the objective function is concave and the constraint function is linear, so that the set of solutions of the problem is the set of solutions of the Kuhn-Tucker conditions. The Kuhn-Tucker conditions are
−2(x − 2) + λ = 0
x − 3 ≥ 0, λ ≥ 0, and λ(3 − x) = 0.
Either the constraint is binding (x = 3) or λ = 0.
 x = 3: the first condition implies −2 + λ = 0, or λ = 2, which satisfies λ ≥ 0.
 λ = 0: the first condition implies −2(x − 2) = 0, so x = 2; since x ≥ 3 is required, this case has no solution compatible with the other conditions.
Thus the problem has a unique solution, x = 3.
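The same numerical cross-check (again with scipy, an assumption of mine rather than part of the text) confirms that the binding constraint pins down the solution:

```python
from scipy.optimize import minimize

# max_x -(x - 2)^2 subject to x >= 3; the constraint binds at the solution
res = minimize(lambda x: (x[0] - 2.0) ** 2, x0=[4.0], bounds=[(3.0, None)])
print(res.x[0])  # close to 3, the solution found above
```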
 Example

Consider the problem
max_{x_1, x_2}[−(x_{1} − 4)^{2} − (x_{2} − 4)^{2}] subject to x_{1} + x_{2} ≤ 4 and x_{1} + 3x_{2} ≤ 9.
The objective function is concave and the constraints are both linear, so the solutions of the problem are the solutions of the Kuhn-Tucker conditions.
We previously found that the Kuhn-Tucker conditions for this problem are
−2(x_{1} − 4) − λ_{1} − λ_{2} = 0
−2(x_{2} − 4) − λ_{1} − 3λ_{2} = 0
x_{1} + x_{2} ≤ 4, λ_{1} ≥ 0, and λ_{1}(x_{1} + x_{2} − 4) = 0
x_{1} + 3x_{2} ≤ 9, λ_{2} ≥ 0, and λ_{2}(x_{1} + 3x_{2} − 9) = 0.
 x_{1} + x_{2} = 4 and x_{1} + 3x_{2} = 9: In this case we have x_{1} = 3/2 and x_{2} = 5/2. Then the first two equations are
5 − λ_{1} − λ_{2} = 0
3 − λ_{1} − 3λ_{2} = 0,
which imply λ_{2} = −1, violating λ_{2} ≥ 0. So there is no solution in this case.
 x_{1} + x_{2} = 4 and x_{1} + 3x_{2} < 9, so that λ_{2} = 0: Then the first two equations imply x_{1} = x_{2} = 2 and λ_{1} = 4. All the conditions are satisfied, so (x_{1}, x_{2}, λ_{1}, λ_{2}) = (2, 2, 4, 0) is a solution.
 x_{1} + x_{2} < 4 and x_{1} + 3x_{2} = 9, so that λ_{1} = 0: Then the first two equations imply x_{1} = 33/10 and x_{2} = 19/10, violating x_{1} + x_{2} < 4.
 x_{1} + x_{2} < 4 and x_{1} + 3x_{2} < 9, so that λ_{1} = λ_{2} = 0: Then the first two equations imply x_{1} = x_{2} = 4, violating x_{1} + x_{2} < 4.
We conclude that the problem has a unique solution, (x_{1}, x_{2}) = (2, 2).
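This case analysis can be checked numerically. The sketch below (using scipy's SLSQP solver, my choice rather than anything in the text) minimizes the negated objective under the two linear inequalities:

```python
from scipy.optimize import minimize

# max -(x1-4)^2 - (x2-4)^2 s.t. x1 + x2 <= 4 and x1 + 3*x2 <= 9,
# recast as a minimization; SLSQP takes each inequality as fun(x) >= 0
objective = lambda x: (x[0] - 4.0) ** 2 + (x[1] - 4.0) ** 2
constraints = [
    {"type": "ineq", "fun": lambda x: 4.0 - x[0] - x[1]},
    {"type": "ineq", "fun": lambda x: 9.0 - x[0] - 3.0 * x[1]},
]
res = minimize(objective, x0=[0.0, 0.0], constraints=constraints, method="SLSQP")
print(res.x)  # close to (2, 2), the solution found above
```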
 Example

Consider the problem
max_{x,y} xy subject to x + y ≤ 6, x ≥ 0, and y ≥ 0.
The objective function is twice-differentiable and quasiconcave and the constraint functions are linear, so the Kuhn-Tucker conditions are necessary and if ((x*, y*), λ*) satisfies these conditions and no partial derivative of the objective function at (x*, y*) is zero then (x*, y*) solves the problem. Solutions of the Kuhn-Tucker conditions at which all derivatives of the objective function are zero may or may not be solutions of the problem; we need to check the values of the objective function at these solutions.
(Alternatively we can argue as follows. The constraint functions are linear, so the Kuhn-Tucker conditions are necessary. Further, the objective function is continuous and the constraint set is compact, so by the extreme value theorem the problem has a solution. Thus the solutions of the problem are the solutions of the first-order conditions that yield the highest value of the objective function.)
The Lagrangean is
L(x, y) = xy − λ_{1}(x + y − 6) + λ_{2}x + λ_{3}y.
The Kuhn-Tucker conditions are
y − λ_{1} + λ_{2} = 0
x − λ_{1} + λ_{3} = 0
λ_{1} ≥ 0, x + y ≤ 6, λ_{1}(x + y − 6) = 0
λ_{2} ≥ 0, x ≥ 0, λ_{2}x = 0
λ_{3} ≥ 0, y ≥ 0, λ_{3}y = 0.
 If x > 0 and y > 0 then λ_{2} = λ_{3} = 0, so that λ_{1} = x = y from the first two conditions. Hence x = y = λ_{1} = 3 from the third condition. These values satisfy all the conditions.
 If x = 0 and y > 0 then λ_{3} = 0 from the last condition and hence λ_{1} = x = 0 from the second condition. But now from the first condition λ_{2} = −y < 0, contradicting λ_{2} ≥ 0.
 If x > 0 and y = 0 then λ_{2} = 0, and a symmetric argument yields a contradiction.
 If x = y = 0 then λ_{1} = 0 from the third set of conditions, so that λ_{2} = λ_{3} = 0 from the first and second conditions. These values satisfy all the conditions.
Thus there are two solutions of the Kuhn-Tucker conditions, (x, y, λ_{1}, λ_{2}, λ_{3}) = (3, 3, 3, 0, 0) and (0, 0, 0, 0, 0). At (3, 3) both partial derivatives of the objective function are nonzero, so (3, 3) solves the problem. At (0, 0) both partial derivatives are zero, so we compare values directly: the objective function takes the value 9 at (3, 3) and 0 at (0, 0), so the unique solution of the problem is (x, y) = (3, 3).
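A numerical check is again possible (with scipy's SLSQP, my addition rather than the text's method). Note that a local solver started at the origin could stall at the Kuhn-Tucker point (0, 0), so the sketch starts elsewhere:

```python
from scipy.optimize import minimize

# max xy subject to x + y <= 6, x >= 0, y >= 0, recast as min -xy;
# start away from (0, 0), which also satisfies the Kuhn-Tucker conditions
constraints = [{"type": "ineq", "fun": lambda x: 6.0 - x[0] - x[1]}]
res = minimize(lambda x: -x[0] * x[1], x0=[1.0, 2.0],
               bounds=[(0.0, None), (0.0, None)],
               constraints=constraints, method="SLSQP")
print(res.x)  # close to (3, 3), the solution found above
```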