7.1 Optimization with inequality constraints: the Kuhn-Tucker conditions
Consider, for example, a consumer's choice problem. There is no reason to insist that a consumer spend all her wealth. To allow her not to spend it all, we can formulate her optimization problem with an inequality constraint:

max_{x} u(x) subject to p·x ≤ w,

where u is her utility function, p is the price vector, and w is her wealth. At a solution, either the constraint is satisfied with equality (p·x = w) or it is slack (p·x < w).
One approach to solving this problem starts by determining which of these two conditions holds at a solution. In more complex problems, with more than one constraint, this approach does not work well. Consider, for example, a consumer who faces two constraints (perhaps money and time). Three examples are shown in the following figure, which should convince you that we cannot deduce from simple properties of u alone which of the constraints, if any, are satisfied with equality at a solution.
We consider a problem of the form

max_{x} f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m.
All the problems we have studied so far may be put into this form.
 Equality constraints
 We introduce two inequality constraints for every equality constraint. For example, the problem
max_{x} f(x) subject to h(x) = 0

may be written as

max_{x} f(x) subject to h(x) ≤ 0 and −h(x) ≤ 0.
 Nonnegativity constraints
 For a problem with a constraint x_{k} ≥ 0 we let g_{j}(x) = −x_{k} and c_{j} = 0 for some j.
 Minimization problems
 For a minimization problem we multiply the objective function by −1:
min_{x} h(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m

is the same as

max_{x} f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m,

where f(x) = −h(x).
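To see these transformations concretely, the following snippet (a minimal numeric sketch of my own; the specific objective h and the function names are illustrative, not part of the text) recasts the problem min_{x} h(x) subject to x ≥ 0 in the standard maximization form and checks on a grid that both forms pick out the same point:

```python
# Recast  min h(x) subject to x >= 0  in the standard form
# max f(x) subject to g(x) <= c,  where f = -h, g(x) = -x, c = 0.

def h(x):
    return (x - 1.0) ** 2  # illustrative objective to minimize

def f(x):
    return -h(x)           # objective of the equivalent maximization

def g(x):
    return -x              # nonnegativity x >= 0 rewritten as g(x) <= 0

c = 0.0

grid = [i / 100.0 for i in range(-200, 201)]
feasible = [x for x in grid if g(x) <= c]

x_min = min(feasible, key=h)   # solve the minimization directly
x_max = max(feasible, key=f)   # solve the standard-form maximization

print(x_min, x_max)  # both are 1.0: the two problems have the same solution
```

The infeasible grid points (x < 0) are excluded in both formulations, and multiplying the objective by −1 turns the minimizer of h into the maximizer of f.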
In each panel of the figure the solution of the problem is the point x*. In the first panel the constraint binds at the solution: a change in c changes the solution. In the second panel the constraint is slack at the solution: small changes in c have no effect on the solution.
As before, for this single-constraint problem define the Lagrangean function L by

L(x) = f(x) − λ(g(x) − c).

The two cases suggest the following conditions for a solution x*:
 if g(x*) = c (as in the left-hand panel) and the constraint satisfies a regularity condition, then L'_{i}(x*) = 0 for all i
 if g(x*) < c (as in the right-hand panel), then f'_{i}(x*) = 0 for all i.
In the second case, the value of λ does not enter the conditions, so we can choose any value for it. Given the interpretation of λ as the rate at which the optimal value of the objective changes with c, setting λ = 0 makes sense: when the constraint is slack, a small change in c has no effect on the solution. Under this assumption we have f'_{i}(x) = L'_{i}(x) for all x, so that L'_{i}(x*) = 0 for all i. Thus in both cases we have L'_{i}(x*) = 0 for all i, λ ≥ 0, and g(x*) ≤ c. In the first case we have g(x*) = c and in the second case λ = 0.
We may combine the two cases by writing the conditions as

L'_{i}(x*) = 0 for i = 1, ..., n
λ ≥ 0, g(x*) ≤ c, and either λ = 0 or g(x*) − c = 0.

The product of two numbers is zero if and only if at least one of them is zero, so these conditions may equivalently be written as

L'_{i}(x*) = 0 for i = 1, ..., n
λ ≥ 0, g(x*) ≤ c, and λ[g(x*) − c] = 0.
Note that the conditions do not rule out the possibility that both λ = 0 and g(x*) = c.
The condition that either (i) λ = 0 and g(x*) ≤ c or (ii) λ ≥ 0 and g(x*) = c is called a complementary slackness condition.
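Complementary slackness leads to a simple case-by-case solution method for a single constraint: try λ = 0 and check feasibility, then try g(x) = c and check that λ ≥ 0. The snippet below walks through these cases for a small illustrative problem of my own (max −(x − 2)², subject to x ≤ 1), not one from the text:

```python
# Complementary slackness for  max -(x - 2)**2  subject to  x <= 1.
# The Lagrangean is L(x) = -(x - 2)**2 - lam*(x - 1), so stationarity
# is -2*(x - 2) - lam = 0. The condition lam*(x - 1) = 0 splits into
# two cases, which we check in turn.

def solve():
    # Case (i): lam = 0, so stationarity gives -2*(x - 2) = 0, x = 2.
    x = 2.0
    if x <= 1.0:                 # feasibility fails here: 2 > 1
        return x, 0.0
    # Case (ii): the constraint binds, g(x) = c, i.e. x = 1.
    x = 1.0
    lam = -2.0 * (x - 2.0)       # from -2*(x - 2) - lam = 0
    if lam >= 0.0:               # the multiplier must be nonnegative
        return x, lam
    return None

x_star, lam_star = solve()
print(x_star, lam_star)  # x* = 1.0 with lam = 2.0: the constraint binds
```

Here case (i) fails feasibility, so the constraint must bind, and the resulting multiplier λ = 2 ≥ 0 confirms the candidate.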
For a problem with many constraints, we introduce, as before, one multiplier for each constraint and obtain the Kuhn-Tucker conditions, defined as follows.
 Definition

Let f and g_{j} for j = 1, ..., m be differentiable functions of n variables and let c_{j} for j = 1, ..., m be numbers. Define the function L of n variables by

L(x) = f(x) − ∑_{j=1}^{m} λ_{j}(g_{j}(x) − c_{j}) for all x.

The Kuhn-Tucker conditions for the problem

max_{x} f(x) subject to g_{j}(x) ≤ c_{j} for j = 1, ..., m

are

L'_{i}(x) = 0 for i = 1, ..., n
λ_{j} ≥ 0, g_{j}(x) ≤ c_{j}, and λ_{j}[g_{j}(x) − c_{j}] = 0 for j = 1, ..., m.
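The conditions in this definition can be verified mechanically at a candidate point. The sketch below (my own code, with finite-difference derivatives standing in for the exact ones; all names are mine) checks stationarity, nonnegativity of the multipliers, feasibility, and complementary slackness:

```python
# Numerically check the Kuhn-Tucker conditions at a candidate point x
# with candidate multipliers lams, for objective f and constraints
# gs[j](x) <= cs[j]. Derivatives are approximated by central differences.

def kt_satisfied(f, gs, cs, x, lams, tol=1e-6):
    n, m = len(x), len(gs)

    def partial(fun, x, i, h=1e-5):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        return (fun(xp) - fun(xm)) / (2 * h)

    for i in range(n):
        # stationarity: L'_i(x) = f'_i(x) - sum_j lam_j * (g_j)'_i(x) = 0
        dL = partial(f, x, i) - sum(
            lams[j] * partial(gs[j], x, i) for j in range(m))
        if abs(dL) > tol:
            return False
    for j in range(m):
        if lams[j] < -tol:                           # lam_j >= 0
            return False
        if gs[j](x) > cs[j] + tol:                   # g_j(x) <= c_j
            return False
        if abs(lams[j] * (gs[j](x) - cs[j])) > tol:  # compl. slackness
            return False
    return True

# Usage: max -x^2 subject to x <= 1; the constraint is slack at x = 0,
# so (x, lam) = (0, 0) satisfies the conditions.
print(kt_satisfied(lambda x: -x[0] ** 2, [lambda x: x[0]], [1.0],
                   [0.0], [0.0]))  # True
```

A check like this confirms a candidate; it does not by itself say whether the conditions are necessary or sufficient for that problem, which is the subject of the results discussed below.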
On the following pages I discuss results that specify the precise relationship between the solutions of the Kuhn-Tucker conditions and the solutions of the problem. The following example illustrates the conditions for a specific problem.
 Example

Consider the problem
max_{x1, x2} [−(x_{1} − 4)^{2} − (x_{2} − 4)^{2}] subject to x_{1} + x_{2} ≤ 4 and x_{1} + 3x_{2} ≤ 9,

illustrated in the following figure.
We have

L(x_{1}, x_{2}) = −(x_{1} − 4)^{2} − (x_{2} − 4)^{2} − λ_{1}(x_{1} + x_{2} − 4) − λ_{2}(x_{1} + 3x_{2} − 9),

so the Kuhn-Tucker conditions are

−2(x_{1} − 4) − λ_{1} − λ_{2} = 0
−2(x_{2} − 4) − λ_{1} − 3λ_{2} = 0
x_{1} + x_{2} ≤ 4, λ_{1} ≥ 0, and λ_{1}(x_{1} + x_{2} − 4) = 0
x_{1} + 3x_{2} ≤ 9, λ_{2} ≥ 0, and λ_{2}(x_{1} + 3x_{2} − 9) = 0.
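One way to locate the point satisfying these conditions is to enumerate which constraints bind, solve the stationarity equations in each case, and discard cases that violate feasibility or λ_{j} ≥ 0. The snippet below is my own hand-worked sketch of that enumeration for this example (the closed-form values in each case come from solving the stationarity equations by hand):

```python
# Case enumeration for max -(x1-4)^2 - (x2-4)^2
# subject to x1 + x2 <= 4 and x1 + 3*x2 <= 9.
# Stationarity: 2*(4 - x1) = lam1 + lam2 and 2*(4 - x2) = lam1 + 3*lam2.

def kt_candidates():
    cands = []
    # neither binds: lam1 = lam2 = 0 gives x1 = x2 = 4
    cands.append((4.0, 4.0, 0.0, 0.0))
    # only x1 + x2 = 4 binds: lam2 = 0 forces x1 = x2, so x1 = x2 = 2
    cands.append((2.0, 2.0, 2 * (4 - 2.0), 0.0))
    # only x1 + 3*x2 = 9 binds: lam1 = 0 gives x2 = 3*x1 - 8,
    # which with x1 + 3*x2 = 9 yields x1 = 3.3, x2 = 1.9
    cands.append((3.3, 1.9, 0.0, 2 * (4 - 3.3)))
    # both bind: x1 + x2 = 4 and x1 + 3*x2 = 9 give (1.5, 2.5),
    # and stationarity then gives lam1 = 6, lam2 = -1
    cands.append((1.5, 2.5, 6.0, -1.0))
    return cands

def satisfies_kt(x1, x2, lam1, lam2, tol=1e-9):
    # remaining conditions: feasibility and nonnegative multipliers
    return (x1 + x2 <= 4 + tol and x1 + 3 * x2 <= 9 + tol
            and lam1 >= -tol and lam2 >= -tol)

sol = [c for c in kt_candidates() if satisfies_kt(*c)]
print(sol)  # [(2.0, 2.0, 4.0, 0.0)]
```

Only the case in which the first constraint alone binds survives: the conditions are satisfied at (x_{1}, x_{2}) = (2, 2) with λ_{1} = 4 and λ_{2} = 0. The unconstrained maximizer (4, 4) is infeasible, the second-constraint-only case violates the first constraint, and the both-bind case yields a negative multiplier.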