Consider, for example, a consumer's choice problem. There is no reason to insist that a consumer spend all her wealth. To allow her not to spend it all, we can formulate her optimization problem with inequality constraints:

max_x u(x) subject to p·x ≤ w and x ≥ 0,

where u is her utility function, p is the vector of prices, and w is her wealth.
One approach to solving this problem starts by determining which of these two conditions holds at a solution. In more complex problems, with more than one constraint, this approach does not work well. Consider, for example, a consumer who faces two constraints (perhaps money and time). Three examples are shown in the following figure, which should convince you that we cannot deduce from simple properties of u alone which of the constraints, if any, are satisfied with equality at a solution.
We consider a problem of the form

max_x f(x) subject to gj(x) ≤ cj for j = 1, ..., m.
All the problems we have studied so far may be put into this form.
- Equality constraints
- We introduce two inequality constraints for every equality constraint. For example, the problem
max_x f(x) subject to h(x) = 0

may be written as

max_x f(x) subject to h(x) ≤ 0 and −h(x) ≤ 0.
- Nonnegativity constraints
- For a problem with a constraint xk ≥ 0 we let gj(x) = −xk and cj = 0 for some j.
- Minimization problems
- For a minimization problem we multiply the objective function by −1:
min_x h(x) subject to gj(x) ≤ cj for j = 1, ..., m

is the same as

max_x f(x) subject to gj(x) ≤ cj for j = 1, ..., m,

where f(x) = −h(x).
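The three transformations above can be checked numerically. The following is an illustrative sketch (the toy objective and the brute-force grid search are my own, not from the text): a minimization problem is solved by maximizing the negated objective, and an equality constraint is imposed as the pair of inequalities h(x) ≤ 0 and −h(x) ≤ 0.

```python
# Sketch: verify the transformations to the canonical form
#   max_x f(x) subject to g_j(x) <= c_j
# by brute-force search on a 1-D grid. Toy problem, illustrative only.

def solve_max(f, constraints, grid):
    """Return the grid point maximizing f among points with g(x) <= c for all (g, c)."""
    feasible = [x for x in grid if all(g(x) <= c for g, c in constraints)]
    return max(feasible, key=f)

grid = [i / 100 for i in range(-300, 301)]  # points -3.00, -2.99, ..., 3.00

# Minimization: min (x - 1)^2 s.t. x <= 0.5 becomes max -(x - 1)^2 s.t. x <= 0.5.
h = lambda x: (x - 1) ** 2
x_star = solve_max(lambda x: -h(x), [(lambda x: x, 0.5)], grid)
print(x_star)  # 0.5 -- the constraint binds

# Equality constraint: x = 2 written as the two inequalities x <= 2 and -x <= -2.
x_eq = solve_max(lambda x: -h(x), [(lambda x: x, 2.0), (lambda x: -x, -2.0)], grid)
print(x_eq)  # 2.0 -- the only feasible point
```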
In each figure the solution of the problem is the point x*. In the first figure the constraint binds at the solution: a change in c changes the solution. In the second figure, the constraint is slack at the solution: small changes in c have no effect on the solution.
As before, define the Lagrangean function L by

L(x) = f(x) − λ(g(x) − c).

If x* solves the problem, then
- if g(x*) = c (as in the left-hand panel) and the constraint satisfies a regularity condition, then L'i(x*) = 0 for all i;
- if g(x*) < c (as in the right-hand panel), then f'i(x*) = 0 for all i.
In the second case, the value of λ does not enter the conditions, so we can choose any value for it. Given the interpretation of λ, setting λ = 0 makes sense. Under this assumption we have f'i(x) = L'i(x) for all x, so that L'i(x*) = 0 for all i. Thus in both cases we have L'i(x*) = 0 for all i, λ ≥ 0, and g(x*) ≤ c. In the first case we have g(x*) = c and in the second case λ = 0.
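The two cases can be traced through a one-variable example. The following sketch (my own toy problem, not from the text) solves max f(x) = −(x − 2)² subject to x ≤ c by case analysis: when c is small the constraint binds and λ > 0; when c is large the constraint is slack and λ = 0. In both cases the combined conditions hold.

```python
# Toy illustration of the two cases for max f(x) = -(x - 2)^2 subject to x <= c,
# with Lagrangean L(x) = f(x) - lam*(x - c).

def solve(c):
    """Return (x*, lam) satisfying the conditions, by case analysis."""
    x_unconstrained = 2.0              # solves f'(x) = -2(x - 2) = 0
    if x_unconstrained <= c:           # constraint slack: set lam = 0, f'(x*) = 0
        return x_unconstrained, 0.0
    # constraint binds: x* = c, and L'(x*) = -2(c - 2) - lam = 0 gives lam
    return c, -2.0 * (c - 2.0)

for c in (1.0, 3.0):
    x, lam = solve(c)
    # the combined conditions: lam >= 0, g(x*) <= c, and lam*(g(x*) - c) = 0
    assert lam >= 0 and x <= c and lam * (x - c) == 0
    print(c, x, lam)  # c=1: binds, lam=2; c=3: slack, lam=0
```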
We may combine the two cases by writing the conditions as
L'i(x*) = 0 for i = 1, ..., n
λ ≥ 0, g(x*) ≤ c, and either λ = 0 or g(x*) − c = 0.

Because the product of two numbers is zero if and only if at least one of them is zero, these conditions are equivalent to

L'i(x*) = 0 for i = 1, ..., n
λ ≥ 0, g(x*) ≤ c, and λ[g(x*) − c] = 0.
Note that the conditions do not rule out the possibility that both λ = 0 and g(x*) = c.
The condition that either (i) λ = 0 and g(x*) ≤ c or (ii) λ ≥ 0 and g(x*) = c is called a complementary slackness condition.
For a problem with many constraints, we introduce, as before, one multiplier for each constraint and obtain the Kuhn-Tucker conditions, defined as follows.
Let f and gj for j = 1, ..., m be differentiable functions of n variables and let cj for j = 1, ..., m be numbers. Define the function L of n variables by
L(x) = f(x) − ∑j=1,...,m λj(gj(x) − cj) for all x.

The Kuhn-Tucker conditions for the problem

max_x f(x) subject to gj(x) ≤ cj for j = 1, ..., m

are
L'i(x) = 0 for i = 1, ..., n
λj ≥ 0, gj(x) ≤ cj, and λj[gj(x) − cj] = 0 for j = 1, ..., m.
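These conditions are straightforward to verify mechanically at a candidate point. The following sketch (function name and interface are my own, not from the text) checks them given analytic gradients of f and the gj:

```python
# Sketch of a checker for the Kuhn-Tucker conditions at a candidate (x, lams),
# given analytic gradients. Illustrative interface, not a library function.

def kuhn_tucker_holds(grad_f, gs, grad_gs, cs, x, lams, tol=1e-8):
    """Check L'_i(x) = 0 for all i, and lam_j >= 0, g_j(x) <= c_j,
    lam_j * (g_j(x) - c_j) = 0 for all j."""
    for i in range(len(x)):
        # L'_i(x) = f'_i(x) - sum_j lam_j * dg_j/dx_i(x)
        Li = grad_f(x)[i] - sum(l * gg(x)[i] for l, gg in zip(lams, grad_gs))
        if abs(Li) > tol:
            return False
    return all(
        l >= -tol and g(x) <= c + tol and abs(l * (g(x) - c)) <= tol
        for g, c, l in zip(gs, cs, lams)
    )

# One-variable check: max -(x - 2)^2 subject to x <= 1, candidate x* = 1, lam = 2.
ok = kuhn_tucker_holds(
    grad_f=lambda x: [-2 * (x[0] - 2)],
    gs=[lambda x: x[0]], grad_gs=[lambda x: [1.0]], cs=[1.0],
    x=[1.0], lams=[2.0],
)
print(ok)  # True
```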
On the following pages I discuss results that specify the precise relationship between the solutions of the Kuhn-Tucker conditions and the solutions of the problem. The following example illustrates the conditions for a specific problem.
Consider the problem
max_{x1, x2} [−(x1 − 4)² − (x2 − 4)²] subject to x1 + x2 ≤ 4 and x1 + 3x2 ≤ 9,

illustrated in the following figure.
The Lagrangean is

L(x1, x2) = −(x1 − 4)² − (x2 − 4)² − λ1(x1 + x2 − 4) − λ2(x1 + 3x2 − 9),

and the Kuhn-Tucker conditions are

−2(x1 − 4) − λ1 − λ2 = 0
−2(x2 − 4) − λ1 − 3λ2 = 0
x1 + x2 ≤ 4, λ1 ≥ 0, and λ1(x1 + x2 − 4) = 0
x1 + 3x2 ≤ 9, λ2 ≥ 0, and λ2(x1 + 3x2 − 9) = 0.
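As a quick numerical check of these conditions, consider the candidate point x* = (2, 2) with λ1 = 4 and λ2 = 0, obtained by guessing that only the first constraint binds (this candidate is my own trial; the text analyzes the example in what follows):

```python
# Check the Kuhn-Tucker conditions of the example at the candidate
# x* = (2, 2), lambda_1 = 4, lambda_2 = 0 (first constraint binds, second slack).

x1, x2, lam1, lam2 = 2.0, 2.0, 4.0, 0.0

stationarity_1 = -2 * (x1 - 4) - lam1 - lam2       # should equal 0
stationarity_2 = -2 * (x2 - 4) - lam1 - 3 * lam2   # should equal 0
assert stationarity_1 == 0 and stationarity_2 == 0
assert x1 + x2 <= 4 and lam1 >= 0 and lam1 * (x1 + x2 - 4) == 0
assert x1 + 3 * x2 <= 9 and lam2 >= 0 and lam2 * (x1 + 3 * x2 - 9) == 0
print("all Kuhn-Tucker conditions hold at (2, 2)")
```

Note that at this point the second constraint is slack (2 + 3·2 = 8 < 9), so its multiplier must be zero for complementary slackness to hold.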