2.4 Differentials and comparative statics
Introduction
We may use the tool of implicit differentiation to study the dependence of a variable x on a list p of parameters when the variable is defined by an equation like f(x, p) = 0. Many models in economic theory involve several variables that satisfy several equations simultaneously. In such cases, another
(closely related) method is useful.
Suppose that we have two variables, x and y, and two parameters, p and q, and for any values of p and q the values of the variables satisfy the two equations
f(x, y, p, q) = 0
g(x, y, p, q) = 0.
These two equations implicitly define x and y as functions of p and q. As in the case of a single equation, two questions arise:
Do the equations have solutions for x and y for any given values of p and q?
How do the solutions change as p or q, or possibly both, change?
Existence of a solution
We have seen that even a single equation in a single variable may not have a solution, and that when a solution does exist, the Intermediate Value Theorem can sometimes be used to establish that fact. Generalizations of the Intermediate Value Theorem, which I do not discuss, can be helpful in showing that a collection of equations in many variables has a solution.
A useful rough guideline for a set of equations to have a unique solution is that the number of equations equal the number of variables. This condition is neither necessary nor sufficient, however. For example, the single equation x^{2} = −1 in a single variable has no (real) solution, while the single equation x^{2} + y^{2} = 0 in two variables has a unique solution ((x, y) = (0, 0)). But there is some presumption that if the condition is satisfied and the equations are all “independent” of each other, the system is likely to have a unique solution, and that if it is not satisfied there is little chance of a unique solution.
Differentials
Now consider the question of how a solution, if it exists, depends on the parameters. A useful tool to address this question involves the notion of a differential.
Let f be a differentiable function of a single variable. If x increases by a small amount from a to a + Δx, by how much does f(x) increase? A differentiable function of a single variable is approximated around a by its tangent at a. Thus if Δx is very
small then the approximate increase in f(x) is
f'(a)Δx
(where f'(a) is of course the derivative of f at a).
For any change dx in x we define the differential of f(x) to be f'(x)dx. By the argument above, if dx is small then the differential f'(x)dx is approximately equal to the change in the value of f when its argument changes by dx from x.
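As a quick numerical sanity check (the function f(x) = x³ and the point a = 2 here are purely illustrative choices, not part of the text), we can compare the differential f'(a)dx with the actual change in f:

```python
# Numerical illustration of the differential: for f(x) = x**3,
# f'(a)*dx approximates the change f(a + dx) - f(a) when dx is small.

def f(x):
    return x ** 3

def f_prime(x):
    return 3 * x ** 2   # derivative of x**3

a, dx = 2.0, 0.01
actual = f(a + dx) - f(a)          # exact change in f
differential = f_prime(a) * dx     # the differential f'(a) dx

print(actual, differential)        # the two values are close
```

The gap between the two numbers shrinks quadratically as dx shrinks, reflecting the tangent-line approximation.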
If f is a function of two variables, it is approximated by its tangent plane: for (x, y) close to (a, b) the approximate increase in f(x, y) when x changes by Δx and y changes by Δy is
f'_{1}(a, b)Δx + f'_{2}(a, b)Δy.
For a function of many variables, the differential is defined as follows.
Definition
Let f be a function of n variables. For any real numbers dx_{1}, ..., dx_{n}, the differential of f(x_{1}, ..., x_{n}) is

f'_{1}(x_{1}, ..., x_{n})dx_{1} + ... + f'_{n}(x_{1}, ..., x_{n})dx_{n}.
As in the case of a function of a single variable, if dx_{i} is small for each i = 1, ..., n, then the differential f'_{1}(x_{1}, ..., x_{n})dx_{1} + ... +
f'_{n}(x_{1}, ..., x_{n})dx_{n} is approximately equal to the change in the value of f when each argument x_{i} changes by dx_{i}.
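The same check works with several variables. In the following sketch the function f(x, y) = xy + y², the point (1, 2), and the changes dx = 0.01, dy = −0.02 are all illustrative choices:

```python
# For f(x, y) = x*y + y**2, the differential at (a, b) is
# f'_1(a, b)*dx + f'_2(a, b)*dy, where f'_1 = y and f'_2 = x + 2*y.

def f(x, y):
    return x * y + y ** 2

a, b = 1.0, 2.0
dx, dy = 0.01, -0.02

actual = f(a + dx, b + dy) - f(a, b)          # exact change in f
differential = b * dx + (a + 2 * b) * dy      # f'_1 dx + f'_2 dy

print(actual, differential)                    # close for small dx, dy
```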
To find a differential we may simply find the partial derivatives with respect to each variable in turn. Alternatively we can use a set of rules analogous to those for derivatives. Denoting the differential of the function f by d(f), we have:

d(af + bg) = a·df + b·dg
d(f·g) = g·df + f·dg
d(f/g) = (g·df − f·dg)/g^{2}
if z = g(f(x, y)) then dz = g'(f(x, y))df.
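The product rule for differentials can be checked numerically; the functions f(x) = x² and g(x) = x + 1 below are illustrative choices:

```python
# Check the product rule for differentials: d(f*g) ≈ g*df + f*dg
# for a small change dx in x.

def f(x):
    return x ** 2

def g(x):
    return x + 1.0

x, dx = 1.5, 1e-4
df = f(x + dx) - f(x)                         # change in f
dg = g(x + dx) - g(x)                         # change in g
d_fg = f(x + dx) * g(x + dx) - f(x) * g(x)    # actual change in f*g
rule = g(x) * df + f(x) * dg                  # g*df + f*dg

print(d_fg, rule)   # agree up to a term of order df*dg
```

The discrepancy between the two values is the second-order term df·dg, which vanishes faster than dx.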
Comparative statics
Start with the simplest case: a single equation in one variable and one parameter:
f(x, p) = 0,
where x is the variable and p the parameter. We have previously seen how to use implicit differentiation to find the rate of change of x with respect to p. We may reach the same conclusion using differentials. The differential of the left-hand side of the equation is
f'_{1}(x, p)dx + f'_{2}(x, p)dp.
When p changes, the value of f(x, p) must remain the same for the equation f(x, p) = 0 to remain satisfied, so for small changes dx in x and dp in p we must have, approximately,
f'_{1}(x, p)dx + f'_{2}(x, p)dp = 0.
Rearranging this equation we have

dx/dp = −f'_{2}(x, p)/f'_{1}(x, p).
The expression on the left-hand side is the quotient of the small quantities dx and dp, not a derivative. However, we can in fact show that the right-hand side is equal to the derivative of x with respect to p, as we found previously.
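To see the formula at work, take an equation whose solution is known explicitly. The choice f(x, p) = x² − p, with solution x*(p) = √p for x > 0, is an illustrative example, not part of the text:

```python
import math

# For f(x, p) = x**2 - p we have f'_1 = 2x and f'_2 = -1, so the
# formula gives dx/dp = -f'_2/f'_1 = 1/(2x) — which is exactly the
# derivative of the explicit solution x*(p) = sqrt(p).

def f1(x, p):   # partial derivative of f with respect to x
    return 2 * x

def f2(x, p):   # partial derivative of f with respect to p
    return -1.0

p = 4.0
x = math.sqrt(p)                      # x*(p)
formula = -f2(x, p) / f1(x, p)        # -f'_2 / f'_1

dp = 1e-6                             # finite-difference check of dx*/dp
numeric = (math.sqrt(p + dp) - math.sqrt(p)) / dp

print(formula, numeric)               # both are 1/(2*sqrt(4)) = 0.25
```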
This technique may be extended to systems of equations. Suppose, for example, that the variables, x and y, satisfy the following two equations, where p and q are parameters, as in the opening section above:
f(x, y, p, q) = 0
g(x, y, p, q) = 0.
Assume that the functions f and g are such that the two equations define two solution functions
x*(p, q) and y*(p, q).
That is, f(x*(p, q), y*(p, q), p, q) = 0 and g(x*(p, q), y*(p, q), p, q) = 0 for all p and all q.
How do x* and y* depend on the parameters p and q? Assuming that the functions x* and y* are differentiable, we can answer this question by calculating the differentials of the functions on each side of the two equations defining them. If the changes in p and q are small, then the differentials must be equal, so that the equations defining
x* and y* remain satisfied. That is,

f'_{1}·dx + f'_{2}·dy + f'_{3}·dp + f'_{4}·dq = 0
g'_{1}·dx + g'_{2}·dy + g'_{3}·dp + g'_{4}·dq = 0.

(To make these equations easier to read, I have omitted the arguments of the partial derivatives.)
To find the changes dx and dy in x and y necessary for these equations to be satisfied we need to solve the equations for dx and dy as functions of dp and dq, the changes in the parameters. (See the page on matrices and solutions of systems of simultaneous equations if you have forgotten how.) We obtain

dx = [(−g'_{2}·f'_{3} + f'_{2}·g'_{3})·dp + (−g'_{2}·f'_{4} + f'_{2}·g'_{4})·dq]/(f'_{1}·g'_{2} − f'_{2}·g'_{1})
dy = [(g'_{1}·f'_{3} − f'_{1}·g'_{3})·dp + (g'_{1}·f'_{4} − f'_{1}·g'_{4})·dq]/(f'_{1}·g'_{2} − f'_{2}·g'_{1}).
Now, to determine the impact on x and y of a change in p, holding q constant, we set dq = 0 to get
dx = (−g'_{2}·f'_{3} + f'_{2}·g'_{3})·dp/(f'_{1}·g'_{2} − f'_{2}·g'_{1})

and

dy = (g'_{1}·f'_{3} − f'_{1}·g'_{3})·dp/(f'_{1}·g'_{2} − f'_{2}·g'_{1}).
We can alternatively write the first equation, for example, as
∂x/∂p = (−g'_{2}·f'_{3} + f'_{2}·g'_{3})/(f'_{1}·g'_{2} − f'_{2}·g'_{1}).
If we make some assumption about the signs of the partial derivatives of f and g, this expression may allow us to determine the sign of ∂x/∂p—that is, the direction in which the equilibrium value of x changes when the parameter p changes.
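The formula for ∂x/∂p can be checked on a system whose solution is known in closed form. The linear functions below are an illustrative choice: with f(x, y, p, q) = x + y − p and g(x, y, p, q) = x − y − q we have x*(p, q) = (p + q)/2, so ∂x*/∂p = 1/2.

```python
# Check ∂x/∂p = (-g'_2*f'_3 + f'_2*g'_3)/(f'_1*g'_2 - f'_2*g'_1) on
#   f(x, y, p, q) = x + y - p = 0
#   g(x, y, p, q) = x - y - q = 0,
# whose explicit solution is x*(p, q) = (p + q)/2.

f1, f2, f3 = 1.0, 1.0, -1.0     # partials of f w.r.t. x, y, p
g1, g2, g3 = 1.0, -1.0, 0.0     # partials of g w.r.t. x, y, p

dx_dp = (-g2 * f3 + f2 * g3) / (f1 * g2 - f2 * g1)
print(dx_dp)    # 0.5, matching ∂x*/∂p from the explicit solution
```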
This technique allows us also to study the change in a variable when more than one parameter changes, as illustrated in the following economic example.
Example 2.4.1
Consider the macroeconomic model
Y = C + I + G
C = f(Y − T)
I = h(r)
r = m(M)
where the variables are Y (national income), C (consumption), I (investment) and r (the rate of interest), and the parameters are M (the money supply), T (the tax burden), and G (government spending). We want to find how the variables change with the parameters.
Take differentials:
dY = dC + dI + dG
dC = f'(Y − T)(dY − dT)
dI = h'(r)dr
dr = m'(M)dM
We need to solve for dY, dC, dI, and dr in terms of dM, dT, and dG. The system is too big to use Cramer's rule easily, but the simple structure allows us to proceed step-by-step.
From the last two equations we have
dI = h'(r)m'(M)dM.
Now substitute for dI in the first equation to get
dY − dC = h'(r)m'(M)dM + dG
f'(Y − T)dY − dC = f'(Y − T)dT
You can solve this system for dY and dC. For example,
dY = [h'(r)m'(M)/(1 − f'(Y − T))]dM − [f'(Y − T)/(1 − f'(Y − T))]dT + [1/(1 − f'(Y − T))]dG.
Thus, for example, if T changes while M and G remain constant (so that dM = dG = 0), then the rate of change of Y is given by
∂Y/∂T = −f'(Y − T)/(1 − f'(Y − T)).
That is, if 0 < f'(z) < 1 for all z then Y decreases as T increases. Further, we can deduce that if T and G increase by equal (small) amounts (dT = dG) then the change in Y is
dY = [(1 − f'(Y − T))/(1 − f'(Y − T))]dT = dT.
That is, an equal (small) increase in T and G leads to an increase in Y of the same size.
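These two conclusions can be verified numerically for particular functional forms. The sketch below assumes the linear consumption function f(z) = 0.8z (so f' = 0.8) and treats investment I as given, since h and m only affect Y through I and we are varying T and G here; these are illustrative assumptions, not part of the model's text.

```python
# With C = 0.8*(Y - T), the equation Y = C + I + G solves explicitly:
# Y*(0.2) = I + G - 0.8*T, i.e. Y* = (I + G - 0.8*T)/(1 - 0.8).

def solve_Y(T, G, I):
    return (I + G - 0.8 * T) / (1 - 0.8)

T, G, I = 100.0, 200.0, 150.0

# ∂Y/∂T should equal -f'/(1 - f') = -0.8/0.2 = -4:
dT = 1e-6
dY_dT = (solve_Y(T + dT, G, I) - solve_Y(T, G, I)) / dT
print(dY_dT)

# Equal small increases in T and G should raise Y by the same amount:
d = 1.0
balanced = solve_Y(T + d, G + d, I) - solve_Y(T, G, I)
print(balanced)   # equal to d, illustrating dY = dT when dT = dG
```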