
10. Partial derivatives

Faculty of Economics and Business
University of Zagreb

10.1 Partial derivatives

In this section, we are going to learn how to differentiate functions of several variables. The definition of such a derivative is analogous to the case of functions of one variable.

As we’ll see in the problems that follow, in practice this means that the same rules and formulas that we have previously learned in the case of functions of one variable apply in the case of functions of several variables as well.
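To see concretely that a partial derivative is just an ordinary derivative with the other variables held fixed, here is a small numerical sketch. The function $f(x,y) = x^2 y + y^3$ is our own example (not from the text); the one-variable rules give $f_x = 2xy$ and $f_y = x^2 + 3y^2$, and a central-difference approximation agrees with these formulas.

```python
# Numerical illustration: a partial derivative is a one-variable derivative
# with the other variable held fixed.
# Example function (our own choice): f(x, y) = x**2 * y + y**3.
# Analytically: f_x = 2*x*y and f_y = x**2 + 3*y**2.

def f(x, y):
    return x**2 * y + y**3

def partial_x(f, x, y, h=1e-6):
    """Central-difference approximation of f_x; y is held fixed."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    """Central-difference approximation of f_y; x is held fixed."""
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

x0, y0 = 2.0, 3.0
print(partial_x(f, x0, y0))  # close to 2*x0*y0 = 12
print(partial_y(f, x0, y0))  # close to x0**2 + 3*y0**2 = 31
```

The same rules learned for one variable (power rule, sum rule, and so on) produce the analytic formulas that the numerical approximation confirms.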

10.2 Higher-order partial derivatives

In the case of functions of one variable, finding the second derivative of a function was easy: you simply take the derivative of the derivative. In the case of functions of several variables, this is no longer applicable because the phrase “derivative of the derivative” doesn’t have any unambiguous meaning: which derivative are we going to differentiate, and with respect to which variable?

All the second-order partial derivatives of the function $f(x,y)$.

All the second-order partial derivatives of the function $f(x,y,z)$.
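For a function of two variables there are four second-order partial derivatives: $f_{xx}$, $f_{xy}$, $f_{yx}$, and $f_{yy}$. The sketch below computes all four numerically for $f(x,y) = x^3 y^2$, an example of our own choosing, where the analytic answers are $f_{xx} = 6xy^2$, $f_{xy} = f_{yx} = 6x^2 y$, and $f_{yy} = 2x^3$.

```python
# All four second-order partials of f(x, y) = x**3 * y**2 (our own example).
# Analytically: f_xx = 6*x*y**2, f_xy = f_yx = 6*x**2*y, f_yy = 2*x**3.

def f(x, y):
    return x**3 * y**2

def second_partial(f, x, y, i, j, h=1e-4):
    """Approximate the second-order partial taken first with respect to
    variable j, then with respect to variable i (0 = x, 1 = y),
    using nested central differences."""
    def first(g, x, y, k):
        if k == 0:
            return (g(x + h, y) - g(x - h, y)) / (2 * h)
        return (g(x, y + h) - g(x, y - h)) / (2 * h)
    return first(lambda a, b: first(f, a, b, j), x, y, i)

x0, y0 = 1.0, 2.0
print(second_partial(f, x0, y0, 0, 0))  # f_xx, close to 6*1*4 = 24
print(second_partial(f, x0, y0, 0, 1))  # f_xy, close to 6*1*2 = 12
print(second_partial(f, x0, y0, 1, 0))  # f_yx, close to 12
print(second_partial(f, x0, y0, 1, 1))  # f_yy, close to 2
```

Each second-order partial is obtained by differentiating one of the first-order partials with respect to one of the variables, which is exactly what the nested differences do.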

10.3 Hessian matrix

As we have seen in the previous chapter, if we are given a function of several variables, then there is more than one second-order partial derivative, so we can’t speak of a single “second derivative” of such a function. However, we would like to have a single mathematical object that contains all of the second-order partial derivatives of the function $f$, and the best candidate for such an object is a matrix. Therefore, we define the Hessian matrix of the function $f(x_1, \dots, x_n)$ as

$$H_f = \begin{bmatrix} f_{x_1, x_1} & \dots & f_{x_1, x_n} \\ \vdots & \dots & \vdots \\ f_{x_n, x_1} & \dots & f_{x_n, x_n} \end{bmatrix}.$$

For example, the Hessian matrix of a function $f(x,y)$ of two variables is given by

$$H_f(x,y) = \begin{bmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{bmatrix}.$$
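As a concrete sketch, the $2 \times 2$ Hessian can be assembled numerically by applying first-order central differences twice. The function $f(x,y) = x^2 y + y^3$ is again our own example; its Hessian entries are $f_{xx} = 2y$, $f_{xy} = f_{yx} = 2x$, and $f_{yy} = 6y$.

```python
# Assemble the 2x2 Hessian of f(x, y) = x**2 * y + y**3 (our own example)
# by nesting central differences.
# Analytically: f_xx = 2*y, f_xy = f_yx = 2*x, f_yy = 6*y.

def hessian(f, x, y, h=1e-4):
    def d(g, x, y, k):
        """First-order central difference in direction k (0 = x, 1 = y)."""
        if k == 0:
            return (g(x + h, y) - g(x - h, y)) / (2 * h)
        return (g(x, y + h) - g(x, y - h)) / (2 * h)
    # Entry (i, j) differentiates first in direction j, then in direction i.
    return [[d(lambda a, b: d(f, a, b, j), x, y, i) for j in range(2)]
            for i in range(2)]

H = hessian(lambda x, y: x**2 * y + y**3, 1.0, 2.0)
print(H)  # approximately [[4, 2], [2, 12]] at the point (1, 2)
```

Note that the off-diagonal entries come out (approximately) equal, which is exactly the symmetry discussed next.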

The following theorem is of great importance, since it tells us that the Hessian matrix is symmetric. As a consequence, we won’t need to explicitly compute all of the second-order partial derivatives, since some of them are going to be mutually equal.

In other words, the order in which you take higher-order partial derivatives does not matter.
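A quick numerical sketch of this symmetry, using $f(x,y) = e^{xy}$ as our own example: differentiating in $x$ first and then in $y$ gives the same result as the reverse order, and both agree with the analytic value $f_{xy} = f_{yx} = e^{xy}(1 + xy)$.

```python
import math

# Symmetry of mixed partials, illustrated on f(x, y) = exp(x * y)
# (our own example). Analytically f_xy = f_yx = exp(x*y) * (1 + x*y).

def f(x, y):
    return math.exp(x * y)

def mixed(f, x, y, order, h=1e-4):
    """Approximate f_xy ('xy': x first, then y) or f_yx ('yx')
    by nested central differences."""
    def dx(g, x, y):
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    def dy(g, x, y):
        return (g(x, y + h) - g(x, y - h)) / (2 * h)
    if order == "xy":
        return dy(lambda a, b: dx(f, a, b), x, y)
    return dx(lambda a, b: dy(f, a, b), x, y)

x0, y0 = 0.5, 1.0
print(mixed(f, x0, y0, "xy"))  # both close to exp(0.5) * 1.5
print(mixed(f, x0, y0, "yx"))
```

Both orders of differentiation land on the same number, as the theorem promises for functions with continuous second-order partials.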