[Math] Hessian matrix as derivative of gradient

definition, derivatives, matrices, partial derivative

From a text:

For a real-valued differentiable function $f:\mathbb{R}^n\rightarrow\mathbb{R}$, the Hessian matrix $D^2f(x)$ is the derivative matrix of the vector-valued gradient function $\nabla f(x)$; i.e., $D^2f(x)=D[\nabla f(x)]$.

$\nabla f(x)$ is just an $n\times 1$ matrix consisting of $\partial f/\partial x_1,\partial f/\partial x_2,\ldots,\partial f/\partial x_n$.

Then $D[\nabla f(x)]$ must be a $1\times n$ matrix.

But I know that the Hessian matrix is an $n\times n$ matrix consisting of $\partial ^2f/\partial x_i\partial x_j$. How can the given definition be consistent with this?

Best Answer

The line "Then $D[\nabla f(x)]$ must be a $1\times n$ matrix" is where your confusion lies.

The derivative operator $D$, applied to a vector-valued function, records how each component changes in each coordinate direction. Being more explicit with the notation, we have

$$\begin{align}\nabla f(\mathbf x) &= D[f (\mathbf x)]\\ &= \left(\frac{\partial f}{\partial x_1}, \ldots, \frac{\partial f}{\partial x_n}\right)\end{align}$$
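As a concrete illustration (an example of my own, not from the quoted text), take $f(x,y) = x^2 y$ on $\mathbb{R}^2$:

$$\nabla f(x, y) = D[f(x, y)] = \left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}\right) = \left(2xy,\; x^2\right)$$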

Now think of applying $D$ to each element of this vector individually:

$$\begin{align}D[\nabla f(\mathbf x)] &= D[D[f(\mathbf x)]]\\ &=\left(D\left[\frac{\partial f}{\partial x_1}\right]^T, \ldots, D\left[\frac{\partial f}{\partial x_n}\right]^T\right)\end{align}$$

which expands to give the Hessian matrix

$$D^2[f(\mathbf x)]=\left(\begin{matrix}\frac{\partial^2 f}{\partial x_1^2} & \ldots & \frac{\partial^2 f}{\partial x_1\partial x_n}\\ \vdots & \ddots & \vdots \\ \frac{\partial^2 f}{\partial x_n\partial x_1}& \ldots & \frac{\partial^2 f}{\partial x_n^2}\end{matrix}\right),$$

which is indeed $n\times n$.
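Continuing the same illustrative example $f(x,y) = x^2 y$, applying $D$ to each component of $\nabla f(x,y) = (2xy,\; x^2)$ gives

$$D^2[f(x,y)]=\left(\begin{matrix}\frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x\,\partial y}\\ \frac{\partial^2 f}{\partial y\,\partial x} & \frac{\partial^2 f}{\partial y^2}\end{matrix}\right)=\left(\begin{matrix}2y & 2x\\ 2x & 0\end{matrix}\right),$$

a $2\times 2$ matrix, exactly as the definition predicts for $n = 2$; it is symmetric because the mixed partials of this smooth $f$ agree.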
