[Math] Maximising with multiple constraints

optimization

I have a function $$Z=f(x_1, x_2, x_3, \ldots, x_n)$$ subject to the constraints $$\left[\begin{array}{r}c_1=g_1(x_1, x_2, x_3, \ldots, x_n) \\ c_2=g_2(x_1, x_2, x_3, \ldots, x_n) \\ c_3=g_3(x_1, x_2, x_3, \ldots, x_n) \\ \vdots \\ c_m=g_m(x_1, x_2, x_3, \ldots, x_n)
\end{array}\right].$$
How can I tell whether a critical point is a maximum or a minimum using the Hessian matrix (bordered Hessian)? I only need this to solve numeric problems, not for proofs.
In other words, how can I determine via the Hessian matrix whether the critical points found from the first-order conditions (setting the derivatives of the Lagrangian to zero) are maxima, minima, or saddle points?

Best Answer

I'm going to use the notation \begin{equation} f(x) := f(x_1, x_2, \ldots, x_n) \end{equation} and \begin{equation} g(x) = \left[\begin{array}{c} g_1(x) \\ \vdots \\ g_m(x) \end{array}\right] \end{equation} along with \begin{equation} c = \left[\begin{array}{c} c_1 \\ \vdots \\ c_m\end{array}\right] \end{equation} to represent all of your constraints compactly as \begin{equation} g(x) = c. \end{equation}
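
As a hypothetical running example (not from your problem, just to make the steps concrete), take $n = 2$, $m = 1$ with \begin{equation} f(x) = x_1 x_2, \qquad g(x) = \left[\, x_1 + x_2 \,\right], \qquad c = \left[\, 10 \,\right], \end{equation} i.e. maximize $x_1 x_2$ subject to $x_1 + x_2 = 10$. Setting the derivatives of the Lagrangian to zero gives the single critical point $x_1 = x_2 = 5$ (with multiplier $\lambda = 5$); we will classify it with the bordered Hessian below.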

You are correct in forming the bordered Hessian matrix because this is a constrained problem. Note that the bordered Hessian differs from the Hessian used for unconstrained problems and takes the form

\begin{equation} H = \left[\begin{array}{ccccc} 0 & \frac{\partial g}{\partial x_1} & \frac{\partial g}{\partial x_2} & \cdots & \frac{\partial g}{\partial x_n} \\ \frac{\partial g}{\partial x_1} & \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\ \frac{\partial g}{\partial x_2} & \frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2 \partial x_n} \\ \frac{\partial g}{\partial x_3} & \frac{\partial^2 f}{\partial x_3 \partial x_1} & \frac{\partial^2 f}{\partial x_3 \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_3 \partial x_n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \frac{\partial g}{\partial x_n} & \frac{\partial^2 f}{\partial x_n \partial x_1} & \frac{\partial^2 f}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2} \end{array}\right] \end{equation} where the $0$ in the upper-left represents an $m\times m$ sub-matrix of zeros and we've added $m$ columns on the left and $m$ rows on top.
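
For the hypothetical running example ($n = 2$, $m = 1$), the border is a single row and column, and each border entry $\frac{\partial g}{\partial x_i}$ is just the scalar $\frac{\partial g_1}{\partial x_i}$, so at the critical point \begin{equation} H = \left[\begin{array}{ccc} 0 & \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} \\ \frac{\partial g_1}{\partial x_1} & \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} \\ \frac{\partial g_1}{\partial x_2} & \frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} \end{array}\right] = \left[\begin{array}{ccc} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{array}\right]. \end{equation}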

To determine whether the point is a minimum or a maximum, we look at $n - m$ of the bordered Hessian's leading principal minors, all evaluated at the critical point. First we examine the minor made up of the first $2m+1$ rows and columns of $H$ and compute its determinant. Then we look at the minor made up of the first $2m + 2$ rows and columns and compute its determinant. We do the same for the first $2m+3$ rows and columns, and we continue until we compute the determinant of the full bordered Hessian itself (all $n+m$ rows and columns).
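
If you want to apply this check numerically, here is a minimal sketch (assuming NumPy; `bordered_hessian_minors` is just a hypothetical helper name) that extracts exactly these determinants from a bordered Hessian `H` you have already assembled, with the $m$ border rows and columns placed first:

```python
import numpy as np

def bordered_hessian_minors(H, m):
    """Determinants of the leading principal minors of sizes
    2m+1, 2m+2, ..., n+m of the bordered Hessian H
    (m = number of constraints)."""
    size = H.shape[0]  # this is n + m
    dets = []
    for k in range(2 * m + 1, size + 1):
        dets.append(np.linalg.det(H[:k, :k]))  # first k rows and columns
    return dets

# Hypothetical running example: f = x1*x2 with constraint x1 + x2 = 10 (m = 1),
# bordered Hessian evaluated at the critical point x1 = x2 = 5.
H = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
print(bordered_hessian_minors(H, m=1))  # one determinant (≈ 2.0), since n - m = 1
```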

A sufficient condition for a local minimum of $f$ is that all of the determinants computed above have the same sign as $(-1)^m$. A sufficient condition for a local maximum of $f$ is that the determinant of the smallest minor (the one with $2m+1$ rows and columns) has the same sign as $(-1)^{m-1}$ and that the determinants of the subsequent minors (in the order you computed them) alternate in sign. If neither sign pattern holds, these sufficient conditions do not classify the point.
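
Applied to the hypothetical running example ($n = 2$, $m = 1$, so only $n - m = 1$ determinant to check): \begin{equation} \det H = \left|\begin{array}{ccc} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{array}\right| = 2 > 0 \quad\text{and}\quad (-1)^{m-1} = (-1)^{0} = +1, \end{equation} so the single determinant has the sign required for a local maximum, and the critical point $x_1 = x_2 = 5$ is a constrained maximum ($f(5,5) = 25$ is the largest value of $x_1 x_2$ on the line $x_1 + x_2 = 10$). Had every determinant instead matched the sign of $(-1)^m = -1$, the test would have indicated a local minimum.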