[Math] How to show that the Hessian matrix of $G$ is positive definite

multivariable-calculus, numerical-linear-algebra, numerical-methods, optimization

Let $\{g_i:X\subset\mathbb{R}\rightarrow\mathbb{R};\;i=1,\dots,m\}$ be a linearly independent set of real functions.

Given $n$ points $(x_1,y_1),\dots,(x_n,y_n)$ with $x_k\in X$, consider the following function

$$G(\beta_1,\dots,\beta_m)=\sum_{k=1}^n\left(\sum_{l=1}^m \beta_lg_l(x_k)-y_k\right)^2$$

I need to prove that $G$ attains a local minimum. For this I want to show that the Hessian matrix of $G$ is positive definite, but I have only managed to prove that it is positive semidefinite. Can someone help me show that the Hessian matrix of $G$ is positive definite?

Notice that

$$\frac{\partial G}{\partial \beta_i}=2\sum_{k=1}^n g_i(x_k)\left(\sum_{l=1}^m \beta_lg_l(x_k)-y_k\right)$$

and

$$\frac{\partial^2 G}{\partial \beta_j\,\partial\beta_i}=2\sum_{k=1}^ng_i(x_k)g_j(x_k)$$

Hence, the Hessian matrix of $G$ is

$$H=2
\begin{bmatrix}
\sum_{k=1}^ng_1(x_k)^2 & \sum_{k=1}^ng_1(x_k)g_2(x_k) & \cdots & \sum_{k=1}^ng_1(x_k)g_m(x_k)\\
\sum_{k=1}^ng_2(x_k)g_1(x_k) & \sum_{k=1}^ng_2(x_k)^2 & \cdots & \sum_{k=1}^ng_2(x_k)g_m(x_k)\\
\vdots & \vdots & \ddots & \vdots \\
\sum_{k=1}^ng_m(x_k)g_1(x_k) & \sum_{k=1}^ng_m(x_k)g_2(x_k) & \cdots & \sum_{k=1}^ng_m(x_k)^2
\end{bmatrix}$$
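
As a numerical sanity check (not part of the proof), here is a minimal NumPy sketch that compares these closed-form Hessian entries against a finite-difference Hessian of $G$. The monomial basis $g_l(x)=x^{l-1}$, the random sample data, and the sizes `m`, `n` are illustrative assumptions, not part of the original problem; since $G$ is quadratic, the second difference is exact up to rounding.

```python
import numpy as np

# Illustrative assumptions: monomial basis g_l(x) = x^(l-1), random data.
m, n = 3, 5
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=n)
y = rng.normal(size=n)

def g(l, t):
    """Basis function g_l evaluated at t (monomials, chosen for illustration)."""
    return t ** (l - 1)

def G(beta):
    """G(beta) = sum_k ( sum_l beta_l * g_l(x_k) - y_k )^2."""
    return sum(
        (sum(beta[l - 1] * g(l, x[k]) for l in range(1, m + 1)) - y[k]) ** 2
        for k in range(n)
    )

# Hessian from the closed-form entries H_ij = 2 * sum_k g_i(x_k) * g_j(x_k).
H = 2 * np.array([[sum(g(i, x[k]) * g(j, x[k]) for k in range(n))
                   for j in range(1, m + 1)]
                  for i in range(1, m + 1)])

# Finite-difference Hessian of G; exact for a quadratic up to rounding error.
eps = 1e-3
beta0 = rng.normal(size=m)
I = np.eye(m)
H_fd = np.empty((m, m))
for i in range(m):
    for j in range(m):
        H_fd[i, j] = (G(beta0 + eps * (I[i] + I[j])) - G(beta0 + eps * I[i])
                      - G(beta0 + eps * I[j]) + G(beta0)) / eps**2

print(np.allclose(H, H_fd, rtol=1e-4, atol=1e-6))  # True: the entries agree
```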

Thanks.

Best Answer

Take the $n\times m$ matrix $A$ with entries $A_{kj} = g_j(x_k)$. Assuming your calculation, the entries of the Hessian are $$ H_{ij} = \frac{\partial^2 G}{\partial \beta_i\,\partial \beta_j} = 2 \sum_{k=1}^n A^T_{ik} A_{kj}, $$ that is, $H = 2A^T A$. A matrix of this form is positive definite exactly when $A$ has full column rank $m$: for any $v \neq 0$, $$ v^T H v = 2\,v^T A^T A v = 2\|Av\|^2 > 0, $$ since $Av \neq 0$. For that you need $n \geq m$ and, moreover, that the $g_i$ remain linearly independent when restricted to the sample points $x_1,\dots,x_n$; linear independence as functions on all of $X$ does not by itself rule out rank deficiency on a particular finite set of points. If $A$ fails to have full column rank, $H$ is only positive semidefinite, which is exactly the weaker statement you were able to prove.
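
To see the full-column-rank condition in action, here is a minimal NumPy sketch, again assuming a monomial basis and made-up sample points purely for illustration: with enough distinct points $H=2A^TA$ is positive definite, while with fewer sample points than basis functions it is only positive semidefinite.

```python
import numpy as np

m = 3  # number of basis functions (illustrative choice)

def design(x):
    """Design matrix A with A_kj = g_j(x_k) for the monomial basis g_j(t) = t^(j-1)."""
    return np.array([[t ** (j - 1) for j in range(1, m + 1)] for t in x])

def hessian(x):
    """H = 2 A^T A."""
    A = design(x)
    return 2 * A.T @ A

# Five distinct points, m = 3: A has full column rank, so H is positive definite.
x_good = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
print(np.linalg.eigvalsh(hessian(x_good)).min() > 0)   # True

# Only two sample points for three basis functions: rank A <= 2 < m,
# so H has a zero eigenvalue and is merely positive semidefinite.
x_bad = np.array([-0.5, 0.5])
print(np.linalg.eigvalsh(hessian(x_bad)).min())        # ~ 0 (up to rounding)
```

The second case mirrors the semidefiniteness observed in the question: whenever $\operatorname{rank} A < m$, any $v$ in the null space of $A$ gives $v^T H v = 0$.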
