Problem with descent direction and symmetric and positive definite matrix

Tags: gradient descent, numerical methods, numerical optimization, optimization

I am having a lot of problems with this exercise.

Can someone help me?

Let $n \in \mathbb{N}$ and let $f: \mathbb{R}^n \rightarrow \mathbb{R}$ be continuously differentiable. To minimize $f$ with a descent method, let the search direction $d_M^{(k)}$ be defined by $Md_M^{(k)}=-\nabla f(x^{(k)})$, where $M\in\mathbb{R}^{n\times n}$ is a symmetric and positive definite matrix.

I have to show that $d_M^{(k)}$ is a descent direction if $\nabla f(x^{(k)}) \neq 0$.

Best Answer

If $M \in \mathbb{R}^{n \times n}$ is positive definite, then $d^{T} M d > 0$ for all $d \in \mathbb{R}^{n}$ with $d \neq 0$. First, note that $M d = 0$ has no non-trivial solution: if $M d = 0$ for some $d \neq 0$, then $d^{T} M d = d^{T} 0 = 0$, contradicting positive definiteness. A square matrix whose associated homogeneous system has only the trivial solution is both injective and surjective, so $M$ is invertible and the system $M d = -\nabla f(x^{(k)})$ has a unique solution. Since $\nabla f(x^{(k)}) \neq 0$, that solution $d^{(k)}_{M}$ is non-zero, and positive definiteness gives $$0<\left(d^{(k)}_{M} \right)^{T} M d^{(k)}_{M} = \left(d^{(k)}_{M}\right)^{T}\left(- \nabla f (x^{(k)})\right) = - \nabla f (x^{(k)})^{T} d^{(k)}_{M}.$$ Thus $\nabla f (x^{(k)})^{T} d^{(k)}_{M} < 0$, and hence $d^{(k)}_{M}$ is a descent direction for $f$.
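As a numerical sanity check of the argument (not part of the proof), here is a small NumPy sketch: the matrix $M$, the dimension, and the "gradient" vector are arbitrary illustrative choices, not taken from the exercise.

```python
import numpy as np

# Verify numerically: solving M d = -grad with a symmetric positive
# definite M yields a descent direction, i.e. grad^T d < 0, whenever
# grad != 0.
rng = np.random.default_rng(0)
n = 5

# Build a random symmetric positive definite matrix M = A^T A + I.
A = rng.standard_normal((n, n))
M = A.T @ A + np.eye(n)

# A non-zero stand-in for the gradient at the current iterate.
grad = rng.standard_normal(n)

# Solve M d = -grad for the search direction d.
d = np.linalg.solve(M, -grad)

# Directional derivative of f along d; negative means descent.
slope = grad @ d
print(slope < 0)
```

Running this prints `True`: the slope $\nabla f(x^{(k)})^T d^{(k)}_M$ is negative, exactly as the proof predicts.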