[Math] Why must the gradient vector always be directed in an increasing direction

calculus, derivatives, multivariable-calculus

I don't understand why there can't be some function, say $f(x,y)$, with a point where the partial derivatives combine into a gradient vector that points in a decreasing direction. Why must the gradient always be negated to get a direction of decrease?

Best Answer

Intuitively, $f(x + \Delta x) \approx f(x) + \langle \nabla f(x), \Delta x \rangle$. (I'm using the convention that $\nabla f(x)$ is a column vector.) So if $\Delta x = \epsilon \nabla f(x)$ (here $\epsilon > 0$ is tiny), then \begin{align*} f(x + \Delta x) & \approx f(x) + \epsilon \langle \nabla f(x), \nabla f(x) \rangle \\ &= f(x) + \epsilon \| \nabla f(x) \|^2 \\ &\geq f(x). \end{align*} The last inequality is strict whenever $\nabla f(x) \neq 0$, since then $\| \nabla f(x) \|^2 > 0$.

So, as long as $\nabla f(x) \neq 0$, moving a little in the direction of $\nabla f(x)$ increases the value of $f$; to get a direction in which $f$ decreases, you have to flip the sign and move along $-\nabla f(x)$ instead.
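
As a quick sanity check (not part of the original answer, just a sketch), here is a small Python snippet that picks an arbitrary smooth $f$ and an arbitrary point, estimates $\nabla f$ with finite differences, and compares $f$ after a tiny step along $+\nabla f$ versus $-\nabla f$. The particular function and point are illustrative choices, not anything from the question.

```python
# Numerical illustration of the first-order argument above:
# stepping along +grad f raises f, stepping along -grad f lowers it.
import numpy as np

def f(p):
    x, y = p
    return np.sin(x) * y + x**2 - 3 * y  # any smooth function works here

def grad_f(p, h=1e-6):
    # central-difference approximation of the gradient of f at p
    x, y = p
    dfdx = (f((x + h, y)) - f((x - h, y))) / (2 * h)
    dfdy = (f((x, y + h)) - f((x, y - h))) / (2 * h)
    return np.array([dfdx, dfdy])

p = np.array([0.7, -1.3])   # arbitrary point
g = grad_f(p)
eps = 1e-3                  # small step size

print(f(p + eps * g) - f(p))   # positive: f increases along +grad f
print(f(p - eps * g) - f(p))   # negative: f decreases along -grad f
```

For this choice of $f$ and point (where the gradient is nonzero), the first printed difference comes out positive and the second negative, matching the first-order approximation $\epsilon \| \nabla f(x) \|^2$ up to higher-order terms.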