Isn't the Cauchy point just an exact line search instead of a trust region search?

Tags: nonlinear optimization, numerical methods, optimization

I'm reading about trust region methods after reading about line search methods.

A trust region method, as opposed to a line search method, is supposed to search for a direction of greatest decrease and then find a good step length. In a line search, we specify a direction first and then find a good step.

So, in the trust region method we must solve something like this:

$$\min_{d}\; f(x^k) + \nabla f(x^k)^T d + \frac{1}{2} d^T \nabla^2 f(x^k) d \quad \text{subject to } \|d\| \le \Delta$$

As far as I understand, a method for solving this would search over all possible directions for the one giving the greatest decrease. But what my book does is simply this: it picks a direction (the negative of the gradient) and minimizes the model function along that direction, calling the result the Cauchy point.
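If I write out what the book does explicitly (this is my own restatement, with $g = \nabla f(x^k)$ and $B = \nabla^2 f(x^k)$), the Cauchy point solves the one-dimensional problem obtained by substituting $d = -\alpha g$ into the model and keeping the ball constraint:

$$\min_{\alpha \ge 0}\; f(x^k) - \alpha\|g\|^2 + \frac{\alpha^2}{2}\, g^T B g \quad \text{subject to } \alpha\|g\| \le \Delta.$$

When $g^T B g > 0$, the unconstrained minimizer is $\alpha^* = \|g\|^2 / (g^T B g)$, which is then clipped to the boundary value $\alpha = \Delta/\|g\|$ if it would leave the region.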

Isn't that simply an exact line search?

Best Answer

First you should understand the difference between the line search method and the trust region method.

In the line search method we first choose a direction, for example the negative of the gradient, and then we choose a step size.

In the trust region method we choose the step size first: fixing the radius of the region determines how far we are allowed to step. Only then do we choose the descent direction. Hence the Cauchy point is the restriction of the exact steepest-descent line search point to the trust region.
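A minimal numerical sketch of that restriction (my own, using NumPy; `cauchy_point` and its arguments are names I chose, following the standard formula for minimizing the quadratic model along the negative gradient inside the ball):

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Minimize m(d) = g^T d + 0.5 d^T B d along d = -alpha*g,
    subject to ||d|| <= delta (the trust-region ball)."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)  # already at a stationary point of the model
    gBg = g @ B @ g
    if gBg <= 0:
        tau = 1.0  # model decreases along -g without bound: step to the boundary
    else:
        # unconstrained minimizer alpha* = ||g||^2 / (g^T B g),
        # expressed as a fraction tau of the boundary step delta/||g||
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return -tau * (delta / gnorm) * g

g = np.array([1.0, 0.0])
B = np.eye(2)
print(cauchy_point(g, B, 10.0))  # large radius: the exact line search point, [-1, 0]
print(cauchy_point(g, B, 0.5))   # small radius: clipped to the boundary, [-0.5, 0]
```

With a large radius the constraint is inactive and you recover exactly the line search minimizer along $-g$; with a small radius the same point is cut off at the boundary, which is the whole difference.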
