Solved – Steepest descent: is it gradient descent with exact line search?

gradient descent, machine learning, optimization, terminology

I am confused about the definition of steepest descent.

  • In some literature, such as this and this, steepest descent means using the negative gradient direction together with an exact line search along that direction.
  • But in this note, it seems that as long as we follow the negative gradient direction, the method can be called steepest descent.

Which one is correct? Can using a fixed step size alpha (with no line search) along the negative gradient direction also be called steepest descent?

Is the term "steepest descent" loosely defined?

Best Answer

Steepest descent is a special case of gradient descent in which the step length is chosen by exact line search, i.e., to minimize the objective function value along the negative gradient direction. Gradient descent refers to any of a class of algorithms that calculate the gradient of the objective function, then move "downhill" in the indicated direction; the step length can be fixed, estimated (e.g., via line search), or ... (see this link for some examples).
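To make the distinction concrete, here is a minimal sketch (not from the original answer) comparing a fixed-step gradient descent with steepest descent on a small quadratic. The matrix A, vector b, starting point, and step size 0.1 are assumed toy values chosen for illustration; for a quadratic, the exact line-search step length along -g has the closed form (gᵀg)/(gᵀAg).

```python
import numpy as np

# Toy strictly convex quadratic: f(x) = 0.5 x^T A x - b^T x, gradient A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

def gradient_descent_fixed(x, alpha=0.1, iters=100):
    # Gradient descent with a fixed step length alpha.
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

def steepest_descent(x, iters=100):
    # Same direction (negative gradient), but the step length is chosen by
    # exact line search; for a quadratic this is alpha = (g^T g) / (g^T A g).
    for _ in range(iters):
        g = grad(x)
        alpha = (g @ g) / (g @ (A @ g))
        x = x - alpha * g
    return x

x0 = np.array([5.0, -3.0])
print(gradient_descent_fixed(x0))  # both converge to the solution of A x = b
print(steepest_descent(x0))
print(np.linalg.solve(A, b))       # exact minimizer, for comparison
```

Both routines follow the negative gradient; only the rule for picking the step length differs, which is exactly the point of contention in the question.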

Gradient-based optimization is, as Cliff AB points out in comments to the OP, more general still, referring to any method that uses gradients to optimize a function. Note that this does not mean you necessarily move in the direction indicated by the gradient (see, for example, Newton's method).
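As a small illustration of that last point (again a sketch with an assumed, hypothetical Hessian), Newton's method uses the gradient but steps along -H⁻¹g, which generally differs from the negative gradient direction:

```python
import numpy as np

# f(x) = 0.5 x^T H x with an ill-conditioned Hessian (hypothetical example).
H = np.array([[100.0, 0.0], [0.0, 1.0]])
x = np.array([1.0, 1.0])

g = H @ x                             # gradient at x
newton_dir = -np.linalg.solve(H, g)   # Newton direction: -H^{-1} g = [-1, -1]
grad_dir = -g                         # negative gradient direction: [-100, -1]

print(newton_dir)  # points straight at the minimizer (the origin)
print(grad_dir)    # still a descent direction, but not the same direction
```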
