Solved – Difference between “Hill Climbing” and “Gradient Descent”

gradient descent · machine learning · terminology

I think the terminology is very confusing:

Are the "Hill Climbing" (in AI literature) and "Gradient Descent" (In machine learning literature) the same thing, other than one is maximizing a function and another is minimizing a function?

Best Answer

According to Wikipedia, they are not the same thing, although they have a similar flavor. Hill climbing refers to making incremental changes to a solution and accepting those changes only if they result in an improvement. Note that hill climbing does not depend on being able to calculate a gradient at all, and it can work on problems with a discrete input space, such as the traveling salesman problem.
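To make the distinction concrete, here is a minimal sketch in Python with two hypothetical toy objectives (the bit-string and quadratic targets are my own illustrations, not from the answer above): hill climbing proposes a small discrete change and keeps it only if the score improves, using no gradient anywhere, while gradient descent explicitly evaluates the derivative of a differentiable objective.

```python
import random

# Hill climbing on a discrete problem: maximize the number of 1-bits in a
# bit string (a toy stand-in for problems like traveling salesman, where no
# gradient exists). Propose a small random change; keep it only if it helps.
def hill_climb(n_bits=20, iterations=1000, seed=0):
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_bits)]
    score = sum(state)
    for _ in range(iterations):
        candidate = state[:]
        i = rng.randrange(n_bits)
        candidate[i] ^= 1                 # flip one bit (incremental change)
        if sum(candidate) > score:        # accept only improvements
            state, score = candidate, sum(candidate)
    return state, score

# Gradient descent on a continuous problem: minimize f(x) = (x - 3)^2.
# Here we need the derivative f'(x) = 2(x - 3), so the objective must be
# differentiable -- the key difference from hill climbing.
def gradient_descent(x0=0.0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        grad = 2 * (x - 3)   # analytic gradient
        x -= lr * grad       # step against the gradient
    return x

if __name__ == "__main__":
    print(hill_climb())        # bit string with score near 20
    print(gradient_descent())  # value close to 3.0
```

The point of the contrast: swapping the comparison `>` for `<` turns hill climbing into "hill descending", so the maximize/minimize direction is trivial; the substantive difference is that only gradient descent requires a differentiable objective.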
