Solved – When are genetic algorithms a good choice for optimization

genetic algorithms, gradient descent, machine learning, optimization

Genetic algorithms are one family of optimization methods. Stochastic gradient descent and its variants are often the best choice for function optimization, but genetic algorithms are still sometimes used. For example, the antenna of NASA's ST5 spacecraft was designed with a genetic algorithm:

[Image: the evolved ST5 antenna]

When are genetic optimization methods a better choice than more common gradient descent methods?

Best Answer

Genetic algorithms (GAs) are a family of heuristics that are empirically good at providing a decent answer in many cases, although they are rarely the best option for a given domain.

You mention derivative-based algorithms, but even in the absence of derivatives there are plenty of derivative-free optimization algorithms that typically perform far better than GAs.
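For instance, derivative-free methods are available off the shelf in standard libraries. The sketch below applies SciPy's Nelder-Mead implementation to the Rosenbrock function; the test function and starting point are illustrative choices, not taken from the question:

```python
# Minimal sketch: derivative-free optimization with a standard method
# (Nelder-Mead) via SciPy. The Rosenbrock function is just a common
# illustrative test problem; no gradients are supplied or used.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(rosenbrock, x0=np.array([-1.0, 2.0]), method="Nelder-Mead")
print(result.x)  # should be close to the true minimum at [1, 1]
```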

What many standard optimization algorithms have in common (even derivative-free ones) is the assumption that the underlying space is a smooth manifold (perhaps with a few discrete dimensions) and that the function to optimize is reasonably well-behaved.

However, not all functions are defined on a smooth manifold. Sometimes you want to optimize over a graph or another discrete structure (combinatorial optimization); dedicated algorithms exist for many such problems, but GAs also work there.

The more you move toward functions defined over complex, discrete structures, the more useful GAs become, especially if you can find a representation in which the genetic operators work at their best (which requires a lot of hand-tuning and domain knowledge); see the sketch below.
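To make the discrete setting concrete, here is a minimal GA sketch over a bitstring representation. The toy objective (maximize the number of ones) and all parameter values are illustrative assumptions, not part of the original answer:

```python
# Minimal GA sketch over a discrete bitstring representation:
# tournament selection, single-point crossover, bit-flip mutation.
import random

def fitness(bits):
    return sum(bits)  # toy "one-max" objective: count the 1s

def crossover(a, b):
    point = random.randrange(1, len(a))       # single-point crossover
    return a[:point] + b[point:]

def mutate(bits, rate=0.01):
    return [1 - b if random.random() < rate else b for b in bits]

def genetic_algorithm(n_bits=40, pop_size=50, generations=100):
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: the fitter of two random individuals.
            return max(random.sample(population, 2), key=fitness)
        population = [mutate(crossover(select(), select()))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

best = genetic_algorithm()
print(fitness(best), "ones out of", len(best))
```

The representation matters: for a bitstring, single-point crossover and bit-flip mutation are natural operators, whereas a permutation or graph problem would need operators designed for that structure.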

Of course, the future might lead us to forget GAs altogether and instead develop methods that map discrete spaces to continuous ones, so that the optimization machinery we already have can be applied to the continuous representation.
