Submitted by TobusFire t3_11fil25 in MachineLearning
TobusFire OP t1_jao0dm8 wrote
Reply to comment by drplan in [D] Are Genetic Algorithms Dead? by TobusFire
Interesting, thanks for sharing your thoughts! I'm a bit curious about why genetic algorithms might be better for these strange objective functions than something like simulated annealing. I can see that a pure gradient method could easily be insufficient, but do the underlying components of genetic algorithms (like crossover, etc.) really provide a distinct advantage here? Especially when the fitness is probably directly related to the gradient anyway.
drplan t1_jav01pf wrote
I think the best approach here is to think about the search space and the fitness landscape. If different components of the solution vector can independently improve the fitness, crossover operators will have a positive impact.
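A toy illustration of that building-block effect (a minimal Python sketch; the separable fitness function and the two parents are made up for the example):

```python
def fitness(bits):                    # separable: each bit contributes independently
    return sum(bits)

parent1 = [1] * 10 + [0] * 10         # strong on the first half, fitness 10
parent2 = [0] * 10 + [1] * 10         # strong on the second half, fitness 10
child = parent1[:10] + parent2[10:]   # one-point crossover at the midpoint
print(fitness(child))                 # 20: crossover combined both good blocks
```

Because the components are independent, a single crossover step can combine the good halves of two mediocre parents into a child better than either, which mutation alone would take many steps to reach.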
Another aspect is the search space itself: is it real-valued, binary, or a tree-like structure?
Traditionally, genetic algorithms operate on binary encodings, and they often work OK on problems that have binary solutions (a fixed-size vector of bits). These problems have no gradient to start with. However, one should check beforehand whether there are combinatorial approaches that solve the problem directly.
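For concreteness, a minimal bit-string GA on the classic OneMax problem might look like this (a rough sketch, not a tuned implementation; the parameter values are arbitrary):

```python
import random

random.seed(42)
N_BITS, POP_SIZE, GENERATIONS = 30, 40, 50
MUT_RATE = 1.0 / N_BITS                 # flip on average one bit per individual

def fitness(bits):                      # OneMax: count of 1-bits, no gradient
    return sum(bits)

def tournament(pop, k=3):               # best of k randomly sampled individuals
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):                    # one-point crossover
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits):                       # flip each bit with small probability
    return [1 - b if random.random() < MUT_RATE else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
print(max(fitness(ind) for ind in pop))  # should be at or near N_BITS
```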
For real-valued problems with no gradient, evolution strategies with a smart mutation operator like CMA (covariance matrix adaptation) would be a good choice.
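As a sketch of what that looks like in practice (assuming the pycma package, `pip install cma`; the sphere objective is just a stand-in for a real black-box function):

```python
import cma  # pycma implementation of CMA-ES

def sphere(x):                 # toy real-valued objective; no gradient is used
    return sum(xi ** 2 for xi in x)

es = cma.CMAEvolutionStrategy(x0=8 * [1.0], sigma0=0.5)
while not es.stop():
    candidates = es.ask()                                 # sample from the adapted Gaussian
    es.tell(candidates, [sphere(c) for c in candidates])  # update mean and covariance
print(es.result.xbest)         # best solution found, close to the origin
```

The covariance adaptation is what makes this "smart": the mutation distribution learns the local scaling and correlations of the landscape instead of mutating each coordinate independently.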