
drplan t1_jalud65 wrote

Genetic algorithms are still useful for strange objective functions that defy analytical approaches, such as anything based on complex simulations. But in a way, it has always been like that.

Nowadays, things have changed with generative models for code generation. A few years ago, Genetic Programming (and its many variants) was the only approach to do this; now some problems can be solved just by asking a language model to write the code for xyz.

2

TobusFire OP t1_jao0dm8 wrote

Interesting, thanks for sharing your thoughts! I'm a bit curious about why genetic algorithms might be better for these strange objective functions, compared to something like simulated annealing. I can understand that a pure gradient method could easily be insufficient, but do the underlying components of genetic algorithms (like crossover, etc.) really provide a distinct advantage here? Especially when the fitness is probably directly related to the gradient anyway.

1

drplan t1_jav01pf wrote

I think the best approach for this is thinking about the search space and the fitness landscape. If different components of the solution vector can independently improve the fitness, crossover operators will have a positive impact, as in the toy sketch below.
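To make that concrete, here's a toy sketch in plain Python (the separable fitness and the parent vectors are made up for illustration): each parent carries good "building blocks" for different components, and uniform crossover can combine them into a child fitter than either parent. A single-point search like simulated annealing has no analogous move.

```python
import random

# Toy separable fitness: each component contributes independently,
# so good "building blocks" from different parents can be combined.
TARGETS = [1.0, 2.0, 3.0, 4.0]

def fitness(x):
    return sum(-(xi - t) ** 2 for xi, t in zip(x, TARGETS))

def uniform_crossover(a, b):
    # Each gene is taken from either parent with probability 1/2.
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

# Parent A is good on the first two components, parent B on the last two.
parent_a = [1.0, 2.0, 0.0, 0.0]
parent_b = [0.0, 0.0, 3.0, 4.0]

child = uniform_crossover(parent_a, parent_b)
# The child can inherit both parents' good components and beat either parent.
print(fitness(parent_a), fitness(parent_b), fitness(child))
```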

Another aspect is the search space itself. Is it real-valued, is it binary, is it a tree-like structure, ...?

Traditionally, genetic algorithms operate on binary encodings, and they often work well on problems that have binary solutions (a fixed-size vector of bits). These problems have no gradient to start with. However, one should investigate beforehand whether there are combinatorial approaches to solve the problem.
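As a minimal illustration of that setting (using the classic OneMax toy problem, not anything from this thread), a binary GA with one-point crossover and bit-flip mutation needs no gradient at all:

```python
import random

# OneMax: maximize the number of 1-bits. No gradient exists on {0,1}^n.
N, POP, GENS = 32, 40, 60

def fitness(bits):
    return sum(bits)

def mutate(bits, p=1.0 / N):
    # Flip each bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def crossover(a, b):
    cut = random.randrange(1, N)  # one-point crossover on the bitstring
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for _ in range(GENS):
    def select():
        # Tournament selection: pick the better of two random individuals.
        a, b = random.sample(pop, 2)
        return max(a, b, key=fitness)
    pop = [mutate(crossover(select(), select())) for _ in range(POP)]

print(max(map(fitness, pop)), "of", N)
```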

For real-valued problems with no gradient: evolution strategies with a smart mutation operator like CMA (covariance matrix adaptation) would be a good choice.
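A minimal sketch of that, assuming Nikolaus Hansen's `cma` package from PyPI (`pip install cma`) and a made-up non-smooth objective standing in for a simulation:

```python
import cma  # reference CMA-ES implementation (pip install cma)

# A non-smooth, gradient-free objective as a stand-in for a simulation output.
def objective(x):
    return sum(abs(xi) for xi in x) + 0.1 * sum(1 for xi in x if xi > 0)

es = cma.CMAEvolutionStrategy(8 * [1.0], 0.5)  # initial point, initial step size
while not es.stop():
    solutions = es.ask()  # sample candidates from the current search distribution
    es.tell(solutions, [objective(s) for s in solutions])  # adapt mean, step size, covariance
print(es.result.xbest)
```

The ask/tell loop is the key design point: CMA-ES only ever sees (candidate, fitness) pairs, so the objective can be any black box, simulations included.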

1