
Dendriform1491 t1_jalb2vb wrote

Genetic algorithms require you to create a population to which the genetic operators (mutation, crossover, and selection) are applied.

Creating a population of neural networks means maintaining multiple slightly different copies of the network being optimized (i.e., the population).

This can be more computationally expensive than other techniques which will do all the learning "in-place".
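A minimal sketch of what that looks like, treating each "network" in the population as a flat weight vector. Everything here is an illustrative toy, not any real system: the fitness function just rewards closeness to an arbitrary target vector, standing in for an actual task loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(weights):
    # Toy objective: reward closeness to an arbitrary target vector
    # (a stand-in for evaluating the network on a real task).
    target = np.linspace(-1.0, 1.0, weights.size)
    return -float(np.sum((weights - target) ** 2))

def mutate(weights, sigma=0.1):
    # Gaussian perturbation of every weight.
    return weights + rng.normal(0.0, sigma, size=weights.shape)

def crossover(a, b):
    # Uniform crossover: each weight is taken from one parent at random.
    mask = rng.random(a.shape) < 0.5
    return np.where(mask, a, b)

def evolve(pop_size=32, n_weights=16, generations=100, elite=8):
    # The population: many slightly different copies of the weight vector,
    # all of which must be kept in memory and evaluated each generation.
    population = [rng.normal(0.0, 1.0, n_weights) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)   # selection
        parents = population[:elite]
        children = []
        while len(children) < pop_size - elite:
            i, j = rng.choice(elite, size=2, replace=False)
            children.append(mutate(crossover(parents[i], parents[j])))
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

Note that the whole population is evaluated every generation, which is exactly where the extra cost relative to "in-place" gradient methods comes from.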

2

visarga t1_jalh1r1 wrote

You don't always need a population of neural networks; it could be a population of prompts, or even a population of problem solutions.

If you're using a GA to solve specific coding problems, there is a paper where an LLM generates diffs for the code: the LLM served as the mutation operator, and it was even fine-tuned iteratively.
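The structure of that idea can be sketched as a GA whose mutation operator is a pluggable function. This is only a sketch: the target string and the `llm_style_mutate` stub (a random character edit) are stand-ins for real candidate programs and a real LLM-generated patch.

```python
import random

random.seed(0)

TARGET = "hello world"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate: str) -> int:
    # Count matching positions against the target string -- a toy
    # stand-in for running a test suite against a candidate program.
    return sum(a == b for a, b in zip(candidate, TARGET))

def llm_style_mutate(candidate: str) -> str:
    # Placeholder for the paper's LLM-generated diff: here, just a small
    # random character edit. A real system would prompt an LLM to
    # propose a patch to the candidate instead.
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

def evolve(pop_size=20, generations=200, mutate=llm_style_mutate):
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]      # selection
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

Because `mutate` is just a parameter, swapping the stub for an actual model call changes nothing about the surrounding evolutionary loop.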

3

-EmpiricalEvidence- t1_jan0mqg wrote

Precisely because of the computational demands, I don't think genetic algorithms have ever really taken off, but with compute getting cheaper I could see them finding success similar to the rise of deep learning.

Evolution strategies, which keep mutation and selection but drop the genetic (crossover) component, are already being deployed quite well, e.g. in AlphaStar.
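For contrast with the population-based GA, here is a minimal (1+λ) evolution strategy: a single parameter vector, perturbed and updated in place, with no crossover and no stored population. This is only an illustration of the ES idea, not AlphaStar's actual training setup, and the quadratic objective is a toy stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(theta):
    # Toy objective to maximize (a stand-in for, say, an RL return).
    return -float(np.sum(theta ** 2))

def es_step(theta, sigma=0.1, lam=20):
    # (1+lambda) evolution strategy: sample lam Gaussian perturbations
    # of the current vector, and keep the best one only if it improves.
    # Mutation and selection, but no crossover.
    candidates = [theta + sigma * rng.normal(0.0, 1.0, theta.size)
                  for _ in range(lam)]
    best = max(candidates, key=objective)
    return best if objective(best) > objective(theta) else theta

theta = rng.normal(0.0, 1.0, 10)   # single solution, refined in place
for _ in range(500):
    theta = es_step(theta)
```

Only the current vector and the transient batch of perturbations ever exist, which is why ES sidesteps much of the memory cost of maintaining a full GA population.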

Jeff Clune was quite active in that area of research and he recently joined DeepMind.

https://twitter.com/jeffclune/status/1629132544255070209

1