Submitted by TobusFire t3_11fil25 in MachineLearning
filipposML t1_jamongz wrote
Reply to comment by M_Alani in [D] Are Genetic Algorithms Dead? by TobusFire
We recently published an evolutionary method to sample from the latent space of a variational autoencoder. It is still alive and well. Just a bit niche.
mmmniple t1_jan7i1a wrote
It sounds very interesting. Is it available to read? Thanks
filipposML t1_jaopq43 wrote
The latest version is here: https://2022.ecmlpkdd.org/wp-content/uploads/2022/09/sub_1229.pdf
mmmniple t1_jaopv6r wrote
Thanks
filipposML t1_jaq6whb wrote
Cheers
avialex t1_janjx6r wrote
Appears to be here: https://openreview.net/forum?id=ibNr25jJrf
edit: actually after reading it, I don't think this is the referenced publication, but it's still interesting
mmmniple t1_jao9iuk wrote
Thanks
filipposML t1_jaooo3f wrote
Hey, this is it, actually! We are optimizing a discrete variational autoencoder with no Gumbel-Softmax trick.
filipposML t1_jaop5vw wrote
Of course, we require no encoder model, so the notion of a latent space only holds up under closer inspection.
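A minimal sketch of the general idea being described here, evolving discrete latent codes through a decoder without any gradient-based relaxation. The toy linear decoder, the (1+λ) bit-flip search, and all names are illustrative assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained decoder mapping a binary latent code to data
# space. (Hypothetical; in the paper the decoder is a neural network.)
LATENT_DIM, DATA_DIM = 16, 8
W = rng.normal(size=(LATENT_DIM, DATA_DIM))

def decode(z):
    return z @ W

# A data point we want the latent code to reproduce.
target = decode(rng.integers(0, 2, LATENT_DIM))

def fitness(z):
    return -np.sum((decode(z) - target) ** 2)  # higher is better

# (1+lambda) evolutionary search over discrete codes: no gradients flow
# through the discrete variables, so no Gumbel-Softmax trick is needed.
z0 = rng.integers(0, 2, LATENT_DIM)
z = z0
for _ in range(200):
    children = np.tile(z, (8, 1))
    flips = rng.random(children.shape) < 1.0 / LATENT_DIM  # bit-flip mutation
    children = np.where(flips, 1 - children, children)
    candidates = np.vstack([z, children])  # keep the parent (elitism)
    z = candidates[int(np.argmax([fitness(c) for c in candidates]))]

# fitness(z) ends up far closer to 0 than the random starting code's.
```

Because the parent survives every generation, fitness never decreases; the relaxation trick only becomes necessary when you insist on backpropagating through the discrete sampling step.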
avialex t1_jap04wq wrote
I was kinda excited; I had hoped to find an evolutionary algorithm for searching a latent space. I've been having a hell of a time trying to optimize text encodings for diffusion models.
filipposML t1_jaq6tpq wrote
You just need a notion of a fitness function, and then you can apply permutations to the tokens.
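A minimal sketch of that suggestion, assuming a black-box fitness over token sequences. Here the fitness is a toy score (matches against a hidden target sequence); in the diffusion-model setting it would instead rate images generated from the tokens. Everything named here is a hypothetical illustration:

```python
import random

random.seed(0)

VOCAB_SIZE, SEQ_LEN = 50, 10
target = [random.randrange(VOCAB_SIZE) for _ in range(SEQ_LEN)]

# Toy black-box fitness: number of tokens matching a hidden target.
# (Hypothetical; any score computable from the token sequence works.)
def fitness(tokens):
    return sum(t == g for t, g in zip(tokens, target))

def mutate(tokens):
    child = tokens[:]
    i = random.randrange(SEQ_LEN)
    child[i] = random.randrange(VOCAB_SIZE)  # resample one token
    return child

# Simple hill climbing: keep a mutated sequence only if the score holds up.
tokens0 = [random.randrange(VOCAB_SIZE) for _ in range(SEQ_LEN)]
tokens = tokens0
for _ in range(2000):
    child = mutate(tokens)
    if fitness(child) >= fitness(tokens):
        tokens = child
```

Single-token mutation is the simplest operator; swapping token positions (permutations in the strict sense) or crossover between good sequences slot into the same loop without changing the acceptance rule.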