levand t1_j9qe7ev wrote

> These models are too small to truly overfit on their datasets.

I thought we were talking about 175 billion parameters, literally some of the biggest models in existence? That said, it is true that past a certain size models become *less* prone to overfitting, and it's not entirely clear why: https://openai.com/blog/deep-double-descent/
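To make "overfit" concrete: the usual signal is the gap between training loss and held-out loss. A minimal numpy sketch (function name and numbers are illustrative, not from any particular model):

```python
import numpy as np

# Hypothetical sketch: "overfitting" is typically diagnosed as the gap
# between training loss and held-out (validation) loss.
def overfit_gap(train_losses, val_losses):
    # Positive gap => the model fits its training data much better than
    # unseen data, i.e. it is memorizing rather than generalizing.
    return float(np.mean(val_losses) - np.mean(train_losses))

gap = overfit_gap([0.8, 0.7], [1.2, 1.3])  # gap of 0.5
```

The double-descent observation is that as parameter count keeps growing past the interpolation point, this gap can shrink again instead of widening.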

1

levand t1_j7o5zeb wrote

This is inherently a super hard problem, because (to oversimplify) the training objective of any image-generating NN is to minimize the difference between human-generated and AI-generated images. So the state of the art for detection and generation is always going to be pretty close.
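That coupling is clearest in the GAN-style adversarial objective, where the detector (discriminator) and generator optimize directly opposed losses. A toy numpy sketch, assuming a discriminator D that outputs the probability an image is human-made (all names here are illustrative):

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # D wants d_real -> 1 (human) and d_fake -> 0 (AI): standard BCE.
    return -np.mean(np.log(d_real) + np.log(1.0 - d_fake))

def generator_loss(d_fake):
    # G wants d_fake -> 1, i.e. AI output indistinguishable from human work.
    return -np.mean(np.log(d_fake))

# At equilibrium D can't tell the two apart (outputs 0.5 for everything),
# so detector accuracy degrades exactly as the generator improves.
d_real = np.array([0.5, 0.5])
d_fake = np.array([0.5, 0.5])
```

Any external detector trained on a generator's outputs just becomes a fresh discriminator, which the next generation of models can be trained to defeat.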

9