I saw a video today that said there's a limit on how much high-quality data there is to train the models on, so we'll have to use other techniques to upgrade them from there. That's the only thing I can see slowing it down, but I think we'll find workarounds like we always do.