Submitted by bil_sabab t3_118tztg in technology
jackmountion t1_j9jasgc wrote
Reply to comment by jackmountion in Inside the ChatGPT race in China by bil_sabab
It could also be pretraining, though; that's another theory, that some content from other languages leaks into the pretraining data. But I personally don't buy it, there's simply not enough data for that. Maybe both theories are partly right: it's generalizing better than we thought, but it needs some language context at first?