chogall t1_j9nioqb wrote

Seankala OP t1_j9np9ae wrote

I guess at least 100M+ parameters? I like to think of the BERT-base model (roughly 110M parameters) as the "starting point" of LLMs.
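
If you want to check a model's size yourself, here's a minimal sketch, assuming the Hugging Face transformers library and PyTorch are installed:

```python
# Minimal sketch: count the parameters of a pretrained model.
# Assumes `pip install transformers torch`.
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
n_params = sum(p.numel() for p in model.parameters())
print(f"bert-base-uncased: {n_params:,} parameters")  # roughly 110M
```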

FluffyVista t1_j9ottk1 wrote

Probably.

Yahentamitsi t1_j9xi4q4 wrote

That's a good question! I'm not sure whether there are any pretrained language models much smaller than that, but you could always train your own model from scratch and see how small you can get it; there's a sketch below.
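
As a starting point, here's a minimal sketch of initializing a small BERT-style model from scratch with Hugging Face transformers. All of the config sizes below are hypothetical choices for illustration, not values from this thread:

```python
# Minimal sketch: a small, randomly initialized BERT-style model.
# Assumes `pip install transformers torch`. Sizes are illustrative only.
from transformers import BertConfig, BertForMaskedLM

config = BertConfig(
    vocab_size=8000,              # hypothetical small vocabulary
    hidden_size=128,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=512,
    max_position_embeddings=128,
)
model = BertForMaskedLM(config)   # untrained, ready for pretraining
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")  # orders of magnitude smaller than BERT-base
```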
