
thomasahle t1_je14a0c wrote

Are there any "small" LLMs, like 1MB, that I can include, say, on a website using ONNX to provide a minimal AI chat experience?


thedamian t1_je5eweg wrote

Before answering the question directly: I'd suggest keeping your models behind an API rather than shipping them to the client, which seems to be the constraint driving your question in the first place.

Behind an API, the model can be as big as you like, or as big as you can afford to run on your server.
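For illustration, the client side of that approach can be a plain `fetch` call. This is a minimal sketch; the `/api/chat` route and the `{ reply: string }` response shape are assumptions, not part of any particular framework:

```typescript
// Minimal sketch: the browser sends the prompt to a server-side endpoint
// (hypothetical route /api/chat); the full-size model runs on the server.
async function chat(prompt: string): Promise<string> {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`Chat API request failed: ${res.status}`);
  }
  // Assumes the server replies with JSON shaped like { reply: string }.
  const data: { reply: string } = await res.json();
  return data.reply;
}

// Usage:
// chat("Hello!").then(reply => console.log(reply));
```

The upside is that the browser only ever handles a few kilobytes of request/response text, regardless of how large the model behind the endpoint is.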
