
sebzim4500 t1_jan85s7 wrote

Yeah, I get that embeddings are used for semantic search, but would you really want to use a model as big as ChatGPT to compute the embeddings? (Given how cheap and effective Ada is.)
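(For context, a minimal sketch of what Ada-based semantic search looks like, assuming the `openai` Python package and the `text-embedding-ada-002` model; the documents and query here are just illustrative:)

```python
import numpy as np
import openai  # assumes the openai package is installed and an API key is configured

def embed(texts):
    # Ada embedding endpoint; returns one vector per input string
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([item["embedding"] for item in resp["data"]])

docs = ["how to reset a password", "best pizza in town", "login troubleshooting guide"]
doc_vecs = embed(docs)

query_vec = embed(["I can't sign in to my account"])[0]

# Ada vectors are (approximately) unit-length, so a dot product acts as cosine similarity
scores = doc_vecs @ query_vec
print(docs[int(np.argmax(scores))])  # expected: one of the login/password docs
```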

2

Timdegreat t1_jangbi7 wrote

You've got a point there! I haven't given it much thought, really -- I definitely need to check out Ada.

But wouldn't the ChatGPT embeddings still be better? Given that they're cheap, why not use the better option?

1

farmingvillein t1_japqcq1 wrote

> But wouldn't the ChatGPT embeddings still be better? Given that they're cheap, why not use the better option?

Usually, to get the best embeddings, you need to train the model somewhat differently than you would a "normal" LLM. So ChatGPT may not(?) be the "best" right now for that application.
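(Roughly, embedding models are usually tuned with a contrastive objective rather than next-token prediction. A minimal sketch of an in-batch-negatives loss in PyTorch, purely illustrative and not OpenAI's actual recipe; the temperature value is a common but arbitrary choice:)

```python
import torch
import torch.nn.functional as F

def in_batch_contrastive_loss(query_emb, doc_emb, temperature=0.05):
    # query_emb, doc_emb: (batch, dim) embeddings of matched (query, relevant doc) pairs
    q = F.normalize(query_emb, dim=-1)
    d = F.normalize(doc_emb, dim=-1)
    logits = q @ d.T / temperature           # (batch, batch) cosine similarities
    labels = torch.arange(q.size(0))         # the matching doc for query i is doc i
    # every other doc in the batch serves as a negative example
    return F.cross_entropy(logits, labels)
```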

2