sebzim4500 t1_jan01xr wrote
Reply to comment by Timdegreat in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Would you even want to? Sounds like overkill to me, but maybe I am missing some use case of the embeddings.
Timdegreat t1_jan7sel wrote
You can use the embeddings to search through documents. First, create embeddings of your documents. Then create an embedding of your search query, compute the similarity between each document embedding and the query embedding, and surface the top N documents.
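For anyone new to this, here is a minimal sketch of that pipeline. It assumes the pre-1.0 `openai` Python client and the text-embedding-ada-002 model (the "ada" model mentioned below); the document strings and the `embed` helper are illustrative, not from the thread.

```python
# Sketch: embedding-based document search with cosine similarity.
# Assumes the pre-1.0 `openai` client; model choice is illustrative.
import numpy as np
import openai

def embed(texts):
    """Return one embedding vector per input string."""
    response = openai.Embedding.create(
        model="text-embedding-ada-002",
        input=texts,
    )
    return np.array([item["embedding"] for item in response["data"]])

documents = [
    "Doc about whale migration",
    "Doc about GPU memory bandwidth",
    "Doc about pasta recipes",
]
doc_vectors = embed(documents)

query_vector = embed(["How do I cook spaghetti?"])[0]

# Cosine similarity between the query and every document.
scores = doc_vectors @ query_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
)

# Surface the top N documents.
top_n = 2
for idx in np.argsort(scores)[::-1][:top_n]:
    print(f"{scores[idx]:.3f}  {documents[idx]}")
```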
sebzim4500 t1_jan85s7 wrote
Yeah, I get that embeddings are used for semantic search, but would you really want to use a model as big as ChatGPT to compute the embeddings? (Given how cheap and effective Ada is.)
Timdegreat t1_jangbi7 wrote
You got a point there! I haven't given it too much thought really -- I def need to check out ada.
But wouldn't the ChatGPT embeddings still be better? Given that they're cheap, why not use the better option?
farmingvillein t1_japqcq1 wrote
> But wouldn't the ChatGPT embeddings still be better? Given that they're cheap, why not use the better option?
Usually, to get the best embeddings, you need to train them somewhat differently than you do a "normal" LLM. So ChatGPT may not(?) be "best" right now, for that application.
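As a concrete point of comparison, purpose-trained embedding models are available off the shelf. Here is a minimal sketch using the sentence-transformers library; the checkpoint name is just one commonly used example, not something from the thread.

```python
# Sketch: a model trained specifically for embeddings (sentence-transformers).
# The checkpoint name is an illustrative choice, not from the thread.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

docs = [
    "ChatGPT API pricing",
    "Whisper speech-to-text",
    "GPT-3 fine-tuning",
]
query = "How much does the ChatGPT API cost?"

doc_emb = model.encode(docs, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)

# Cosine similarity against every document, highest first.
scores = util.cos_sim(query_emb, doc_emb)[0]
for score, doc in sorted(zip(scores.tolist(), docs), reverse=True):
    print(f"{score:.3f}  {doc}")
```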