
ethereumturk t1_itj6hx2 wrote

SBert bi-encoder

3
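
For reference, a minimal sketch of what the bi-encoder suggestion looks like with the sentence-transformers library; the model name and sentences are just placeholder examples, not anything specified in the thread:

```python
from sentence_transformers import SentenceTransformer, util

# Load a pretrained SBert-style bi-encoder (model choice is an example assumption).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = ["The cat sits on the mat.", "A feline rests on a rug."]

# Encode each sentence independently into a fixed-size embedding.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Compare the two sentences via cosine similarity of their embeddings.
score = util.cos_sim(embeddings[0], embeddings[1])
print(score.item())
```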

fastglow t1_itlyz8a wrote

SBert is for comparing sentences/phrases; there's no reason to use it over a regular transformer encoder-decoder for language modelling. And getting a model like that to process 2000 tokens in under 4 seconds would be challenging without efficiency techniques such as quantization, pruning, or distillation.

2
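
As an illustration of one of those efficiency techniques, here is a minimal sketch of post-training dynamic quantization with PyTorch and Hugging Face Transformers; the checkpoint, prompt, and generation settings are placeholder assumptions, not anything from the thread:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Example encoder-decoder checkpoint (assumption); any seq2seq model works similarly.
name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

# Post-training dynamic quantization: nn.Linear layers run in int8 at inference,
# which typically reduces CPU latency and memory at a small accuracy cost.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

inputs = tokenizer("summarize: a long input document goes here", return_tensors="pt")
with torch.no_grad():
    out = quantized.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```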