kdqg t1_j2oo4rl wrote
Reply to comment by lukeiy in [D] What are good ways of incorporating non-sequential context into a transformer model? by abc220022
Also have a look at the Slot Attention mechanism, which does something similar but arguably more elegantly. A sketch is below.
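For reference, here is a minimal PyTorch sketch of Slot Attention (Locatello et al., 2020). It's an illustration under assumptions, not the thread's code: the hyperparameters (`num_slots`, `iters`, `dim`) are placeholders, and the residual MLP from the paper is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlotAttention(nn.Module):
    """Minimal sketch of Slot Attention (Locatello et al., 2020).

    Slots compete for input features via a softmax over the *slot* axis,
    then each slot is updated with a GRU cell. Useful for pooling a set of
    non-sequential context features into a fixed number of slots.
    """

    def __init__(self, num_slots: int, dim: int, iters: int = 3, eps: float = 1e-8):
        super().__init__()
        self.num_slots, self.iters, self.eps = num_slots, iters, eps
        self.scale = dim ** -0.5
        # Learned Gaussian initialisation for the slots.
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_logsigma = nn.Parameter(torch.zeros(1, 1, dim))
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.gru = nn.GRUCell(dim, dim)
        self.norm_inputs = nn.LayerNorm(dim)
        self.norm_slots = nn.LayerNorm(dim)

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (batch, num_inputs, dim), e.g. a set of context embeddings.
        b, n, d = inputs.shape
        inputs = self.norm_inputs(inputs)
        k, v = self.to_k(inputs), self.to_v(inputs)

        # Sample initial slots from the learned Gaussian.
        mu = self.slots_mu.expand(b, self.num_slots, -1)
        sigma = self.slots_logsigma.exp().expand(b, self.num_slots, -1)
        slots = mu + sigma * torch.randn_like(mu)

        for _ in range(self.iters):
            q = self.to_q(self.norm_slots(slots))
            # Softmax over the slot axis: inputs are distributed among slots.
            attn = F.softmax(torch.einsum('bid,bjd->bij', q, k) * self.scale, dim=1)
            # Renormalise over inputs so each slot takes a weighted mean of values.
            attn = attn / (attn.sum(dim=-1, keepdim=True) + self.eps)
            updates = torch.einsum('bij,bjd->bid', attn, v)
            # GRUCell expects 2D tensors, so fold the slot axis into the batch.
            slots = self.gru(updates.reshape(-1, d),
                             slots.reshape(-1, d)).reshape(b, -1, d)
        return slots  # (batch, num_slots, dim)
```

The softmax over the slot axis (rather than the input axis, as in standard attention) is what makes slots compete for inputs, which is the "arguably more elegant" part.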
kdqg t1_j2onz1m wrote
Reply to comment by ai-lover in [D] What are good ways of incorporating non-sequential context into a transformer model? by abc220022
Did ChatGPT write this?
kdqg t1_j5xzfx4 wrote
Reply to [D] Self-Supervised Contrastive Approaches that don’t use large batch size. by shingekichan1996
VICReg (Variance-Invariance-Covariance Regularization). It avoids negative pairs entirely, so it doesn't depend on a large batch size; see the sketch below.
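A minimal sketch of the VICReg objective (Bardes et al., 2022) for reference. The loss weights follow the paper's defaults; the function name and shapes here are illustrative, not from the comment.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a: torch.Tensor, z_b: torch.Tensor,
                sim_w: float = 25.0, var_w: float = 25.0, cov_w: float = 1.0,
                eps: float = 1e-4) -> torch.Tensor:
    """Minimal sketch of the VICReg loss (Bardes et al., 2022).

    z_a, z_b: (batch, dim) embeddings of two augmented views of the same
    inputs. No negative pairs are used, which is why VICReg works without
    the large batches that InfoNCE-style contrastive losses need.
    """
    n, d = z_a.shape

    # Invariance: pull the two views' embeddings together.
    sim_loss = F.mse_loss(z_a, z_b)

    # Variance: hinge keeping each dimension's std above 1 (prevents collapse).
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    var_loss = F.relu(1.0 - std_a).mean() + F.relu(1.0 - std_b).mean()

    # Covariance: penalise off-diagonal covariance to decorrelate dimensions.
    z_a = z_a - z_a.mean(dim=0)
    z_b = z_b - z_b.mean(dim=0)
    cov_a = (z_a.T @ z_a) / (n - 1)
    cov_b = (z_b.T @ z_b) / (n - 1)
    off_diag = lambda m: m - torch.diag(torch.diag(m))
    cov_loss = off_diag(cov_a).pow(2).sum() / d + off_diag(cov_b).pow(2).sum() / d

    return sim_w * sim_loss + var_w * var_loss + cov_w * cov_loss
```

The variance and covariance terms regularise each branch independently, which is what replaces the implicit repulsion that contrastive methods get from large pools of negatives.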