lukeiy t1_j2luz7z wrote on January 2, 2023 at 7:31 AM

Reply to [D] What are good ways of incorporating non-sequential context into a transformer model? by abc220022

Use another model to reduce the context to a vector, then append that vector to each token. This was the approach used in Set Transformers (TSPN).
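A minimal PyTorch sketch of the idea above (the module and all names here are illustrative, not from the paper): encode the unordered context set with a small per-element MLP plus mean pooling, broadcast the resulting vector across the sequence, and concatenate it to every token embedding before the transformer.

```python
import torch
import torch.nn as nn

class ContextConditionedEncoder(nn.Module):
    """Reduce a non-sequential context set to one vector, then
    concatenate that vector to every token embedding (hypothetical sketch)."""
    def __init__(self, d_token, d_ctx, d_model, nhead=4, nlayers=2):
        super().__init__()
        # Context encoder: per-element MLP followed by mean pooling,
        # so the result is permutation-invariant over the context set.
        self.ctx_mlp = nn.Sequential(
            nn.Linear(d_ctx, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        # Project the concatenated [token ; context] back to d_model.
        self.fuse = nn.Linear(d_token + d_model, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)

    def forward(self, tokens, context):
        # tokens: (B, T, d_token); context: (B, N, d_ctx), N unordered
        ctx_vec = self.ctx_mlp(context).mean(dim=1)            # (B, d_model)
        ctx_rep = ctx_vec.unsqueeze(1).expand(-1, tokens.size(1), -1)
        fused = self.fuse(torch.cat([tokens, ctx_rep], dim=-1))
        return self.encoder(fused)                             # (B, T, d_model)

model = ContextConditionedEncoder(d_token=16, d_ctx=8, d_model=32)
out = model(torch.randn(2, 5, 16), torch.randn(2, 3, 8))
```

The mean pool is what makes this fit "non-sequential" context: the vector is unchanged if the context elements are permuted, so no ordering is imposed on them.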