[D] What are good ways of incorporating non-sequential context into a transformer model? Submitted by abc220022 on January 2, 2023 at 12:23 AM in r/MachineLearning · 11 comments
bushcat89 wrote on January 2, 2023 at 2:39 AM: If I remember correctly, Temporal Fusion Transformers tackle the problem of incorporating non-sequential data into a transformer.
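For concreteness, here is a minimal PyTorch sketch of one common pattern in this family: embed the static (non-sequential) features once, then broadcast that embedding across every timestep before the transformer layers. This is a simplified illustration of the general idea, not the actual TFT architecture (which uses gated residual networks and variable selection), and all class and parameter names here are made up for the example.

```python
import torch
import torch.nn as nn

class StaticContextTransformer(nn.Module):
    """Toy encoder that conditions a transformer on non-sequential (static)
    features by projecting them into the model dimension and adding them to
    every timestep's representation. Illustrative only; not the TFT design."""

    def __init__(self, seq_dim, static_dim, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.seq_proj = nn.Linear(seq_dim, d_model)        # per-timestep features -> d_model
        self.static_proj = nn.Linear(static_dim, d_model)  # static features -> d_model
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x_seq, x_static):
        # x_seq: (batch, time, seq_dim); x_static: (batch, static_dim)
        h = self.seq_proj(x_seq)                        # (batch, time, d_model)
        c = self.static_proj(x_static).unsqueeze(1)     # (batch, 1, d_model)
        h = h + c                                       # broadcast static context over time
        return self.encoder(h)                          # (batch, time, d_model)

model = StaticContextTransformer(seq_dim=8, static_dim=3)
out = model(torch.randn(2, 16, 8), torch.randn(2, 3))
print(out.shape)  # torch.Size([2, 16, 64])
```

Other common variants of the same idea include prepending the static embedding as an extra token the sequence can attend to, or using it to gate the per-timestep representations rather than simply adding it.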