Submitted by abc220022 t3_100y331 in MachineLearning
farmingvillein t1_j2lnxdd wrote
Reply to comment by amnezzia in [D] What are good ways of incorporating non-sequential context into a transformer model? by abc220022
Or, for vector-valued context, just slam it into the start of the sequence directly (use a learned linear projection if you need to align its dimensionality with the model's embedding size).
If you feel the need, place some sort of separator token ('###') between the "context features" and the input data.
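A minimal NumPy sketch of the idea above: project the context vector to the model dimension, then prepend it, plus a separator slot standing in for '###', to the token embeddings before they enter the transformer. The names `W_proj` and `sep_embedding` are illustrative assumptions; in a real model both would be learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

d_ctx, d_model, seq_len = 16, 64, 10

# Assumed/illustrative parameters -- learned jointly with the model in practice.
W_proj = rng.normal(size=(d_ctx, d_model)) * 0.02        # maps context dim -> model dim
sep_embedding = rng.normal(size=(1, d_model)) * 0.02     # plays the role of the '###' token

context_vec = rng.normal(size=(d_ctx,))                  # non-sequential context features
token_embeddings = rng.normal(size=(seq_len, d_model))   # embedded input sequence

# Align dimensionality, then slot the context in at the start of the sequence.
ctx_embedding = context_vec @ W_proj                     # shape (d_model,)
inputs = np.concatenate(
    [ctx_embedding[None, :], sep_embedding, token_embeddings], axis=0
)

print(inputs.shape)  # (12, 64): 1 context slot + 1 separator + 10 tokens
```

From the transformer's point of view the context is now just an extra position at the front of the sequence, so attention can read it at every layer.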