
bushrod t1_j8m68xt wrote

This technique is similar to data augmentation, but with a specific focus on important samples. It may not have an established name, but it could be considered a form of "strategic oversampling" or "strategic repetition." By repeating these samples in every batch, you increase their influence on every gradient update, potentially helping the network converge to a solution that better accounts for them.
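For concreteness, here's a minimal sketch of one way to do this in PyTorch. The toy dataset, the `important_idx` indices, and the model are all hypothetical placeholders, not anything from a real codebase:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy data: 1,000 ordinary samples plus a handful of "important" ones.
X = torch.randn(1000, 20)
y = torch.randint(0, 2, (1000,))
dataset = TensorDataset(X, y)

important_idx = [3, 57, 412]  # hypothetical must-learn samples
X_imp, y_imp = X[important_idx], y[important_idx]

loader = DataLoader(dataset, batch_size=32, shuffle=True)
model = torch.nn.Linear(20, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

for xb, yb in loader:
    # Append the important samples to every batch so they contribute
    # to every gradient step (the "strategic repetition" described above).
    xb = torch.cat([xb, X_imp], dim=0)
    yb = torch.cat([yb, y_imp], dim=0)

    optimizer.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    optimizer.step()
```

A related alternative is to upweight the loss terms for those samples instead of duplicating them, which keeps the effective batch size fixed and makes the degree of emphasis easier to tune.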

It's worth noting that this may not always be appropriate or necessary, and it can lead to overfitting on the repeated samples if not used carefully. However, when a small number of samples have a disproportionate impact on the end application, repeating them in every batch can be a useful way to ensure the network learns to incorporate their information effectively.

:-P
