Tejalapeno t1_jdb3u06 wrote

Man, it would be cool if the comments here actually focused on the paper's contents and not on the reuse of an acronym for an outdated algorithm, because the results are extremely important for future scaling.

−9

Armanoth t1_jdc2vt0 wrote

While the paper is good and definitely presents a novel approach, re-using existing acronyms, especially such prominent ones, is a problem. The main purpose of these acronyms is to let readers easily identify and reference existing methods.

If your choice of acronym forces all subsequent research to spell out which SIFT is meant, it is not only a poor choice but also a point of confusion. Existing papers that mention SIFT are retroactively affected as well.

As many in this thread have pointed out, there are other equally catchy, non-overlapping acronyms that could have been chosen.

5

pm_me_your_pay_slips t1_jdeyz79 wrote

Sure, my next paper will introduce Transformers, a new method for distillation of neural network models.

1