Submitted by bo_peng t3_1135aew in MachineLearning
farmingvillein t1_j8s7ygo wrote
Reply to comment by gwern in [R] RWKV-4 14B release (and ChatRWKV) - a surprisingly strong RNN Language Model by bo_peng
This...is pretty astounding. Just have the grace to admit you were wrong, and move on.
> Telling someone to read the Related Works section of every one of a dozen papers in the Related Works section of a paper is a ridiculous thing to suggest
Then how can you possibly say:
> I don't think the Related Works section of that paper provides any useful references.
?
This is hardcore trolling. You can, and frequently do, do better than this.
You are literally pushing posts that are factually incorrect, and that you either know are factually incorrect, or are too lazy to validate either way.
This is the type of thing which blows up post quality in this sub.
> Giving someone a random reference and telling them to manually crawl the literature is not helpful.
This...is ridiculous. This is--traditionally--a very academic-friendly sub. This is how research works. "Here is where you can start a literature review on a bundle of related papers" is an extremely classic response which is generally considered helpful to complex and nuanced questions.
The underlying issue is actually very complex, as evidenced in part by the fact that your references do not actually answer the question. "Go read related works" can be obnoxious when there are one or two papers that do answer the question--but that is not the case here.
> In contrast, the two references I provided directly bore on the question
No they did not. They did not touch at all upon Transformers versus RNNs, which was the question. You've chosen to cherry-pick one slice of the problem and declare victory.
> It's not a strawman.
You don't seem to understand what a strawman is. Strawman:
> an intentionally misrepresented proposition that is set up because it is easier to defeat than an opponent's real argument.
I was not making this argument. You were making this argument. QED, this is a strawman.