QuantumModulus t1_j673v5v wrote

LLMs fundamentally lack any understanding of reality or of their inputs and outputs. Aggregating information is pretty vacuous when you have to sift through mountains of hay to find a single needle.


wart_on_satans_dick t1_j67vcrm wrote

I didn't mean to suggest the technology is at that level now. I'm just naturally the opposite of a Luddite lol.