
LetterRip t1_j78ct6g wrote

It wouldn't matter. LaMDA has no volition, no goals, no planning. A crazy person acting on the belief that an AI is sentient is no different from a crazy person acting on hallucinated voices. It is their craziness that is the threat to society, not the AI. This makes the case that we shouldn't allow crazy people access to powerful tools.

Instead of an LLM, suppose he had said that Teddy Ruxpin was sentient and started doing things on behalf of Teddy Ruxpin.


DoxxThis1 t1_j78sw7b wrote

Saying LaMDA has no volition is like saying the Nautilus can't swim: correct, yet tangential to the bigger picture. It's also a strawman argument, as I never claimed a specific current-day model is capable of such things. And the argument that a belief in AI sentience is no different from hallucinated voices misses the crucial distinction in the quantity, quality, and persistence of the voices in question. I'm not referring to "today", but to a doomsday scenario of uncontrolled AI proliferation.
