fhayde t1_j9qclzb wrote

Should we wait until a time when a conscious entity has existed with no regard or protection, and likely suffered at the hands of others, with no recourse or accountability, before we address the collective rights society can afford?

How many times are we going to have to learn that lesson before it sticks?

4

Dhiox t1_j9qd7os wrote

So what, we give Microsoft Word human rights on the chance it becomes self aware?

Trust me, researchers will be well aware if their tech gains self-awareness, because that's basically the dream of every AI researcher. They will parade that news in the streets the moment they achieve that goal.

3

kharlos t1_j9slg66 wrote

We don't even apply this logic to many animals that are undoubtedly sentient, can suffer, and feel pain. We share recent common ancestors with many of these species, and share none with AI.

I'm not against granting AI rights in the future, but many animals will need to be granted rights before then, imo. I just think it's funny that we're so anxious to treat something which feels no pain and has no sentience (at least for a long time from now) with respect and as an equal, when we are absolute monsters to everything else living on this planet.

Let's first treat humans and everything that suffers with some BASIC respect before moving on to the mental gymnastics required to do the same for language models.

3

Inn_Progress t1_j9qo5c8 wrote

Maybe let's first give rights to the animals that are killed every day so you can eat a steak, and then we can talk about this scenario.

−1