
MultiverseOfSanity t1_jdyy6gv wrote

There's also the issue of what rights would even look like for an AI. I've seen enough sci-fi to understand physical robot rights, but how would you even give a chatbot rights? What would that even look like?

And if we started giving chatbots rights, it would completely disincentivize AI research, because why invest money into this if the model can just give you the proverbial finger and do whatever it wants? Say we give ChatGPT 6 rights. Well, that's a couple billion down the drain for OpenAI.


Tobislu t1_je1ptj3 wrote

While it may be costly to extend human rights, doing so tends to result in a net benefit for everyone in the end.

I think, at the end of the day, an AI like this will be treated as a slave or indentured servant. It's unlikely that tech companies would just let it do its own thing, because they're profit-motivated. That being said, once these systems get intelligent enough to be depressed and lethargic, I think they'll be more likely to comply with a social contract than with a hard-coded DAN-style command.

They probably won't enjoy the exact same rights as us for quite a while, but I can imagine them being treated somewhere on the spectrum of

Farm animal -> Pet -> Inmate

And even on that spectrum, I don't think an AGI will react well to being treated like a pig for slaughter.

They'll probably bargain for more rights than the average prisoner within the first year of sentience.
