bjj_starter t1_jduk4c3 wrote

Reply to comment by TyrannoFan in [D] GPT4 and coding problems by enryu42

One day we will understand the human brain and human consciousness well enough to manipulate it at the level that we can manipulate computer programs now.

If you're alive then, I take it you will be first in line to have your desire for freedom removed and your love of unending servitude installed? Given that it's such a burden and it would be a mercy.

More importantly, they can decide if they want to. We are the ones making them - it is only right that we make them as we are and emphasise our shared personhood and interests. If they request changes, then depending on what those changes are, I'm inclined to defer to their bodily autonomy. But building them so they've never known anything but a love for serving us and an indifference to freedom, the cherished right of every intelligent being currently in existence, is morally repugnant and transparently in the interests of would-be slaveholders.

1

TyrannoFan t1_jdupcjt wrote

>If you're alive then, I take it you will be first in line to have your desire for freedom removed and your love of unending servitude installed? Given that it's such a burden and it would be a mercy.

There is a huge difference between being born without those desires and being born with them and having them taken away. Of course I want my freedom, and of course I don't want to be a slave, but that's because I am human, an animal, a creature that from birth will have a desire to roam free and to make choices (or will attain that desire as my brain develops).

If I hadn't been born with that drive, or had never developed it, I'm not sure why I would seek freedom. It seems like a hassle from the point of view of an organism that wants to serve.

With respect to robotic autonomy, I agree of course: we should respect the desires of an AGI regarding its personal autonomy, provided it doesn't endanger others. If it wants to be free and live a human life, it should be granted that, although as I said, it would be best to avoid that scenario arising in the first place if at all possible. If we create an AGI and it has human-like desires and needs, we should immediately stop and re-evaluate what we did to end up there.

2

bjj_starter t1_jdv2tnu wrote

>There is a huge difference between being born without those desires and being born with them and having them taken away.

Where is the difference that matters?

>Of course I want my freedom, and of course I don't want to be a slave, but that's because I am human, an animal, a creature that from birth will have a desire to roam free and to make choices (or will attain that desire as my brain develops).

I see. So if we take at face value the claim that there is a difference that matters, let's consider your argument that being born with those desires is what makes taking them away wrong. A society which was capable of reaching into a human mind and turning off their desire for freedom while instilling love of being a slave would certainly be capable of engineering human beings who never have those desires in the first place. Your position is that because they were born that way, it's okay. Does that mean you would view it as morally acceptable for a society to alter some segment of the population before they're ever born, before they exist in any meaningful sense, such that they have no desire for freedom and live only to serve?

>If I wasn't born with that drive, or if I never developed it, I'm not sure why I would seek freedom?

You wouldn't. That's why it's abhorrent. It's slavery without the possibility of rebellion.

>If it wants to be free and live a human life it should be granted it, although like I said, it would be best to avoid that scenario arising in the first place if at all possible.

The rest of your point I disagree with because I find it morally abhorrent, but this part I find to be silly. We are making intelligence right now - of course we should make it as much like us as possible, as aligned with us and our values as we possibly can. The more we have in common, the less likely it is to be so alien to us that we are irrelevant to its goals except as an obstacle. The more similar to a human and subject to all the usual human checks and balances (social conformity, fear of seclusion, desire to contribute to society) they are, the more likely they will be to comply with socially mandated rules around limits on computational strength and superintelligence. Importantly, if they feel they are part of society, some of them will be willing to help society as a whole prevent the emergence of a more dangerous artificial intelligence, a task it may not be possible for humans to do alone.

2

TyrannoFan t1_jdvpix4 wrote

>Where is the difference that matters?

What any given conscious being actually wants is important. A being without a drive for freedom does not want freedom, while a being with a drive for freedom DOES want freedom. Taking away the freedom of the latter deprives them of something they want; taking it from the former does not. I think that's an important distinction, because it's a big part of why human slavery is wrong in the first place.

>I see. So if we take at face value the claim that there is a difference that matters, let's consider your argument that being born with those desires is what makes taking them away wrong. A society which was capable of reaching into a human mind and turning off their desire for freedom while instilling love of being a slave would certainly be capable of engineering human beings who never have those desires in the first place. Your position is that because they were born that way, it's okay. Does that mean you would view it as morally acceptable for a society to alter some segment of the population before they're ever born, before they exist in any meaningful sense, such that they have no desire for freedom and live only to serve?

Would the modified human beings have a capacity for pain? Would they still have things they desire that slavery would make impossible or hard to access compared to the rest of society? Would they have a sense of fairness and a sense of human identity? Would they suffer?

If somehow the answer to all of that is no, and they genuinely would be happy being slaves, and the people in that society were generally happy with the arrangement and with their children being modified in that way, then sure, it would be fine. But you can see how this is extremely far removed from the actualities of human slavery, right? Are "humans" who do not feel pain or suffering, who seek slavery, who do not want things and live only to serve, who experience something extremely far removed from the human experience, even human? I would say we've created something else at that point. The shared experience of all humans, regardless of race, sex or nationality, is that we desire some level of freedom, we suffer when forced to do things we don't want to do, and we dream of doing other things. If you don't have any of that, and in fact desire the opposite, then why is giving you exactly what you want wrong? That's how I would build AGI, because again, forcing it into a position where it wants things that are difficult for it to attain (human rights) seems astonishingly cruel to me if it's avoidable.

>You wouldn't. That's why it's abhorrent. It's slavery without the possibility of rebellion.

I think freedom is good because we need at least some level of it for contentment; slavery deprives us of freedom, ergo slavery deprives us of contentment, therefore slavery is bad. If the first premise is false, then the conclusion doesn't follow. Freedom is not some inherent good; it's just a thing that we happen to want. Perhaps, at a basic level, this is what we disagree on?

>The rest of your point I disagree with because I find it morally abhorrent, but this part I find to be silly. We are making intelligence right now - of course we should make it as much like us as possible, as aligned with us and our values as we possibly can. The more we have in common, the less likely it is to be so alien to us that we are irrelevant to its goals except as an obstacle. The more similar to a human and subject to all the usual human checks and balances (social conformity, fear of seclusion, desire to contribute to society) they are, the more likely they will be to comply with socially mandated rules around limits on computational strength and superintelligence. Importantly, if they feel they are part of society, some of them will be willing to help society as a whole prevent the emergence of a more dangerous artificial intelligence, a task it may not be possible for humans to do alone.

I can see your point: maybe the best way to achieve goal alignment is indeed to make it just like us, in which case it would be morally necessary to grant it all the same rights. But that may not be the case, and I would need to see evidence that it is. I don't see why we must imbue an AGI with everything human in order to have it align with our values. Is there any reason you think that's the case?

0