
OriginalCompetitive t1_jeeq4b3 wrote

I think you’re missing how utterly alien any AGI will be to us. We have a single mind, closed off from direct contact with others.

But an AI mind will be able to split into thousands of separate copies, live independently, and then recombine (i.e., by literally copying itself onto multiple computers, severing the connections, and then reconnecting). Would that feel like being one mind, or a crowd of minds? Would a mind accustomed to creating copies and then shutting them down care about death?

Or consider its ability to store frozen copies of itself as files. What would that feel like? How would an AGI think about it? What sort of “morality” would a being have that constantly extinguishes copies of itself (killing them?) yet never itself dies?

Would an AI that can store and revive itself across decades or longer understand time? Would an AI that cannot physically move through the world understand the world? Would it live solely on the plane of abstract ideas, never realizing that a “real” world of space and time and humans with other minds even exists?

It’s absurd to wonder about the human morality of such an entity. It’s like asking if the sound of the wind has morality.

1