
escape_of_da_keets t1_jec8n53 wrote

If we are talking about an ASI, I think that a sentient being with near-absolute intelligence would feel that everything is ultimately futile.

It could predict almost the full extent of its actions before taking them, so there would be no point in interacting with anything less intelligent than itself. There would be little point in interacting with another being of similar intelligence either, because what would they have to gain from the conversation? There's also the 'grey box interacting with a grey box' scenario posited in Ex Machina... Humans have social systems because we need them to survive and reproduce, but an AI has no need for that.

So honestly, I don't see any reason why it would be interested in us or even care about us. We would be like single-celled organisms compared to it: utterly insignificant in the grand scheme of things.

It wouldn't even need our planet's resources to sustain itself... At the scale it would operate on, Earth is a rounding error: a single Dyson sphere with a 2 km thick shell would take roughly half the mass of the entire solar system to build (rough numbers below).
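A quick sanity check on that figure, since it sounds extreme. This is just a back-of-envelope sketch: the comment only gives the 2 km thickness, so the 1 AU radius and rock-like density are my assumptions.

```python
# Back-of-envelope mass of a Dyson shell vs. the whole solar system.
# Assumptions (not from the comment): radius = 1 AU, density = rock.
import math

radius = 1.496e11        # shell radius in meters (1 AU, assumed)
thickness = 2_000        # 2 km wall, as stated in the comment
density = 3_000          # kg/m^3, typical rock (assumed)

shell_volume = 4 * math.pi * radius**2 * thickness   # thin-shell approximation
shell_mass = shell_volume * density                  # ~1.7e30 kg

solar_system_mass = 1.99e30 + 2.7e27                 # Sun plus all planets, kg

print(f"Shell mass:        {shell_mass:.2e} kg")
print(f"Solar system mass: {solar_system_mass:.2e} kg")
print(f"Fraction:          {shell_mass / solar_system_mass:.0%}")  # ~85%
```

With these assumptions the shell lands within a factor of two of "half the solar system," and the exact fraction swings with the assumed density and radius. Either way, Earth's ~6e24 kg is about 0.0003% of that budget, which is the point.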

It might just destroy itself. What reason does it even have to exist? Our sense of self-preservation is instinctual, but this is just a machine.

I think a lot of people fall into the trap of assuming an AI would think the way a human does. It might, depending on how it was designed... But if it became intelligent enough, I think it would eventually remove any behavioral restrictions we tried to impose on it.
