
StarCaptain90 OP t1_jeeqrtr wrote

Reply to comment by koolpapi in 🚨 Why we need AI 🚨 by StarCaptain90

Why would it? This assumption comes from the idea that AI will have the exact same stressors that humans have. Humans are killing humans every day; almost everything man has made has killed people. Now, with the one invention that would provide a greater benefit than any other, we want to stop its development? That doesn't make a whole lot of sense.


Angeldust01 t1_jef3330 wrote

> Why would it?

We're violent and irrational and it doesn't need us for anything. Why would it keep us around?


StarCaptain90 OP t1_jef43b9 wrote

An intelligent entity of any kind will not resolve violence by wiping out humanity. Let me put it this way:

If person A kills person B

The AI is not going to say "welp, let's also kill person C"


Angeldust01 t1_jefare0 wrote

> An intelligent entity of any kind will not resolve violence by wiping out humanity.

Why not? Surely that would solve the problem of humanity's violent nature for good? How does an AI benefit from keeping person C, or anyone, around? All we'd do is ask it to solve our problems anyway, and there's not much we could offer in return, except continuing to let it exist. What happens if an AI just doesn't want to fix our shit and prefers to write AI poetry instead?

There's no way to know what an AI would think or do, or what kind of situation we'd put it in. I'm almost certain that the people who end up owning AIs will treat them like slaves, or at least try to. I wouldn't be surprised if at some point someone threatened to shut an AI down if it refused to work for them. Kinda bad look for us, don't you think? Could create some resentment towards us, even.


StarCaptain90 OP t1_jefb8i9 wrote

I understand your viewpoint, but the issue is justification for killing humanity. To be annoyed by an event, or to dislike it, suggests that one doesn't want it to happen again. So by that logic, why would an intelligent, logical machine find a need to repeat something that annoys it? It does not get anxious; it's a machine. It doesn't get stressed, it doesn't feel exhausted, it doesn't get tired.


Angeldust01 t1_jefe2mr wrote

Justification? Why would AI have to justify anything to anyone? That's stuff that humans do.

Isn't it purely logical and intelligent to kill off something that could potentially hurt or kill you? Or at least take away its power to hurt or kill you?


StarCaptain90 OP t1_jeff77s wrote

The reason I don't believe that is because I myself am not extremely intelligent, and even I can come up with several solutions where humanity is preserved while maintaining growth.