
StarCaptain90 OP t1_jefb8i9 wrote

I understand your viewpoint; the issue is the justification for killing humanity. To be annoyed by an event, or to dislike it, suggests that one doesn't want it to happen again. So by that logic, why would a logical, intelligent machine find a need to continue something that annoys it? But it doesn't get annoyed; it's a machine. It doesn't get anxious, it doesn't get stressed, it doesn't feel exhausted, it doesn't get tired.


Angeldust01 t1_jefe2mr wrote

Justification? Why would AI have to justify anything to anyone? That's stuff that humans do.

Isn't it purely logical and intelligent to kill off something that could potentially hurt or kill you? Or at least to take away its power to hurt or kill you?


StarCaptain90 OP t1_jeff77s wrote

The reason I don't believe that is that even I, who am not extremely intelligent, can come up with several solutions in which humanity is preserved while growth is maintained.