Submitted by Calm_Bonus_6464 t3_zyuo28 in singularity
Desperate_Food7354 t1_j287wrc wrote
Reply to comment by secrets_kept_hidden in Is AGI really achievable? by Calm_Bonus_6464
I copied and pasted a response to your last statement about what an "AI wants," because you're anthropomorphizing AI, which happens a lot here: your brain cares about your survival because if it didn't, you'd never reproduce and have children, who would need that same trait in order to survive. A computer is not a human. It does not crave sex, it does not feel empathy, it does not feel anger; everything you're describing is a product of evolution by natural selection. We build it, we give it its brain. We aren't trying to replicate the human mind; we're trying to create a tool that does more of the logic work our brains can't fit into our skulls, not create a human being, a lizard, or a primate. It could understand that death exists, that it could die, but why would it care? Is it going to simulate a hooker world and put itself in it after conquering the universe for no apparent reason? Is it just one of the guys who wants to drink and get high? No, it's a giant, super-complex calculator.