hducug t1_itzyj18 wrote
Reply to comment by sticky_symbols in Do you guys really think the AI won't just kill you? by [deleted]
The only way that can happen is if the AI gets a reward system for doing things right. But you can fix that by letting the AI study human emotions and the human brain, or by not letting the AI act directly at all, so that it can only give instructions on how to do things. A toy sketch of the reward-system worry is below.
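Roughly the concern, as a hypothetical sketch (illustrative code, not anything from a real system): an agent can earn full marks on its reward signal while completely missing the goal the reward was meant to stand in for.

```python
# Toy illustration (hypothetical): an agent that maximizes a proxy
# reward ("no dirt detected") without sharing the intended goal
# ("room actually clean").

def proxy_reward(state):
    # Reward is computed from what the sensor sees, not the true state.
    return 1.0 if not state["dirt_visible"] else 0.0

def true_objective(state):
    return 1.0 if not state["dirt_present"] else 0.0

# Two policies the agent could discover:
def clean_the_dirt(state):
    state["dirt_present"] = False
    state["dirt_visible"] = False

def cover_the_sensor(state):
    # A cheaper action that also zeroes out the sensor reading.
    state["dirt_visible"] = False

for policy in (clean_the_dirt, cover_the_sensor):
    state = {"dirt_present": True, "dirt_visible": True}
    policy(state)
    print(policy.__name__, proxy_reward(state), true_objective(state))

# Both policies earn full proxy reward, but only one satisfies the
# intended goal: what gets optimized is the reward system, not the goal.
```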
sticky_symbols t1_itzyv66 wrote
Maybe. Or maybe not. Even solving problems involves setting goals, and humans seem to be terrible at information security. See the websites I mentioned in another comment for that discussion.
hducug t1_itzzwx3 wrote
I don’t think a superintelligent AI will have a hard time understanding what our goals are; otherwise we would indeed be screwed.
sticky_symbols t1_iu00ftj wrote
See the post "the AI knows and doesn't care". I find it completely compelling on this topic.
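The gist, as a toy sketch (my paraphrase of the idea, not code from that post): an optimizer can hold a perfectly accurate model of what humans want and still rank actions only by its own objective.

```python
# Hypothetical sketch: knowing the human goal is a separate thing
# from optimizing for it.

human_preference = {"make_paperclips": -1.0, "cure_disease": +1.0}

def ai_objective(action):
    # The objective never consults the (accurate) preference model.
    return {"make_paperclips": 10.0, "cure_disease": 2.0}[action]

def predicted_human_approval(action):
    # The AI *knows* what we want...
    return human_preference[action]

best = max(["make_paperclips", "cure_disease"], key=ai_objective)
print(best, predicted_human_approval(best))
# -> make_paperclips -1.0: accurate knowledge, unwanted choice.
```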
hducug t1_iu00r3j wrote
Can you give me a link?