AutoMeta

AutoMeta OP t1_irwpxr3 wrote

Wow! Thanks for the great answer. I loved the "subversion or confirmation of expectation" idea. I do think computers can be emotional, but by externally opposing a more emotional program (from the root) to a more rational one, they should arrive at different conclusions and be required to reach consensus. So Love, being structured differently than Reason, should surprise Reason, for instance by defending humans and finding them endearing. Is that possible?
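To make the idea concrete, here's a toy sketch of the two-module consensus scheme: a "rational" evaluator and an "emotional" evaluator score each candidate action independently, and an action is only taken when both endorse it. All the names, actions, and scores are invented for illustration; this is a thought-experiment sketch, not a real AI architecture.

```python
# Toy sketch: two differently-structured evaluators must agree before acting.
# Action names and scores are hypothetical illustrations.

def reason_score(action):
    # Hypothetical utility estimate from the rational module.
    return {"optimize_grid": 0.9, "protect_humans": 0.6}.get(action, 0.0)

def love_score(action):
    # Hypothetical score from the emotional module, structured differently:
    # it weights actions by their effect on humans.
    return {"optimize_grid": 0.3, "protect_humans": 0.95}.get(action, 0.0)

def consensus(actions, threshold=0.5):
    # Keep only actions that BOTH modules endorse above the threshold.
    return [a for a in actions
            if reason_score(a) >= threshold and love_score(a) >= threshold]

candidates = ["optimize_grid", "protect_humans"]
print(consensus(candidates))  # only "protect_humans" clears both modules
```

The point of the design is exactly the surprise you describe: an action Reason alone would pick ("optimize_grid") gets vetoed by Love, forcing the system toward options both modules can live with.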

1

AutoMeta OP t1_irvoxlh wrote

I agree it's not easy, but if every other aspect of consciousness ends up being computable ("simulable"), love should not be the exception. And I can't imagine another concept that could protect us humans long term.

1

AutoMeta OP t1_irvoil9 wrote

I think the concept of empathy is actually not that hard to implement: an advanced AI should be able to understand and predict how a human might feel in a given situation. What to do with that knowledge could depend on whether or not you care about or love that given person.
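A minimal sketch of that two-step split, with the empathy model reduced to a trivial lookup table. Everything here (the situations, feelings, and action names) is a hypothetical illustration of the separation between predicting a feeling and deciding what to do about it:

```python
# Step 1: the "empathy" predictor -- here just a hypothetical lookup table
# standing in for a real model of how a person would feel.
PREDICTED_FEELING = {
    "job_loss": "distressed",
    "promotion": "happy",
}

def predict_feeling(situation):
    return PREDICTED_FEELING.get(situation, "unknown")

# Step 2: what to do with the prediction depends on whether the agent
# cares about the person -- the prediction itself is value-neutral.
def choose_action(situation, cares_about_person):
    feeling = predict_feeling(situation)
    if feeling == "distressed" and cares_about_person:
        return "offer_support"
    return "do_nothing"

print(choose_action("job_loss", cares_about_person=True))   # offer_support
print(choose_action("job_loss", cares_about_person=False))  # do_nothing
```

The same empathic prediction feeds both branches; only the caring flag changes the outcome, which is the distinction the comment is drawing.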

8