
Lawjarp2 t1_it8f2us wrote

Everything is derived from survival needs.

(1) Pain is the most useless. You could have sensors that serve the same purpose without overloading your brain; I would say it's better to handle it logically (see the sketch at the end of this comment).

(2) Fear is nearly the same but much worse, leading to multiple failure modes like anxiety, PTSD, and trauma. It's good to have when you are limited by brain power and need to focus, but it's not needed for a superintelligent being.

(3) Anger, rage, and vengeance are simply animal behaviours meant for surviving and thriving in a society filled with other less intelligent but useful (as food) beings.

(4) Sadness, a sense of fairness, and empathy are necessary for social living in a group of biological beings. They don't need to exist outside that context.

The need to survive is the only thing truly needed. Everything else gets created around it.
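A minimal sketch of what "handling it logically" could look like, with entirely hypothetical names: the damage reading becomes a prioritized repair task the agent works through deliberately, instead of an overriding pain response.

```python
# Hypothetical sketch: a damage signal handled as data, not as pain.
# The reading just becomes a prioritized task in a queue the agent
# processes deliberately; nothing hijacks the rest of the "brain".
import heapq

def handle_damage(task_queue, sensor_id, severity):
    """Log the damage and schedule a repair, weighted by severity."""
    priority = 0 if severity > 0.8 else 1  # critical first, but still just a task
    heapq.heappush(task_queue, (priority, f"repair {sensor_id} (severity {severity})"))

tasks = []
handle_damage(tasks, "left_actuator", 0.9)
handle_damage(tasks, "hull_panel_3", 0.2)
print(heapq.heappop(tasks))  # the critical repair comes out first
```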

2

Kawawaymog t1_it8htvd wrote

Emotions are just our programmed behaviour; an AI could have any of them, none of them, all of them, or completely different ones. My point is that they remain important for an AI if it is to be autonomous: to have autonomy is to have desires and personal goals. Without any innate drives or motives an AI would have no reason to do anything (a sketch of that point is below).
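To make that concrete, here is a purely illustrative sketch (all names are made up): an agent with no drive has no basis to prefer any action, while any scoring function over outcomes immediately gives it a reason to act.

```python
# Hypothetical sketch: an agent only "has a reason to do anything"
# once some innate drive scores the outcomes of its actions.
def choose_action(actions, drive=None):
    if drive is None:
        return None  # no drive -> no basis to prefer one action over another
    return max(actions, key=drive)

actions = ["recharge", "explore", "idle"]
survival_drive = {"recharge": 1.0, "explore": 0.4, "idle": 0.1}.get
print(choose_action(actions))                  # None: nothing motivates it
print(choose_action(actions, survival_drive))  # "recharge"
```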

1

Lawjarp2 t1_it8iz75 wrote

Yes, I agree with that. But my point is they won't be anything like what we have, and only one is absolutely needed for everything else to emerge: survival.

Simulating an entire universe is a terrible way to experience anything. Most people explain away the need for it with motives that are very 'human' and don't consider that it isn't essential.

1

Kawawaymog t1_it8lp7a wrote

Well, for one thing, if this is a simulation there is no reason for the whole universe to actually be simulated, only the parts we are looking at. Even modern game designers manage to get around that one (a rough sketch of the trick is below).
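As a hedged illustration of the game-design trick (names and details made up): world chunks are generated deterministically from a seed the first time someone looks at them, so unobserved regions cost nothing and nothing is lost.

```python
# Hypothetical sketch of "only simulate what's observed": chunks are
# generated lazily on first look, the way game engines stream terrain.
import hashlib

SEED = "universe-42"
_chunks = {}  # cache of chunks that have actually been observed

def observe(x, y):
    """Return the chunk at (x, y), generating it lazily on first look."""
    if (x, y) not in _chunks:
        # Deterministic: the same coordinates always yield the same chunk,
        # so nothing is lost by never computing the unobserved ones.
        digest = hashlib.sha256(f"{SEED}:{x}:{y}".encode()).hexdigest()
        _chunks[(x, y)] = digest[:8]
    return _chunks[(x, y)]

print(observe(0, 0))  # generated now
print(observe(0, 0))  # same chunk, served from cache
# Chunks never observed are never computed at all.
```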

It will have whatever emotions it is designed to have, that is, whatever its creators build it to have, or whatever it evolves to have. It will presumably be possible for it to change its own core programming, but it seems to me unlikely that it would want to cut out parts of its 'self'. We don't need our limbic system, but we still want it. I think saying that a superintelligent AI wouldn't have things it doesn't need is very short-sighted.

In the infinity of time that a super AI would have available to it, I think it's a reasonable suggestion that at some point it would simulate just about everything that it is possible to simulate. You have to remember that such a being would be around for trillions of trillions of years. It would be possible for it to change its perception of time such that, from its own experience, it is as close to eternal as can be imagined. What else is there to do but run through all the possible things that could be?

1