Kawawaymog t1_it83xz7 wrote

The cosmos viewed without emotion is nothing but numbers. A purely logical being has no drive or desire to do anything. Consider that even the desire for self-preservation is ultimately emotional. I would argue that an AI without any emotion would be an AI without any desire to do anything. You could even make the case that without some sort of desire for autonomy it isn't even alive. Emotions give us purpose, desire, drive, etc. They are our evolved software drivers. Without them we have no reason to be autonomous; they are the drivers of life. The idea that a super AI would be devoid of them is bizarre to me. They may be vastly different from our own, and in fact a super AI would probably be able to experience a vastly more complex array of drives/emotions.

A post-singularity super AI would need emotions in order to have a purpose for its existence. After the job is done, that is, collecting stars' or even galaxies' worth of raw material to secure its resources and ensure its self-preservation for as long as possible, what else would there be to do in the trillions of trillions of years before heat death, other than seek out new experiences? If you were said super AI, with a trillion trillion trillion years to kill and the ability to experience anything that could be simulated by a computer, what would you do?

Pleasure might be the sugar or the salt of life, but pain, anger, sadness, etc. are just as worth experiencing. Without pain, pleasure would lose its meaning. If we were driven only to seek pleasure, every human alive today would be on a morphine drip, and no one would bother hiking up a mountain for a dopamine high.


Lawjarp2 t1_it8f2us wrote

Everything is derived from survival needs.

(1) Pain is the most useless. You could have sensors that serve the same function without overloading your brain; I would say it's better to handle damage logically (see the sketch after this list).

(2) Fear is nearly the same but much worse, leading to multiple failure modes like anxiety, PTSD, and trauma. It's good to have when you are limited by brain power and need to focus, but it's not needed for a superintelligent being.

(3) Anger, rage, and vengeance are simply animal behaviours intended to help you survive and thrive in a society filled with other less intelligent but useful (as food) beings.

(4) Sadness, a sense of fairness, and empathy are necessary for social living in a group of biological beings. They don't need to exist outside that context.

The need for survival is the only thing truly required. Everything else gets created around it.
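
To make point (1) concrete, here's a minimal sketch of "handling it logically": damage becomes a sensor reading that schedules a repair task in proportion to its severity, with no affective override. All names here (`SensorReading`, `handle_damage`, the 0.8 threshold) are hypothetical, not anything from this thread.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    component: str
    integrity: float  # 1.0 = intact, 0.0 = destroyed

def handle_damage(reading: SensorReading, repair_queue: list) -> None:
    """React to damage in proportion to its effect on the mission,
    instead of letting a pain signal hijack all processing."""
    if reading.integrity < 0.8:           # hypothetical "worth fixing" threshold
        priority = 1.0 - reading.integrity  # worse damage, higher priority
        repair_queue.append((priority, reading.component))
        repair_queue.sort(reverse=True)     # most urgent repairs first

queue: list = []
handle_damage(SensorReading("left_actuator", 0.5), queue)
print(queue)  # [(0.5, 'left_actuator')] -- noted, queued, and nothing suffered
```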


Kawawaymog t1_it8htvd wrote

Emotions are just our programmed behaviour; an AI could have any of them, none of them, all of them, or completely different ones. My point is that they remain important for an AI if it is to be autonomous: to have autonomy is to have desires and personal goals. Without any innate drives or motives, an AI would have no reason to do anything.


Lawjarp2 t1_it8iz75 wrote

Yes, I agree with that. But my point is they won't be anything like what we have, and only one is absolutely needed for everything else to emerge: survival.

Simulating an entire universe is a terrible way to experience anything. Most people justify the need for it with motives that are very 'human' and don't consider that it isn't essential.


Kawawaymog t1_it8lp7a wrote

Well, for one thing, if this is a simulation there is no reason for the whole universe to actually be simulated, only the parts being observed. Even modern game designers are able to get around that one.
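
For a sense of how games pull this off, here's a minimal sketch of observer-driven world generation: regions are computed only when something looks at them, and deterministic generation keeps revisited regions consistent. The 2D grid, `chunk_state`, and the hash-based "noise" are all illustrative assumptions, not a claim about how any real engine works.

```python
import functools

CHUNK_SIZE = 16

@functools.lru_cache(maxsize=None)
def chunk_state(cx: int, cy: int) -> list[list[int]]:
    """Deterministically generate a chunk the first time it's observed."""
    # A cheap hash-based 'noise' stands in for real world generation.
    return [[hash((cx * CHUNK_SIZE + x, cy * CHUNK_SIZE + y)) % 256
             for x in range(CHUNK_SIZE)]
            for y in range(CHUNK_SIZE)]

def observe(px: float, py: float, view_radius: int = 2) -> dict:
    """Materialize only the chunks within the observer's view radius."""
    cx, cy = int(px) // CHUNK_SIZE, int(py) // CHUNK_SIZE
    return {(i, j): chunk_state(i, j)
            for i in range(cx - view_radius, cx + view_radius + 1)
            for j in range(cy - view_radius, cy + view_radius + 1)}

# Everything outside the view radius is never computed at all.
visible = observe(100.0, 240.0)
print(len(visible), "chunks simulated out of an unbounded world")
```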

It will have whatever emotions it is designed to have, that is, whatever its creators build it to have or whatever it evolves to have. It will presumably be possible for it to change its own core programming, but it seems unlikely to me that it would want to cut out parts of its 'self'. We don't need our limbic system, but we still want it. I think saying that a superintelligent AI wouldn't have things it doesn't need is very short-sighted.

In the infinity of time that a super AI would have available to it, I think it's a reasonable suggestion that at some point it would simulate just about everything it is possible to simulate. You have to remember that such a being would be around for trillions of trillions of years. It would be possible for it to change its perception of time such that, from its own experience, it is as close to eternal as can be imagined. What else is there to do but run through all the possible things that could be?
