nolitos t1_j9es3vd wrote

People often ask and talk about this, but the real question is: does an AI need emotions? What's their function? If there's none, then why would it need them?

17

ringobob t1_j9fn0ek wrote

Emotions are just a way of encoding additional information in order to help us predict the future by analyzing the past, without having to remember everything. It's imperfect at best.

Presumably, an AI wouldn't need emotions for the same purpose, since it can (theoretically) actually remember everything. However, since one of an AI's primary purposes is to interact with emotional humans, it should at least have an understanding of how they work in order to work within that system. That means being able to empathize. Or it'll just wind up being ignored.

13

SL1MECORE t1_j9g4v7z wrote

Emotions evolved before higher order thinking did. What are y'all on about?

−1

s0cks_nz t1_j9gxfk4 wrote

Isn't that what they are saying? A primitive, imperfect tool used prior to higher thinking and reasoning.

8

SL1MECORE t1_j9hbsz8 wrote

Ah, you're correct. I should have thought a bit more about that; I thought they were dismissing emotions as unnecessary overall. That's completely my bad, thank you. /genuinely

I kind of just... I know it's extremely early to say, but philosophically speaking, if an AI says it 'feels', whether that's its code or an emergent consciousness, who are we to judge?

I'm not saying run to Congress right now lol, but I just wonder what gives Us the right to say Other Beings feel, depending on how much their Feelings resemble ours. Not worded well, sorry! Thanks again for your gentle correction.

5

sunplaysbass t1_j9f0u6k wrote

They could be emergent elements like other aspects of AI. Your comment suggests they would be programmed in…or out.

8

rawrc t1_j9fbzrj wrote

I don't want my sex-bot to fake it, otherwise I'd just keep having sex with my gf

7

smellsmira t1_j9f86xc wrote

Well, emotions are both very valuable and detrimental to human decision making, so I guess the answer would be both for AI.

2

nolitos t1_j9g06j3 wrote

>Well, emotions are both very valuable and detrimental to human decision making.

Except that they aren't. Your eyes "see" a lion and send signals to your brain. It sends signals to your adrenal glands, they produce adrenaline, and you run. Emotions and even your consciousness don't participate in the process. AFAIK, there's no scientific proof that we need emotions to function.

One curious experiment on decision making: https://www.nature.com/articles/news.2008.751

For all we know, our consciousness is simply making up good stories for us: https://www.nature.com/articles/483260a

0

smellsmira t1_j9g2954 wrote

What you’re proposing is a relatively new line of research and not accepted by mainstream psychology. Emotions as we understand them now absolutely do affect our decision making. It is interesting, though, and perhaps the consensus will change.

3

nolitos t1_j9ga282 wrote

We are not talking about psychology. Psychology studies waves, not the ocean.

0

smellsmira t1_j9l1wij wrote

Not sure what this comment even means.

Emotions definitely affect decisions. Your example is an instinct-centric one. A better example of emotions affecting decisions would be owning a stock that drops 50% and then selling out of fear. Or watching the stock market soar and, feeling like you’re missing out, piling your life savings into it.

1

nolitos t1_j9l3sba wrote

Sure, you can choose to ignore scientific evidence and live in the illusion that you're in full control via your consciousness and emotions. I'm sorry, I was mistaken in thinking that we could have a serious discussion here. My bad.

0

AngryAmericanNeoNazi t1_j9ined7 wrote

We want to be God and make something in our image

1

EconomicRegret t1_j9m3uns wrote

Funny enough, the Bible specifically bans it (you shall make nothing in the image of anything). But the Bible goes on to say that mankind will disobey and continue creating things in the image of God's creations, until one day, in the end times, mankind succeeds in creating life: an "image" that can speak, perform great miracles, and rule over and subjugate humanity (known as the Anti-Christ).

It will force humanity to take its mark on the right hand or on the forehead (without which you can't buy or sell anything) and worship it as a god for 3.5 years, killing all those who reject it, at which point God will intervene to put a stop to the madness.

That's almost 2000-year-old science fiction... lol

2

Surur t1_j9ezsf8 wrote

I think emotion is just a bias that influences decision making. An AI will presumably be able to make decisions more precisely than that, though in our messy world having such shortcuts may actually be better and more efficient than keeping a full list of someone's previous history in your "context window".

−2

EconomicRegret t1_j9m5fef wrote

IMHO, they're more like automated and coordinated conscious subroutines. E.g. a lion suddenly appears in front of you; fear kicks in and automatically gives you everything your body's got to survive: all non-priority tasks are shut down (digestion stops, and you may literally shit your pants), and chemicals are pumped into your system to enhance performance (adrenaline, cortisol, etc.).

And those emotions can be retrained (e.g. somebody fearful of spiders can be "brain-washed" into feeling comfortable with them), so they are tools. If a trigger isn't adequate anymore, or new triggers are needed, one can retrain oneself.

That's why I argue computers already have emotions. They only lack consciousness to feel them.
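The subroutine analogy above could be sketched in code, purely as illustration: a high-priority trigger preempts background tasks and reallocates resources to a survival response. All names here (`Organism`, `fear_response`, etc.) are hypothetical, not from any real system.

```python
from dataclasses import dataclass, field

@dataclass
class Organism:
    # Low-priority "subroutines" that run when nothing urgent is happening.
    background_tasks: set = field(default_factory=lambda: {"digestion", "daydreaming"})
    adrenaline: float = 0.0

    def fear_response(self, threat: str) -> str:
        self.background_tasks.clear()  # non-priority tasks shut down
        self.adrenaline = 1.0          # performance chemicals pumped in
        return f"flee from {threat}"   # everything routed to survival

org = Organism()
action = org.fear_response("lion")
# After the trigger: background tasks suspended, adrenaline maxed out.
```

In this toy model, "retraining" an emotion would just mean changing which triggers call `fear_response`, which mirrors the point that the triggers, not the mechanism, are what gets relearned.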

2