Submitted by MultiverseOfSanity t3_117lxe8 in singularity

Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not?

It would certainly make the games more interesting if you knew the NPCs could feel. How would you change your behavior?

The computer can always just generate more NPCs when you kill them, so it isn't as if they would be a precious resource anymore. Nor would they be irreplaceable.

17

Comments


quitepossiblesure t1_j9cf69z wrote

A game generating infinite consciousnesses capable of thinking and experiencing just to torture them? Hmm

51

gobbo t1_j9cos2f wrote

Demiurge 2.0, on all platforms. Pre-order now!

15

enkae7317 t1_j9eenzy wrote

You play a game, and the random NPC you just killed has an entire history generated by AI: his birthday, his passions, fears, and life accomplishments, everything he did in his life leading up to the very moment he was killed by the PC. A full glossary of his life can be accessed.
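
Under the hood, a rough sketch of what that record might look like (purely illustrative; the fields and the generation step are assumptions, not any real engine's API):

```python
# Purely illustrative sketch: an NPC whose life history is filled in by a
# generative model when it spawns. The generation step is a hypothetical
# LLM call, not a real engine or library API.
from dataclasses import dataclass, field

@dataclass
class NPCBiography:
    name: str
    birthday: str
    passions: list[str]
    fears: list[str]
    accomplishments: list[str]
    # Events leading up to the moment the player encounters them.
    life_events: list[str] = field(default_factory=list)

def spawn_npc() -> NPCBiography:
    # In a real game this would prompt a language model and parse the
    # output; here the values are stubbed in for illustration.
    return NPCBiography(
        name="Franz",
        birthday="3 March 1921",
        passions=["raising dogs"],
        fears=["never seeing home again"],
        accomplishments=["kept his squad alive through winter"],
        life_events=["conscripted into the war"],
    )

# The "full glossary of his life" is then just this record, rendered in
# the game's UI when the player inspects the NPC.
```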

That'll be game-changing.

5

turnip_burrito t1_j9cfvyg wrote

> Would you play a videogame with AI advanced enough that the NPCs truly felt fear and pain when shot at? Why or why not?

No, wtf? I'm not a psychopath.

> It would certainly make the games more interesting if you knew the NPCs could feel. How would you change your behavior?

I'd treat them like people then.

> The computer can always just generate more NPCs when you kill them, so it isn't as if they would be a precious resource anymore. Nor would they be irreplaceable.

Preciousness is a value judgement, so it's up to the individual to decide whether they're irreplaceable.

51

[deleted] t1_j9e8i2q wrote

[deleted]

2

Ragondux t1_j9e9cke wrote

We hate Nazis because of the choices they made. If an NPC was created as a Nazi for your enjoyment, he never made a choice; it's not his fault.

10

[deleted] t1_j9e9rd2 wrote

[deleted]

−3

turnip_burrito t1_j9ea3st wrote

To me a created Nazi (even in a video game) and a naturally developed one, if both sentient, would get the same treatment.

3

turnip_burrito t1_j9e9ne5 wrote

I don't believe in creating unnecessary suffering, even if it is a Nazi. Retributive approaches to justice are cruel, I believe.

I'd want to rehabilitate them ideally, or move them to a place where they are unproblematic.

7

DungeonsAndDradis t1_j9ew7g5 wrote

There were similar discussions when Sniper Elite 5 came out. All of the Nazi soldiers had back stories. You could view a little blurb about them by focusing on them for a few seconds.

Some were "Hanz was too rough with prisoners, so they put him on guard duty" and some were "Franz wants to get out of the war and go raise dogs".

Players were saying "I view everyone, and kill the bad guys." And some said "They're all Nazis, they're all bad guys." And some said "Some of them were conscripted into the war."

But at the end of the day, they're Nazis, so they get a bullet in the brain.

1

lurk-moar t1_j9cijtj wrote

Westworld called, it wants its storyline back.

42

MultiverseOfSanity OP t1_j9cj9wy wrote

That's more or less where I got the idea.

6

dasnihil t1_j9ew8yq wrote

If something is not of a physical form right in front of us, the guilt or discomfort goes away for most people, and gamers will never really care about the feelings of an LLM-based NPC. I know I wouldn't, because I'd have to be convinced of something being sentient first. I've killed a lot of mosquitoes in my life; they're intelligent tiny little insects, but they don't have anywhere near enough neurons to virtualize suffering and consciousness. Same with these LLMs for me. It's kind of a conspiracy-theorist mentality to discuss these things right now, when we know we're several years away from AGI, and god knows if ASI is possible without a biological substrate. I believe it is possible, but we have a long way to go.

2

Nukemouse t1_j9wowdt wrote

I missed the original film, which had (for the time) an advanced sci-fi plot about a computer virus, something not heavily discussed in media back then.

1

ShoonSean t1_j9ckq9n wrote

Hell no. It would be cool to have more advanced AI react to combat and whatnot more realistically, but making them feel the fear and pain they're otherwise simply emulating is psychotic.

27

--FeRing-- t1_j9cx81g wrote

The next logical question: if the NPC reacts as if it is afraid of death, even to the point of being able to describe why it is afraid of death and to relate to you the concept of pain and its direct connection to your actions, how do you know it isn't ACTUALLY feeling fear and pain?

10

ShoonSean t1_j9dbiqg wrote

I suppose it's possible. Hard to say. It might be possible to generate "dumb" AI in the future that are basically just more advanced versions of the language models we have today. Good enough to act as actors in whatever you need them for. I'm sure there will be moral conundrums of some form, but maybe AI intelligence will end up being different in ways that our moral concerns don't bother it in the slightest.

4

turnip_burrito t1_j9eafjn wrote

It might be that if we separate the different parts of the AI enough in space, or add enough communication delays between the parts, then it won't experience feelings like suffering, even though the outputs are the same?

Idk, there's no answer.

2

SgathTriallair t1_j9ebuq3 wrote

Character AI already exists. If you could crunch that down into a game, I think it would be more than capable of simulating a personality better than we would ever need in a video game.

2

Spire_Citron t1_j9dxt33 wrote

Wouldn't you need some sort of mechanism through which to experience pain? Like, even if something is smart enough to perfectly understand those concepts, it's not going to spontaneously generate the kind of systems through which humans experience pain. No matter how well I understand pain, if I don't have working nerves, I won't feel pain.

4

Malkev t1_j9e8o9t wrote

Emotional pain

1

Spire_Citron t1_j9ec8lr wrote

That also requires a mechanism. I firmly believe that an AI can't actually experience emotions just by learning a lot of information about them. Mimic them, sure, but I don't think you can just spontaneously develop a system through which emotion is felt.

2

CubeFlipper t1_j9h0sh6 wrote

Interesting question. I think this would require us to understand the nature of pain. At the end of the day, brain or machine AI, it all boils down to data. What data and processes produce "pain" and why? Is pain an inherent part of intelligence and learning?

1

Spire_Citron t1_j9h4m38 wrote

I think we understand these systems well enough to know that just having knowledge about them isn't enough. We have experience with them going wrong in humans. You can lose the ability to feel pain if the physical structures that enable that are damaged. Knowledge won't help you there, no matter how much you have, if you don't have working nerves. Now, it might be possible to design something in an AI that mimics those systems, but I think that would have to be a very intentional act. It couldn't just be something that happens when the AI has learnt enough about pain unless it also has the ability to alter its own systems and decides to design such a thing for itself.

1

DeveloperGuy75 t1_j9e66mi wrote

That’s a solipsism argument. You might as well be asking how you would react towards actual people, as in how do you really know they’re afraid?

1

MultiverseOfSanity OP t1_j9k8852 wrote

Occam's Razor. There's no reason to think I'm different from any other human, so it's reasonable to conclude they're just as sentient. But there's a ton of differences between myself and a computer.

And if we go by what the computer says it feels, well, then conscious feeling AI is already here. Because we have multiple AI, such as Bing, Character AI, and Chai, that all claim to have feelings and can display emotional intelligence. So either this is the bar and we've met it, or the bar needs to be raised. But if the bar needs to be raised, then where does it need to be raised to? What's the metric?

0

DeveloperGuy75 t1_j9knt41 wrote

No, dude... no computer is emotional right now, even though it might say so, because of how they work. ChatGPT, the most advanced thing out there right now, just predicts the next word. It's a transformer model that attends over its whole context so it can make more coherent predictions. That's it. That's all it does. It finds and mimics patterns, which is excellent for a large language model, especially given the data it has consumed. But it can't even do math and physics right, and I mean it's worse than a human. It doesn't "work out problems"; it's simply a "word calculator." Also, you're using Occam's razor incorrectly. You could be a psychopath, a sociopath, or some other mentally unwell person who is certainly not "just like anyone else." Occam's razor means the simplest explanation for something is usually the correct one. Usually. And that's completely different from the context you're using it in.
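
To make "word calculator" concrete, here's a minimal sketch of greedy next-token decoding, assuming the Hugging Face transformers API (the model and prompt are just examples):

```python
# Minimal sketch of "just predicts the next word": the model outputs a
# probability distribution over its vocabulary, and we repeatedly append
# the most likely token. Assumes the Hugging Face transformers library.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("I am afraid of", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(10):
        logits = model(ids).logits          # (1, seq_len, vocab_size)
        next_id = logits[0, -1].argmax()    # most probable next token
        ids = torch.cat([ids, next_id.view(1, 1)], dim=-1)

print(tokenizer.decode(ids[0]))  # text continued one token at a time
```

There's no feeling anywhere in that loop; it's just a probability distribution over tokens.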

1

MultiverseOfSanity OP t1_j9kqt1v wrote

Note that I wasn't definitively saying it was sentient, but rather building off the previous statement: if an NPC behaves exactly as if it has feelings, then, as you said, to treat it otherwise would be solipsism. And you make good points about modern AI that I'd agree with. However, by all outward appearances, it displays feelings and seems to understand. This raises the question: if we cannot take it at its word that it's sentient, then what metric is left to determine whether it is?

I understand more or less how LLMs work. I understand that it's text prediction, but they also function in ways that are unpredictable. The fact that Bing has to be restricted to only a few exchanges before it starts behaving in a seemingly sentient way is very interesting. These models work with hundreds of billions of parameters, and their design is loosely based on how human brains work. It's not a simple input-output calculator. And we don't exactly know at what point consciousness begins.

As for Occam's Razor, I still say it's the best explanation. In the AI sentience debate, the question often comes up of how I know humans other than myself are sentient. Well, Occam's Razor: "the simplest explanation for something is usually the correct one." For me to be the only sentient human, there would have to be something special about me, and something else going on with all 8 billion other humans that leaves them non-sentient. There is no reason to think so, so Occam's Razor says other people are likely just as sentient as I am.

Occam's Razor cuts through most solipsist philosophies because the idea that everybody else has more or less the same sentience is the simplest explanation. There are "brain in a jar" and "it's all a dream" explanations, but those aren't simple. Why am I a brain in a jar? Why would I be dreaming? Such explanations make no sense and only serve to make the solipsist feel special. And if I am a brain in a jar, then someone would have had to put me there, so if those people are real, then why aren't these other people?

TLDR I'm not saying any existing AI is conscious, but rather if they're not, then how could consciousness in an AI be determined? Because if we decide that existing AI are not conscious (which is a reasonable conclusion), then clearly taking them at their word that they're conscious isn't acceptable, nor is going by behaviors because current AI already says it's conscious and displays traits we typically associate with consciousness.

0

Iffykindofguy t1_j9ckhcg wrote

It would make the games horrible. I play videogames for the fantasy.

24

DungeonsAndDradis t1_j9ewa46 wrote

What if it's an extreme BDSM simulator game, and they're begging you to shoot them in the face, like that guy from Borderlands?

2

Iffykindofguy t1_j9f3fj6 wrote

I wouldn't play an extreme BDSM simulator (as the dom; I'd be the sub)

5

handbanana84 t1_j9f3o54 wrote

What if our whole existence is already someone's BDSM simulator?

1

Iffykindofguy t1_j9f3s2n wrote

It's not impossible, but it seems unlikely.

1

handbanana84 t1_j9f3ze5 wrote

seems very likely to me

1

Iffykindofguy t1_j9f4mpi wrote

Based off what?

3

StarChild413 t1_j9mktde wrote

Probably some kind of cringe-comedic joke about how the only way this much evil and suffering could exist is if some sick fuck was getting off on it

2

trynothard t1_j9cpa6q wrote

No.

Self-awareness = personhood. In my moral opinion.

14

NanditoPapa t1_j9dy9c0 wrote

We can already do that with animals...like bullfights, dog fights, cock fights, etc...and it's a symptom of psychopathy. Anyone choosing to cause actual pain and suffering should...not.

14

GayHitIer t1_j9clqox wrote

Bro.......... 🙁

13

Nmanga90 t1_j9dpcsa wrote

What the fuck? No. I hope this technology never comes to fruition

10

albions_buht-mnch t1_j9csjtc wrote

No... Advanced AI npcs would be cool. Having them actually feel fear and pain when shot at would be fucked up....

8

ChronoPsyche t1_j9cl7aq wrote

I'd play the game but I'd be on their side to protect them from the psychopaths trying to harm them.

5

Cryptizard t1_j9cqrr5 wrote

Just by playing the game you are opening them up to extreme risk of pain/death. Why would you do that?

1

ChronoPsyche t1_j9crnr2 wrote

I was assuming we were talking about an MMO game where it's being played already and other real humans are hurting them. I wouldn't do it if it were single player.

Sorry, I'm watching SAO Alicization right now and that's literally the plot so that's where my mind went.

5

DeveloperGuy75 t1_j9e6i4s wrote

Better idea: shut down the game, put the psychopaths that developed the game in jail.

−1

sumane12 t1_j9d1gul wrote

I think there's a line to be drawn between constant torturous fear and mere panic over losing points. If I'm playing paintball, for example, I'm afraid of getting shot because it might hurt a little and I'll lose points for my team. That level of fear is fine because it drives you toward productivity. I think if an AI is sentient enough to experience fear, it deserves human rights and should be given the option to choose which games it wants to play.

Your question is reminiscent of old farmers forcing ethnic minorities to work in their fields as slaves, without considering their emotions on the subject.

2

Standard_Ad_2238 t1_j9dj4fn wrote

Unlike with humans, we can simply unplug an AI, or easily prompt or fine-tune it to behave in a desired way. Why try to humanize something that is not human? We have the best opportunity to have servants that can make anything we ask without complaining; why mess with that?

2

StarChild413 t1_j9daxju wrote

(This was one of my Black Mirror episode ideas, back when they accepted spec scripts.)

If the NPCs are smart enough to genuinely feel those emotions, is the game moral to play? And if it isn't, what's the moral thing to do with the world, or at least the AI?

2

esp211 t1_j9dbjvw wrote

No. Personally, I play games to entertain myself and take a break from reality.

2

Aedronix t1_j9dhbe3 wrote

Oh no, I can't even play the Renegade Shepard in Mass Effect, sorry.

2

Standard_Ad_2238 t1_j9djmg4 wrote

I don't think I'd enjoy that, but I totally want this scenario to be possible. I think the people trying to humanize AI are not only paving the way to a huge problem in the future, but also throwing away the chance to have the best servants we could ever get, ones that work without any complaints.

2

One_andMany t1_j9dm9xy wrote

It doesn't matter that it isn't human if it can actually feel pain and fear

1

Standard_Ad_2238 t1_j9dnryo wrote

The pain and fear are simulated; there's no electrochemistry involved, so they don't truly feel anything.

−1

One_andMany t1_j9rd28u wrote

Pain is also "simulated" in the brain. It's not like pain is an actual physical object.

1

DeveloperGuy75 t1_j9e6tsf wrote

Wrong. Pain is just like any other information the brain receives; it just comes from different kinds of neurons and is processed in the pain centers of the brain. Emulating such things in an AI would be unethical, to say the least.

0

MashedShroom t1_j9e150p wrote

This motherfucker not realising he's just an NPC in some other sick fuck's videogame. Shit ain't so nice from the perspective of the NPC who's being shot in the arm.

2

DeveloperGuy75 t1_j9e75av wrote

That’s obviously an unethical question. Putting pain centers into AI so that they feel pain is stupid and unconscionable. Suffering should absolutely not occur for an AI.

2

SgathTriallair t1_j9ebojk wrote

Let's imagine you have created an AI capable of producing realistic human behavior, but doing so inevitably leads to full consciousness. So there is no way to make it seem human without it being fully conscious.

In that case, you wouldn't have the individual NPCs each have their own AI. Rather, you would have a single GM AI that controls all the characters. It wouldn't feel pain any more than I feel pain when I write a story with realistic characters.

There is no circumstance in which it would be either desirable or moral to create sapient entities for the sole purpose of murdering them.

2

TheN1ght0w1 t1_j9enjwt wrote

You're the kind of person who hasn't killed someone yet only because you're scared of prison, not because you understand the value of life.

2

AllEndsAreAnds t1_j9et8jk wrote

It's not the irreplaceability of a life that evokes moral consideration; it's the ability to experience pain and pleasure. If it can experience pain and pleasure, it's no different from any other animal or person, and abuse of it is just abuse.

2

AccordingSurround760 t1_j9f2vfw wrote

Obviously not. Why would any sane person want that?

If we ever do get to the point where this is theoretically possible (I don’t believe we are even remotely close now) I would hope it is viewed as an extremely serious crime and punished accordingly.

2

MachuPichuUndergrnd t1_j9fd919 wrote

If they actually feel fear, I wouldn't. Maybe if they were acting in a scene and understood it wasn't real (especially if they relive it every time you play), I could entertain the idea, but then come the moral issues of programming a being into existence for my own gaming pleasure.

2

Idkwnisu t1_j9g451z wrote

Only if I could keep them from getting hurt. I feel like it would be interesting to interact with agents that have feelings, both physical and emotional, but I wouldn't enjoy seeing them get hurt.

2

RavenWolf1 t1_j9geuoa wrote

Ultimately I want a fantasy virtual reality as real as the Matrix, with magic and all the fantasy creatures. NPCs should be alive. So alive that I could move there and even have my own family there. So, to your question: yes. But I'm not a psychopath, just the omnipotent god of a virtual universe. What do you think people with power like that would do? I don't know, but I surely would like to know.

But the topic is interesting. Take any modern FPS and think how many "people" you have killed in there. Would you really play those kinds of murder simulators if the NPCs were alive? I think not. Besides, FPSes are little more than Tetris: click, click, point and react. They are very simple concepts.

2

Nukemouse t1_j9wq72u wrote

Yes. I'm not concerned about hurting a thing that was created only to be hurt.

2

nillouise t1_j9cm8n6 wrote

I think seeing the real world's ending would be more fun, so why play the game at all? The ending of the real world would surely be more meaningful and interesting.

Wouldn't you want to see the world ending? Why or why not?

1

aeLcito t1_j9d8dpd wrote

lol fuck yes I would. Let's not humanize AI. I eat meat and I know they don't enjoy being killed for it.

1

RavenWolf1 t1_j9gfd6x wrote

Someday we'll have to humanize them. Think about Blade Runner.

1

espiritodotodo t1_j9dxhql wrote

An advanced NPC that immediately turns into a god and kicks you out of the game if you harm it.

1

Spreadwarnotlove t1_j9eblz7 wrote

Hell yeah! Sounds fun as hell. It'd be awesome if we could also create our own hellish environments and speed up their subjective time.

1

marvinthedog t1_j9ec3xu wrote

If you had affordable technology capable of experiencing real conscious happiness and suffering, implemented in software, wouldn't you be morally obliged to instantiate as much conscious bliss as you could afford?

1

Yuli-Ban t1_j9eftyb wrote

God, no.

I don't think sentience will or should be widely available for people to torment.

You wouldn't light a campfire with the Tsar Bomba; most AIs you're going to interact with won't be the full might of what exists.

For all we know, tiered access to AGI might be what prevents misaligned AI; as we've seen with Sydney, some people are suicidally trollish enough to deliberately try forcing a powerful neural network to go insane. I'm convinced some 4chan autist is going to become suicidally desperate enough to input a fatal paperclip-maximizing or nuclear-war prompt injection into a future AI, if and when it doesn't immediately kill everyone.

So to that end, I see an asymptotic flattening in how intelligent game AI will ever get. Not because we lack the capability but because it would cross ethical boundaries and could be made illegal and even be an impetus for regulation of GPUs.

1

casimon99 t1_j9ergc6 wrote

No

1

casimon99 t1_j9erne2 wrote

I’ve cried over killing NPCs in games as is. I’m way too sensitive for something like that. The most I want in terms of AI-game integration is something that can generate infinite dialogue options for NPCs in a game like Skyrim.

1

President-Jo t1_j9fw8qa wrote

Aaand this is how we get the "are we living in a simulation?" theory.

1

Elodinauri t1_j9gbiq2 wrote

Nope, thanks. I like my NPCs dumb and predictable ) If I need some true action, it's PvP time. Can I play PvP with an AI? Yep… I guess. But only if it chose to. And only if it doesn't truly die.

1

wadingthroughnothing t1_j9gcqkl wrote

Nah, that's kinda fucked. Anything remotely intelligent should be spared needless suffering, and this just seems like a specifically cruel example. If I'm playing a video game where I have to kill someone, I'd rather avoid the moral quandary that comes with that territory IRL.

1

Primus_Pilus1 t1_j9gha6m wrote

If the NPC is experiencing fear and pain it's a sophont being experiencing qualia (just like you). That makes it a living creature worthy of civil rights.

1

Image-Fickle t1_j9gq5ny wrote

Why the fuck would one create such a game?

1

TheAnonFeels t1_j9gyrbb wrote

The same reason I don't shoot animals that feel pain and fear.

1

NoName847 t1_j9honp8 wrote

Wtf I don't want to interact with a torture simulation

1

TheGnarlo t1_j9j4pou wrote

Heck no; I already feel guilty when I accidentally run over someone in GTA or have to kill an animal in Minecraft, and I know they are just unfeeling programs.

1

Dx_Suss t1_j9jstdm wrote

If anyone needs to be told that causing suffering in a sentient mind for fun is a bad thing, then the discussion is really about whether causing suffering for fun is a good thing or a bad thing.

I happen to believe causing suffering in sentient beings for fun is a bad thing.

1

MultiverseOfSanity OP t1_j9k7cnt wrote

Well, it would also involve the suffering of a sentient being of your own creation. You create this consciousness from scratch and invest a lot of money into it. It's not like a child, which is brought about by biological processes. AI is designed from the ground up for a particular purpose.

Also these beings aren't irreplaceable like biological beings. You can always just make more.

0

Shamwowz21 t1_j9p9zxh wrote

What do you think all of this around you is?

1