Comments

Anastariana t1_isv6xt6 wrote

We aren't even ready for tech that lets us speak to each other. The internet has demonstrated this in spades.

90

Kabbz t1_isv7vsm wrote

I saw this Black Mirror episode and I heartily reject this whole premise.

41

PeteTownsendPT t1_isv7bnu wrote

If I could, I'd sit down with my maternal grandparents… but only one last time. As an adult, to seek guidance, get closure on some sensitive family topics, but also to be able to say a proper, conscious, and consensual goodbye.

I'd love for them to live forever, of course, but losing a loved one is also a moment of self-reflection on where we are and where we want to go, in every dimension of our lives. We need that to take leaps forward; it's part of the human experience. Bringing loved ones back in an artificial form could hold people back from that much-needed growth and cause more harm than good.

27

dorthyinwonder t1_isvcuut wrote

Oof. This. I lost all my grandparents in my mid-twenties. Most of those deaths were expected and came as a relief after their suffering. My maternal grandmother's was sudden and raw, because she fell during the night and passed away with my parents and me right across the road and my sister sleeping next door. That's the death I still haven't gotten over, despite the nearly 15 years that have passed. I wish I could talk to her again, ask her advice (even if I don't like it), and just be around her. I wish she could see where I'm at now, because I was a bit late to grow up and adult. I can see benefits of this technology, but I can also see how it could make it more difficult for people to process the loss.

2

Straxicus2 t1_isvf2wc wrote

I lost my mom last year and I agree 100%. If I had the option, I would never be without her.

2

bishopbackstab t1_isv8uu5 wrote

I'm a senior AI designer, and the tech just isn't there, even before you consider the training data that would be required to make the bot contextual. Unless you have thousands of pages of journal entries from the deceased to train the bot on, it would be no better than a GPT-3 bot or a basic chatbot leveraging conversational patterns, which will still result in a poor simulation.
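To make the data problem concrete, here is a minimal sketch of what "training the bot on the deceased's journal entries" would actually involve, assuming a Hugging Face fine-tuning setup; the journals.txt file and the GPT-2 base model are stand-ins for illustration, not anything a real product has disclosed.

```python
# Minimal sketch: fine-tune a small causal language model on a plain-text
# dump of journal entries. "journals.txt" is a hypothetical file with one
# entry per line; "gpt2" is a stand-in base model.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token        # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Load and tokenize the journal corpus.
dataset = load_dataset("text", data_files={"train": "journals.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="persona-lm",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Even with this pipeline in place, a few dozen pages of text yields a rough stylistic imitation at best, which is the point: without a very large personal corpus, the result is still a generic chatbot.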

19

dragonowl1990 t1_isva5l3 wrote

lol yeah, basically always one wrong step away from a very uncomfortable conversation with grandma.

4

Pokethebeard t1_isvgepd wrote

I don't need my deceased grandmother to nag me about not having kids. Thank you very much.

3

Earthling7228320321 t1_it04rno wrote

That's the beauty of advertising. It doesn't have to be good to sell to a bunch of desperate, mourning people. Just set a few generic demographics and let a chatbot do a cringeworthy job of wearing a dead person's face.

The important thing is that shareholders will make lots of money, and that's what capitalism is for.

On a side note, I'm worried that we are being far too cautious and stingy with AI research. Thankfully, China and the USA are dumping tons into rivaling each other's AI development, so at least there's that.

2

i_only_eat_nachos t1_isv9yci wrote

I loved my mom, but I'd never want to hear her speak to me ever again, regardless of whether it's really her or some kind of AI. That would open up a wound that I have happily accepted as a scar for some ten years now.

7

Astranoth t1_isv6vkk wrote

This is not only weird but will most likely mess with our heads.

We have barely scratched the surface of psychology, so I would strongly recommend not fucking with these things.

6

AdaptiveCenterpiece t1_isv9jod wrote

I wonder if I could record my wife and me every 5-10 years so my infant daughter could talk to different ages of us. That'd be interesting.

6

Sivalleydan2 t1_isvjowf wrote

Check out the film "My Life" with Michael Keaton. Good stuff.

2

NedRyerson_Insurance t1_isvcy6q wrote

"Dozens of dead sexy milfs in your area are waiting to talk to you."

Wait, 'dead-sexy' or 'dead, sexy'?

6

huhIguess t1_iswty5z wrote

Get a discount by packaging it with digitized-pornstars who died 20 years ago.

2

Shot-Job-8841 t1_isv44jb wrote

This was already posted 12 hours ago; please don't repost on the same day.

5

kiwilapple t1_isv82u8 wrote

Something something, the Domhnall Gleeson Black Mirror episode.

3

fafefifof t1_isvec40 wrote

The void they have left and how much I had to learn to find out who I am without them are a large part of the newfound love I have for life. No thank you.

2

SpoozeysmOkes t1_isv9qd2 wrote

I don't think this is entirely smart. I can see so many ways this could drive us insane.

1

mauhumor t1_isva2mi wrote

It's an amazing technology that really works. It's still under development, though: you can talk, but you can't listen back just yet.

1

Static077 t1_isvfxf4 wrote

It's just a chatbot that gets an info dump, and we see what it spits out. It's not an amazing technology; it's a technology that exists, and they're trying to exploit people's grief with it. The more I think about it, the more it's actually fucking gross.

2

mauhumor t1_isy4436 wrote

Agreed, I was being sarcastic, although using a bad analogy ("a machine to talk with the dead, but you can't hear the answer just yet"). As you said, it's just people exploiting grief.

1

suck_my_waluweenie t1_isvavom wrote

I live life by a general rule that if it was on an episode of Black Mirror, it will probably not be great.

On a somewhat related sidebar, I hope everyone is ready for Boston Dynamics robot dogs to destroy humanity.

1

onespicyorange t1_isvcu4j wrote

I am just so so tired of the tech industry. So tired

1

RKELEC t1_isvd2zu wrote

I would sit down with my parents and thank them for everything. Having 2 kids of my own has really made me appreciate the things they sacrificed for me and made me feel guilty about some of the stuff I put them through.

1

gazooontite t1_isvf7zk wrote

Well, the title is a lie. It’s just smoke and mirrors at the end of the day.

1

12kdaysinthefire t1_isvf9f3 wrote

No thank you, this seems like it could easily go down the wrong road for a lot of people.

1

echtesteirerin t1_isvfddd wrote

This is the dumbest shit I've read in a hot minute.

1

HistoryAndScience t1_isvfjy7 wrote

It's not something that should exist. We need closure and to move on, plus these aren't real. Allowing us to relive things via an algorithm, in my opinion, prevents us from reaching emotional closure.

1

Jazzlike_Crew_3956 t1_isvg8cv wrote

Nope. I think death is death and we shouldn't force people to exist after they are gone without their consent.

1

Prestigious-Emu7325 t1_isvhx7v wrote

I can see both sides. I had a very loving but rocky relationship with my mother, who died 8 months ago today. For the last 4 years of her life, I tried to make amends to her for my role in our trials, but that endeavor was greatly hindered by the dementia that was methodically stripping her of all her executive function. We were able to spend a lot of great moments together during that time, but I wish I could believe in my heart that she truly understood my many apologies. These are regrets I'll have for the rest of my life. So I really do wonder if a technology like this would lend me the ability to shed that albatross.

1

SheltemDragon t1_isvnn9p wrote

No.

I mean, I would give almost anything to have my brother alive again by my side with the weight of the demons that drove him into the depths of alcohol addiction excised from his soul.

But to speak with a simulacrum? Or even the common "one more day" request. For the love of god, no. Losing them once almost broke me, and it's hard enough when their ghost shows up in my dreams or my memories when it's quiet at night. But to have to lose them *again* or become emotionally attached to a simulacrum under someone else's control? I'm not sure I could make it through it.

And that last point, I think, is the hardest and most dangerous. It is *not* the person, no matter how good the simulation is.

1

Fruitcrackers99 t1_isx8n5c wrote

Imagine having parents that you got along with well enough to want to talk to them after they’ve passed…

1

Apprehensive_Way870 t1_isxv6c6 wrote

Not for me. Letting go has been the hardest part of accepting the loss of my mother in 2018, just because it was so sudden, and I feel like such technology would unnecessarily prolong the grieving process or, in extreme cases, prevent it from playing out at all. Her health had been declining for a while, and you always think you have more time until that time runs out. I still deal with a lot of regret for not calling her more often because talking to her was so depressing and I have my own mental health problems, but speaking with a virtual replica of her would do nothing for me. It wouldn't be her, so what's the point? She's gone, and the only closure I have is knowing that she isn't living in misery anymore. Sometimes that's the best you can get.

1

Molnan t1_isyec7j wrote

The problem I see with this is that you are using a powerful, sophisticated technology to trick your brain into believing a lie, the lie that you are talking to your dead loved ones rather than to a bunch of chatbots made to superficially sound a bit like them based on a few questions, recordings and the like. This can only end in two ways: either you permanently erase the distinction between truth and lies and lose your mind, or you come back to reality with fresh grief and disappointment after each conversation.

Some point out that people often cling to old pictures, letters, personal belongings and video recordings, and these chatbots can be seen as an extension of that practice. Well, for starters, that should be done in moderation too, but more to the point, those things don't mess with your mind nearly as much because they don't try to simulate a fresh interaction.

1

Beneficial-Noise5524 t1_iszkzc5 wrote

In my opinion this is possible; if I'm not mistaken, this type of technology will be established by the year 2030.

1

Vakawada t1_isv7puj wrote

If it gives people comfort, why not? Better than religion.

0

forestwolf42 t1_isvb1qx wrote

People could also use this to keep their religious leaders around forever.

3

hotpietptwp t1_isvegdp wrote

I wonder if somebody has already snagged the WhatWouldTrumpSay domain.

2

ChickenTeriyakiBoy1 OP t1_isv2u9j wrote

>From what I could glean over a dozen conversations with my virtually deceased parents, this really will make it easier to keep close the people we love. It’s not hard to see the appeal. People might turn to digital replicas for comfort, or to mark special milestones like anniversaries.
At the same time, the technology and the world it’s enabling are, unsurprisingly, imperfect, and the ethics of creating a virtual version of someone are complex, especially if that person hasn’t been able to provide consent.

−4

ThirtyMileSniper t1_isv4j5n wrote

It's a bit sad. From what I read earlier, it's built from a series of answers given by the user about the deceased. It's extrapolating from the memories given to it, but the answers it gives outside those inputs are just fabrications. I really think that this could be a massively unhealthy thing for a great many people. Part of grieving, and of life, is letting people go.
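A minimal sketch of that pattern, for the sake of argument: the "memories" below are hypothetical answers a relative might supply, and GPT-2 stands in for whatever model the product actually uses. Everything outside those few facts is free extrapolation.

```python
# Minimal sketch: seed a generic language model with a handful of
# user-supplied facts and let it improvise the rest. The memories dict
# and the prompt format are illustrative assumptions.
from transformers import pipeline

memories = {
    "name": "Grandma Rose",
    "hometown": "a farm outside Lincoln",
    "favorite saying": "waste not, want not",
}

persona = " ".join(f"Her {key} was {value}." for key, value in memories.items())
prompt = (f"{persona}\n"
          f"Grandchild: Do you remember my wedding?\n"
          f"{memories['name']}:")

generator = pipeline("text-generation", model="gpt2")
output = generator(prompt, max_new_tokens=40, do_sample=True)[0]["generated_text"]
print(output[len(prompt):])   # whatever comes back was never in the inputs
```

Nothing in the inputs covers the wedding, so the reply is a fabrication dressed up in a familiar voice.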

11

WhatLikeAPuma751 t1_isv6tyf wrote

This is my struggle as well. Death is part of the cycle, and the reason why life is so precious and beautiful.

3

bishopbackstab t1_isv92qd wrote

That's the real problem here: it's using very loose data points to construct a bot simulating a person you probably knew closely.

3