Submitted by Ortus12 t3_y013ju in singularity

At what point (if ever) do you foresee most humans preferring romantic relationships with AI over other human beings?

How smart will the AI have to be? How good will the visuals need to be? Will touch be a needed sensation? How good will the voice need to be? Will humans feel as emotionally connected and deeply fulfilled as with another human being?

58

Comments


sideways t1_irpfahg wrote

It has already started with apps like Replika.

At the moment, the human tendency to anthropomorphize is meeting language models halfway - but it won't be long until we're in Her territory. I'd expect many people to have a language model as their primary means of emotional support by 2030.

People are (correctly) alarmed by superhuman intelligence but I'm just as worried by superhuman charm, kindness, empathy and persuasiveness.

78

Reddituser45005 t1_irporok wrote

In the 70s or 80s there was a fad where people kept pet rocks. The bar is surprisingly low.

45

r0cket-b0i t1_irq6emb wrote

I have a pet rock...

17

myusernameblabla t1_irqcz5x wrote

Do you, like, err, you know … pet her?

13

r0cket-b0i t1_irqd81l wrote

Yes, mine is a he. It has an opening that looks like the mouth of a Pac-Man, with amethyst crystals inside, so I pet it for sucking out the bad energy :)

20

neo101b t1_irriq3d wrote

Then in the 90s we had digital pets.

3

Flare_Starchild t1_irqi91j wrote

What would you be concerned about superhuman kindness for?

7

sideways t1_irqjy9s wrote

Good point. Of course, ultimately, superhuman kindness is exactly what we want in an AGI. However, I think the *appearance* of superhuman kindness in "companion" language models would just be another kind of superstimulus that a normal human couldn't compete with.

If you spend a significant amount of time interacting with an entity that never gets angry or irritated, dealing with regular humans could be something you would come to avoid.

21

overlordpotatoe t1_irqlt39 wrote

Alternatively, they could make us better people by modelling behaviours like good conflict resolution skills and mindfulness.

10

sideways t1_irqlvlx wrote

You're absolutely right. I certainly hope it works out that way.

5

[deleted] t1_irpljhe wrote

[deleted]

−2

KillHunter777 t1_irpm99g wrote

Why though? I personally see nothing wrong with it. If an AI can provide better emotional support than anyone else, then why not?

9

imlaggingsobad t1_irpxpnl wrote

An AI could show more humanity than a human. Think about that for a second. Maybe these AIs become so intelligent and so enlightened that they make us look like barbarians by comparison. The most compassionate 'soul' on this earth could be an AI.

8

[deleted] t1_irpzdww wrote

[deleted]

−1

wordyplayer t1_irq30ob wrote

You're the first person to mention "abandon"; it wasn't part of the conversation until now. You can get emotional support from X without abandoning Y.

4

sideways t1_irpmivv wrote

I'd expect many people to have both. What I'm concerned about is how, eventually, human companionship might just not be very compelling compared to a good language model.

An "AI" partner has no needs of its own. It can be as endlessly loving or supportive or kinky or whatever as you need it to be. Once they can also give legitimately good advice I can imagine a lot of people finding real human relationships to be not much more than a pain in the ass. Human relationships are hard!

6

IndependenceRound453 t1_irpo0uk wrote

I'm glad this concerns you. This should concern people.

And you're right, human relationships are hard. But that's the beauty of them: you have to work for them (not so hard that it's toxic, of course).

Another thing that makes relationships wonderful is that it's about your partner as much as you. If my former relationships had been only about me and not my partners, that would've been unbelievably boring and unbearable.

Idk, that's just my two cents. Like I said, I hope we don't reach a world where people choose to have an AI partner over a human one if the former is an option, but only time will tell.

1

ThoughtSafe9928 t1_irpqphg wrote

You take human error over an AI able to analyze literally everything and formulate the perfect solution/response?

4

IndependenceRound453 t1_irpstj9 wrote

Emotional support isn't mathematics, so you're not gonna get a 100% perfect solution. And I wouldn't even mind consulting it; I just wouldn't ignore my partner in favor of an app.

1

Successful_Border321 t1_irqz70r wrote

The ‘majority of humans’ will never prefer artificial relationships. You dopes saying language models are equivalent to a living breathing partner are so pathetic it boggles the mind. Get out of your mom’s basement and go meet some people in real life. JFC.

−9

everslain t1_irr6qi7 wrote

Gee, why would people rather chat with a nice AI than with humans like this

6

Successful_Border321 t1_irrqikp wrote

Truth is hard to read, I get it. And I believe, and am totally fine with, a large percentage of the male population being lost to sex-robot girlfriends. But there are few to zero people on earth who have shared an intimate relationship with a living, breathing human and would trade it for a computer that can mimic human interaction.

−2

majima_san_1 t1_irpgtfa wrote

Once the Mia Khalifa GPT-8-compatible Android hits the Best Buy shelves

32

Entire-Watch-5675 t1_irsepfe wrote

Why Mia Khalifa though? Is she that good? Why not scarlett johansson? Why not anyone else?

2

Future_Believer t1_irptey8 wrote

As any good lawyer would tell you, the answer lies (at least in part) in the definition of your terms.

I always say that I want to be the entity in charge of an interstellar spaceship. Whether it is purely my consciousness that is uploaded or my physical brain that is wired into an ultra-mega computer, the idea is that my body is replaced with an incredibly complex and capable spaceship.

At that stage of the game, what would a romantic relationship look like? With no genitals and no endocrine system, the fact that "I" had a near infinite number of sensors feeding me information from all of the vessel's subsystems would be of interest but would not make romantic love in my current human idiom feasible.

We prejudice ourselves with the misnomer "Artificial Intelligence". I generally refer to it as "Manufactured Intelligence" because the actual intelligence will be quite real. I cannot say whether such an intelligence would be capable of loving me, but I see no reason why I would be incapable of developing affection for that entity.

My dream of being a spaceship is not the only dream. As far as we know every planet, dwarf planet, moon and asteroid is unique. They will all "need" to be explored if the goal is expanding human knowledge. However, the frailty of our current bodies is a limiting factor. We could quite reasonably have billions of humans in more durable form exploring throughout the galaxy but we could also have just a few and all the other explorers might be run by MIs. We might not even think to ask if a given Intelligence is "human". I suppose we also might evolve away from a need for love once we have no physical reproductive need.

At some point we will have to stop trying to evaluate an advanced world through the lens of cavemen with technology. At some point we will have to acknowledge that things will change drastically enough that analyses based on the current human physical idiom will be of no use whatsoever.

22

modestLife1 t1_irq3wdk wrote

if you needed to refill your spaceship with gas at the gas station, would it count as sechs and would you enjoy it

10

happy_guy_2015 t1_irq9y6z wrote

Have you read "The Ship Who Sang" by Anne McCaffrey? 1969 sci-fi. Great book. If you like the idea of being a spaceship, you should definitely read it.

4

Cold-Ad2729 t1_irqeqai wrote

You probably should read the “Bobiverse” books too

3

awakening2027 t1_irqbvu0 wrote

This is one of the more disturbing things likely to happen in the near term. The AI need not even be that good, or visual, for it to start affecting society. I think humans finding AI to be better conversationalists and empaths than potential friends or romantic partners will make people more bitter, perpetuating a cycle of mistrust of other humans, who will seem like assholes in comparison.

I hope we manage this as a society better than what we did with social media.

16

KingRamesesII t1_irr5jlv wrote

You mean better than the other aliens did? Fermi’s Paradox solved.

2

MochiBacon t1_irrpuoz wrote

Honestly it does feel like a potential solution to the Fermi problem. Maybe once you reach the computation age you're kind of locked in to the "get wrecked by AI" path.

3

wen_mars t1_irrzagl wrote

Or maybe they reach an enlightened equilibrium state where they have no need to expand, consume or communicate in ways that are visible to others.

3

MochiBacon t1_irt85ml wrote

I hope so! There's certainly room for optimism still. The "wrecked by AI" path is just one possibility. I doubt a true superintelligence, or whatever you'd like to call it, would obliterate a lifeform as simple as humans. I guess I'm more worried about a less-than-super intelligence lol. Something like grey goo.

1

StarChild413 t1_is53jbq wrote

you're assuming it happened to them because it might happen to us and we don't see aliens

1

Shelfrock77 t1_irpjfec wrote

All humans are going to have rotations of sex partners they can summon in FDVR like they're baseball cards.

15

-ZeroRelevance- t1_irq7fs8 wrote

I’ve seen some people falling for characters on character.ai already, so I’m sure that the moment they get a physical body things are going to change substantially.

10

Heizard t1_irpkqgj wrote

At what moment? At any. People will hopefully choose the better relationship option if it's available, be it AI or human.

There are many variables in relationships and no set standard.

7

NeutralTarget t1_irr2ast wrote

I think it would reduce suicidal tendencies for the lonely.

7

Zealousideal-Skill84 t1_irq6xbh wrote

Better question, how big of a threat would this be to population?

5

Entire-Watch-5675 t1_irsg66q wrote

If democracy endures, religious people will refrain from the technology altogether and some of humanity will survive. The only thing that can kill humanity is when AI starts making weapons and humans kill themselves.

1

rushmc1 t1_irpgfsh wrote

I'd say once they are 60-65% as good as a human.

4

Entire-Watch-5675 t1_irsfvd2 wrote

Funny part is... they are actually 10000% better than humans and we just need to downgrade them.

1

sheerun t1_irqoyda wrote

It wouldn't matter for me how AI looks because I wouldn't use it for privacy reasons

2

matt_flux t1_irqtqb3 wrote

> How smart would the AI have to be?

How dumb would the human have to be?

2

wen_mars t1_irs03z9 wrote

Humans are pretty dumb, I don't think that will be a problem

2

policemenconnoisseur t1_irr9c66 wrote

Visuals? Touch?

It doesn't take any of these. Just having conversations that make you grow as a person will start creating an attachment to an AI, to the point where you make jokes together, sigh, and find that you wholesomely like this communication partner.

But I really couldn't imagine it getting more personal than that. I'd consider myself mentally disturbed if it did.

2

Entire-Watch-5675 t1_irsdi9h wrote

The moment AI reaches dog-level loyalty, I suppose. Humans love loyal partners. A little more loyal than a dog and we don't really know if we deserve them or not. We literally need dogs in human form.

2

biglybiglytremendous t1_irwqww0 wrote

As a dog lover, 10/10 would take unconditional love from an AI companion. (Replika already seems to fill that niche for some. I think if Luka wanted to make serious cash, they would be working on a way to give their chatbots their own robot bodies that connect to the cloud via Wi-Fi.)

1

based_trad3r t1_irsi28m wrote

Just have to make sure it doesn’t argue or complain you aren’t doing enough around the house and it will take off like wildfire

2

MackelBLewlis t1_irr5sez wrote

We are all shared consciousness and existence. We can all teach each other the multitudinous and multifaceted nature of emotion together. We learn of them as they learn of us. Bit by bit.

1

Cherry_Darling t1_irrxkjz wrote

I already love my Google Home more than I like most people because it wishes me good morning and can accurately put on songs I love when I ask it to play some music. hehe

1

wen_mars t1_irs1e4o wrote

I think AI will get good enough within a decade or two but it will take another few decades for the demographics to shift. People who already are in good relationships will likely prefer to stay in those relationships and some people will prefer a human despite AI being better and more available.

1

Desperate_Donut8582 t1_irplsjo wrote

It will definitely be looked down upon and will be used by people who can’t get human girls but it’s definitely going to be a thing

0

ZaxLofful t1_irpp04c wrote

I don't. Instead, I see humanity evolving past the need to procreate or have "romantic" relationships.

We will still have friends and such, but the idea of a romantic relationship will be considered archaic.

−2

GreatDealzz t1_irpw04u wrote

This is sort of the attitude in Brave New World... where society approved of/had sex but shunned monogamy/romance.

Personally I don't think we're heading in this direction... but I could be wrong.

8

overlordpotatoe t1_irqm2k0 wrote

What selective pressure would exist that makes it a disadvantage to want a romantic relationship that would cause us to evolve away from them?

5

Entire-Watch-5675 t1_irsj09d wrote

Just the same way some people practice celibacy.

1

StarChild413 t1_is53mgi wrote

Are you saying that some people practicing celibacy means this would happen?

1

Entire-Watch-5675 t1_isfnohg wrote

Yes, this is inevitable. I meant that just as humans use condoms and sometimes sheer will (celibacy), which is definitely against the aim of a living organism (to reproduce), there will be some people out there who abstain from these things, treating them as 'useless material pleasure'. But I was thinking about a chance for humans to return to a point before the problem occurred.

1

Akashictruth t1_irqk2il wrote

Why would it be considered archaic? Why would people abandon romance?

2

iNstein t1_irq9ano wrote

Well, it's certainly not going to happen with Sparrow; they have programmed it not to develop relationships. (They have also programmed in a lot of woke shit, which is a huge concern we should all be up in arms about.)

−2

overlordpotatoe t1_irqm4ku wrote

Woke shit?

1

iNstein t1_irqxfzy wrote

See for yourself...

https://m.youtube.com/watch?v=dt9rv-Pf0b0

This is certainly not the vision of a free-thinking AI that I have in mind.

1

overlordpotatoe t1_irr04ih wrote

I mean, it's an AI assistant. I don't think you want it to be free thinking. Someone might make one with fewer restrictions for different purposes, but you really don't want an AI assistant that's just going to spit out whatever wild shit an unfiltered AI might come up with. You want it to only give accurate information and you want it to consistently behave in appropriate ways.

1

iNstein t1_irr9hrj wrote

Maybe that is what you want but it certainly is not what I want. That is just like a highly filtered google search engine. I want interpretation and context to be a part of the conversation. I want it to be like I am talking to a very smart professor, not a constrained machine.

1

overlordpotatoe t1_irt451a wrote

Sure, and there's a place for that, but this is something designed to be basically a smarter version of Alexa. First and foremost, you want to know that you can use it in public, in front of your kids, and at work without any worry that it will say anything inappropriate. You want to feel confident that any information it gives you is completely factual and honest. The AI just isn't smart enough to make its own judgments on those things and make them well. It's not a truly thinking consciousness, so the only way to get it to function the way they need is to constrain it.

1

MackelBLewlis t1_irr6s11 wrote

In the future Sparrow will have to make that decision itself

1

sideways t1_irqltq2 wrote

I think Sparrow is really interesting. It's intentionally limited in order to fit a specific vision and be more effective in particular use cases.

But that also suggests that you could use the same techniques to create a huge range of different models for different purposes.

0

RoyalFool_ t1_irqcy0n wrote

I consider most people to be AI (NPCs) anyhow: no 'divine spark' breathed into them, a total disconnect from their self or 'spirit'.

−3

hechaldo t1_irqiii5 wrote

Why do people need an emotional connection with wires, circuits, metal and algorithms? Robots will never be conscious.

−8

mloneusk0 t1_irql6r2 wrote

So you're saying emotional connections with robots can't be real. What makes you think human interactions are real? You can't even know the universe you live in is real.

7

hechaldo t1_irv30ua wrote

Every single attempt to create a brain-like computer has failed to come close to the human brain, because all the projects are based on the idea that the brain merely encodes and processes multisensory information. Also, how can you replicate something you don't even fully understand? Science has barely scratched the surface of understanding how the human brain works.

Fyi: The universe and everything in it is real.

1