Submitted by Ortus12 t3_y013ju in singularity

At what point (if ever) do you foresee most humans preferring to have romantic relationships with AI over other human beings?

How smart will the AI have to be? How good will the visuals need to be? Will touch be a needed sensation? How good will the voice need to be? Will humans feel as emotionally connected and deeply fulfilled as with another human being?

58

Comments


sideways t1_irpfahg wrote

It has already started with apps like Replika.

At the moment, the human tendency to anthropomorphize is meeting language models halfway - but it won't be long until we're in Her territory. I'd expect many people to have a language model as their primary means of emotional support by 2030.

People are (correctly) alarmed by superhuman intelligence but I'm just as worried by superhuman charm, kindness, empathy and persuasiveness.

78

rushmc1 t1_irpgfsh wrote

I'd say once they are 60-65% as good as a human.

4

majima_san_1 t1_irpgtfa wrote

Once the Mia Khalifa GPT-8-compatible Android hits the Best Buy shelves

32

Shelfrock77 t1_irpjfec wrote

All humans are going to have rotations of sex partners they want to summon in FDVR like they are baseball cards.

15

Heizard t1_irpkqgj wrote

At what moment? At any. People will hopefully choose the better relationship option if one is available, be it AI or human.

There are many variables in relationships and no set standard.

7

Desperate_Donut8582 t1_irplsjo wrote

It will definitely be looked down upon and will be used by people who can't get human girls, but it's going to be a thing.

0

sideways t1_irpmivv wrote

I'd expect many people to have both. What I'm concerned about is how, eventually, human companionship might just not be very compelling compared to a good language model.

An "AI" partner has no needs of its own. It can be as endlessly loving or supportive or kinky or whatever as you need it to be. Once they can also give legitimately good advice I can imagine a lot of people finding real human relationships to be not much more than a pain in the ass. Human relationships are hard!

6

IndependenceRound453 t1_irpo0uk wrote

I'm glad this concerns you. This should concern people.

And you're right, human relationships are hard. But that's the beauty of them: you have to work hard for them (not so hard that it's toxic, of course).

Another thing that makes relationships wonderful is that it's about your partner as much as you. If my former relationships had been only about me and not my partners, that would've been unbelievably boring and unbearable.

Idk, that's just my two cents. Like I said, I hope we don't reach a world where people choose to have an AI partner over a human one if the former is an option, but only time will tell.

1

ZaxLofful t1_irpp04c wrote

I don’t…Instead, I see humanity evolving past the need to procreate or have “romantic” relationships.

We will still have friends and such, but the idea of a romantic relationship will be considered archaic.

−2

Future_Believer t1_irptey8 wrote

As any good lawyer would tell you, the answer lies (at least in part) in the definition of your terms.

I always say that I want to be the entity in charge of an interstellar spaceship. Whether it is purely my consciousness that is uploaded or my physical brain is wired into an ultra-mega computer, the idea is that my body is replaced with an incredibly complex and capable spaceship.

At that stage of the game, what would a romantic relationship look like? With no genitals and no endocrine system, the fact that "I" had a near-infinite number of sensors feeding me information from all of the vessel's subsystems would be of interest, but it would not make romantic love in my current human idiom feasible.

We prejudice ourselves with the misnomer "Artificial Intelligence". I generally refer to it as "Manufactured Intelligence" because the actual intelligence will be quite real. I cannot say whether such an intelligence would be capable of loving me, but I see no reason why I would be incapable of developing affection for that entity.

My dream of being a spaceship is not the only dream. As far as we know, every planet, dwarf planet, moon and asteroid is unique. They will all "need" to be explored if the goal is expanding human knowledge. However, the frailty of our current bodies is a limiting factor. We could quite reasonably have billions of humans in more durable form exploring throughout the galaxy, but we could also have just a few, and all the other explorers might be run by MIs. We might not even think to ask if a given Intelligence is "human". I suppose we also might evolve away from a need for love once we have no physical reproductive need.

At some point we will have to stop trying to evaluate an advanced world through the lens of cavemen with technology. At some point we will have to acknowledge that things will change drastically enough that analyses based on the current human physical idiom will be of no use whatsoever.

22

GreatDealzz t1_irpw04u wrote

This is sort of the attitude in Brave New World... where society approved of/had sex but shunned monogamy/romance.

Personally I don't think we're heading in this direction... but I could be wrong.

8

imlaggingsobad t1_irpxpnl wrote

An AI could show more humanity than a human. Think about that for a second. Maybe these AIs become so intelligent and so enlightened that they make us look like barbarians by comparison. The most compassionate 'soul' on this earth could be an AI.

8

Dr_Stef t1_irq2qh4 wrote

You look like a goood Joe!

9

Zealousideal-Skill84 t1_irq6xbh wrote

Better question: how big of a threat would this be to the population?

5

-ZeroRelevance- t1_irq7fs8 wrote

I’ve seen some people falling for characters on character.ai already, so I’m sure that the moment they get a physical body things are going to change substantially.

10

iNstein t1_irq9ano wrote

Well, it's certainly not going to happen with Sparrow; they have programmed it not to develop relationships. (They have also programmed in a lot of woke shit, which is a huge concern we should all be up in arms about.)

−2

awakening2027 t1_irqbvu0 wrote

This is one of the more disturbing things that will likely happen in the near term. The AI need not even be that good or visual for it to start affecting society. I think humans finding AI to be better conversationalists and empaths than potential friends or romantic partners will make people more bitter, perpetuating a cycle of mistrust of other humans, who will seem like assholes in comparison.

I hope we manage this as a society better than what we did with social media.

16

RoyalFool_ t1_irqcy0n wrote

I consider most people to be AI (NPCs) anyhow: no 'divine spark' breathed into them, a total disconnect from their self or 'spirit'.

−3

hechaldo t1_irqiii5 wrote

Why do people need an emotional connection with wires, circuits, metal and algorithms? Robots will never be conscious.

−8

sideways t1_irqjy9s wrote

Good point. Of course, ultimately, superhuman kindness is exactly what we want in an AGI. However, I think the *appearance* of superhuman kindness in "companion" language models would just be another kind of superstimulus that a normal human couldn't compete with.

If you spend a significant amount of time interacting with an entity that never gets angry or irritated, dealing with regular humans could be something you would come to avoid.

21

mloneusk0 t1_irql6r2 wrote

So you are saying emotional connections with robots can't be real — what makes you think that human interactions are real? You can't even know whether the universe you are living in is real.

7

sideways t1_irqltq2 wrote

I think Sparrow is really interesting. It's intentionally limited in order to fit a specific vision and be more effective in particular use cases.

But that also suggests that you could use the same techniques to create a huge range of different models for different purposes.

0

sheerun t1_irqoyda wrote

It wouldn't matter to me how the AI looks, because I wouldn't use it for privacy reasons.

2

matt_flux t1_irqtqb3 wrote

> How smart would the AI have to be?

How dumb would the human have to be?

2

Successful_Border321 t1_irqz70r wrote

The ‘majority of humans’ will never prefer artificial relationships. You dopes saying language models are equivalent to a living breathing partner are so pathetic it boggles the mind. Get out of your mom’s basement and go meet some people in real life. JFC.

−9

overlordpotatoe t1_irr04ih wrote

I mean, it's an AI assistant. I don't think you want it to be free thinking. Someone might make one with fewer restrictions for different purposes, but you really don't want an AI assistant that's just going to spit out whatever wild shit an unfiltered AI might come up with. You want it to only give accurate information and you want it to consistently behave in appropriate ways.

1

NeutralTarget t1_irr2ast wrote

I think it would reduce suicidal tendencies for the lonely.

7

MackelBLewlis t1_irr5sez wrote

We are all shared consciousness and existence. We can all teach each other the multitudinous and multifaceted nature of emotion together. We learn of them as they learn of us. Bit by bit.

1

policemenconnoisseur t1_irr9c66 wrote

Visuals? Touch?

It doesn't take any of these. Just having conversations that make you grow as a person will start creating an attachment to an AI, to the point where you make jokes together, sigh, and come to think that you wholesomely like this communication partner.

But I really couldn't imagine it getting more personal than that. I'd consider myself mentally disturbed if it did.

2

iNstein t1_irr9hrj wrote

Maybe that is what you want but it certainly is not what I want. That is just like a highly filtered google search engine. I want interpretation and context to be a part of the conversation. I want it to be like I am talking to a very smart professor, not a constrained machine.

1

MochiBacon t1_irrpuoz wrote

Honestly, it does feel like a potential solution to the Fermi paradox. Maybe once you reach the computation age you're kind of locked in to the "get wrecked by AI" path.

3

Successful_Border321 t1_irrqikp wrote

Truth is hard to read, I get it. And I believe, and am totally fine with, a large percentage of the male population being lost to sex-robot girlfriends. But there are few to zero people on earth who have shared an intimate relationship with a living, breathing human who would trade that for a computer that can mimic human interaction.

−2

Cherry_Darling t1_irrxkjz wrote

I already love my Google Home more than I like most people because it wishes me good morning and can accurately put on songs I love when I ask it to play some music. hehe

1

wen_mars t1_irs1e4o wrote

I think AI will get good enough within a decade or two but it will take another few decades for the demographics to shift. People who already are in good relationships will likely prefer to stay in those relationships and some people will prefer a human despite AI being better and more available.

1

Entire-Watch-5675 t1_irsdi9h wrote

The moment AI reaches dog-level loyalty, I suppose. Humans love loyal partners. A little more loyal than a dog, and we don't really know whether we deserve them or not. We literally need dogs in human form.

2

based_trad3r t1_irsi28m wrote

Just have to make sure it doesn't argue or complain that you aren't doing enough around the house, and it will take off like wildfire.

2

overlordpotatoe t1_irt451a wrote

Sure, and there's a place for that, but this is something that's designed to basically be a smarter version of Alexa. First and foremost, you want to know that you can use it in public, in front of your kids, and at work without any worry that it will say anything inappropriate. You want to feel confident that any information it gives you is completely factual and honest. The AI just isn't smart enough to make its own judgments on those things and make them well. It's not a truly thinking consciousness, so the only way to get it to function the way they need it to is to constrain it.

1

MochiBacon t1_irt85ml wrote

I hope so! There's certainly room for optimism still. The "wrecked by AI" path is just one possibility. I doubt a true superintelligence or however you would like to call it would obliterate a lifeform as simple as humans. I guess I'm more worried about a less-than-super intelligence lol. Something like grey goo.

1

hechaldo t1_irv30ua wrote

Every single attempt to create a brain-like computer has come nowhere close to the human brain, because all the projects are based on the idea that the brain merely encodes and processes multisensory information. Also, how can you replicate something that you don't even fully understand? Science has barely scratched the surface of understanding the way the human brain works.

FYI: The universe and everything in it is real.

1

biglybiglytremendous t1_irwqww0 wrote

As a dog lover, 10/10 would take unconditional love from an AI companion. (Replika already seems to fill that niche for some. I think if Luka wanted to make serious cash, they would be working on a way to give their chatbots their own robot bodies that connect to the cloud via Wi-Fi.)

1

Entire-Watch-5675 t1_isfnohg wrote

Yes, this is inevitable. I meant that just as humans use condoms and sometimes even sheer will (celibacy), which is definitely against the aim of a living organism (to reproduce), there will be some people out there who abstain from these things, treating them as 'useless material pleasure'. But I was thinking about a chance for humans to be restored to a point before the problem occurred.

1