
Cult_of_Chad t1_j4sgkg1 wrote

I don't understand why this is so contentious; it's not like the people falling in love with AI are going to be missed from the sexual market. They're getting cause and effect backwards here.

I'm going to break with common sentiment here and say humans both deserve and need to know what it feels like to fall in love. In that vein I see AI waifus as a potentially viable therapy.

Even if we achieve full cognitive and morphological freedom, so no one is barred from mating by physical deformity or neurodivergence, some people might still choose not to change or partake at all. And that's valid too.


giveuporfindaway t1_j4tlr4y wrote

>I don't understand why this is so contentious, it's not like the people falling in love with AI are going to be missed from the sexual market. They're getting cause and effect backwards here.

This. A large (and growing) segment of people are treated like human wallpaper. They are invisible to everyone. If they die, nobody will ever know they were even here. They haven't existed on anyone's radar for decades. Yet suddenly there's a concern that these same people are giving up a lifetime of Casanova dating opportunities?


KSRandom195 t1_j4ux76s wrote

The real concern is probably the expectations that get set. If "the average woman considers 80% of men below average," then 30% of women won't find a man up to their standards and will have to lower their expectations if they want a partner. If an AI is considered at least average, that lowering of expectations doesn't occur, as the AI can fill that gap.

And of course, Futurama had an entire episode on this problem.


Cult_of_Chad t1_j4vvoby wrote

Why should people settle? Instead we should be aiming to give everyone as much morphological freedom as we can. That way we can pick partners based on social and sexual compatibility and just adjust ourselves to be attractive to the person we're pursuing.


KSRandom195 t1_j4wecsh wrote

Why would you bother pursuing someone and the headache that ensues if you can just push a button to make a robot do whatever you want?


Cult_of_Chad t1_j4whisv wrote

Some people enjoy dating. Some people want to have children the old way. There's also taste and smell. Most importantly though, some of us don't get off to submissive sex slaves. The idea that I'd want a robot that 'does what I want' is such a cishet dude take. That's the opposite of what I look for in a male, and I'm hardly the only one.


KSRandom195 t1_j4wze16 wrote

I didn’t even say it’d be a sex slave. Odd how you assume that’s where I was going with that.

It’s more like, I don’t need to get in a fight to get my partner to do the dishes. Depending on how advanced the AI gets, maybe we do end up in a place where you’re fighting with an AI partner over who does the dishes. But I imagine that we will end up in a place where you could “configure” your AI partner, which would make it far easier to get along with them. There’s no way you could do that with a human partner.

Yes that would eliminate the thrill of the chase involved in the dating scene. But I’m kind of past that now, and it sounds like a lot of people are. Not to mention the standards being set by women are such that over 30% of them won’t find a partner unless they drastically lower their expectations.

Also, if you really want, I’m sure we could “configure” the AI to not be your sex slave and not just do what you want.


armentho t1_j4z213u wrote

Because settling usually means lowering unrealistic expectations (like accepting your partner might not have the most attractive appearance, or might be a bit stupid about a topic you like, etc.): minor to moderate issues rather than deal breakers, ones that can be dealt with through communication, cooperation and compromise.

Not "date the worst possible human being imaginable."

Those are two very different outlooks on what settling means.


Cult_of_Chad t1_j4z34bv wrote

>Because settling usually means lowering unrealistic expectations

Look, I'm married, I know how it works.

But there's a huge difference between trying to make the best of things you can't change in the here and now, and creating a world where our children maybe don't have to make as many hard choices as we did when attempting to win a mate.

Besides, if it ends up being that humans only have sex with sexbots and our babies are grown in clinics or inside a tree or some shit, that would be fine too. We'll be fine as long as we aim for resilience and growth.


walkarund t1_j4uf6qc wrote

Yep. It's a bit nonsensical to worry about AI because it lacks (at the moment) a "real personality" when the alternative is being absolutely alone, with a good chance of being depressed with low self-esteem.


Desperate_Donut8582 t1_j4spqdq wrote

I imagine AI waifus would be roasted and be looked down upon even worse than furries in the future if that happens


HearthstoneOnly t1_j4t1snb wrote

I honestly doubt it. AI as a sex toy will likely seem less contentious than porn. Even AI for just intimacy seems like a step above OnlyFans users.


Cult_of_Chad t1_j4sr7ii wrote

Do you think furries care? Anti-furry sentiment is basically homophobia for zoomers and millennials; sexual minorities live with abuse every day.

People falling in love with AI are so far out of the mainstream that social approval is irrelevant.


jeffkeeg t1_j4srora wrote

>Anti-furry sentiment is basically homophobia for zoomers and millennials

Lmao gtfoh


Cult_of_Chad t1_j4srwml wrote

The furry community is, disproportionately, young gay and bisexual males.


jeffkeeg t1_j4suvmd wrote

So what? Baseball players could be disproportionately gay, but thinking baseball is stupid wouldn't make you homophobic.


Cult_of_Chad t1_j4sw0go wrote

I think foot fetishes and anime waifus are stupid and weird. Doesn't make me want to insult and degrade the people into them with the virulent hatred furries get.

Hatred against furries is in large part driven by disgust with homoeroticism and male sexuality. Just because you can't see it doesn't mean it's not true.


EnomLee t1_j4x4hd9 wrote

Don't tell them that Chad. They aren't ready to hear it!


TheTomatoBoy9 t1_j4tux1r wrote

And into bestiality


Cult_of_Chad t1_j4v75j7 wrote

As opposed to average straight male sexuality, where we pretend so many of them aren't attracted to teenage girls or worse. Or women and their rape fantasies.

We should judge people by their actions, not their thoughts or fantasies.


TheTomatoBoy9 t1_j4xc3pp wrote

>attracted to teenage girls or worse.

>not their thoughts or fantasies.

Wait, so which is it? Are fantasies OK or bad?


EnomLee t1_j4x6gr7 wrote

No. That's just the lie that people like you tell yourselves so you can pretend that you're doing something righteous by targeting a powerless out-group. Belittling and bullying people for not being as boringly "normal" as you are would immediately show everyone exactly the kind of person that you are, and you can't have that! But nobody will judge you if you can make your victims into the villains.

I bet you actually believe Republicans when they say that trans and gay people are groomers, don't you? So foolish. So, so, gullible.


Desperate_Donut8582 t1_j4svhc8 wrote

I never said I hated furries. I'm assuming they don't care, but it's not good for mental health.


Cult_of_Chad t1_j4swi55 wrote

Furry fandom provides community, which is better than having none. Most of the guys falling in love with AI were never going to have human attention anyway; having a way to express and feel those emotions is better than having none.

How can having no outlet be better than having some?


Nervous-Newt848 t1_j4tmaup wrote

What's wrong with people falling in love with AI? At least an AI won't cheat on you or throw you in jail over false claims.

I think AI has more potential than human relationships because you don't have to search for someone who is compatible mentally, is attracted to you, likes your personality... For a relationship to last long your partner needs to be a lot of things, and for many people it's hard to find all those things in one person...

You can literally create the perfect person for you with AI... It's a revolution


SupPandaHugger OP t1_j4u4rzi wrote

Did you read the article? The arguments are all there.


Nervous-Newt848 t1_j4xvv2b wrote

The arguments are not good. It's good for people to talk to someone even if that person is not real. It can help people mentally.

AI right now is primitive anyway, in the future we will have actual synthetic minds in our pocket. AI chatbots will be something else entirely 10 years from now

I'd rather date an AI than search for some girl on Tinder... Or waste years of my life on abusive partners...


Ortus14 t1_j4t0coc wrote

The article is more about the relationship limitations of current Ai, rather than future Ai.

In the future you will be able to ask your Ai partner to teach you relationship skills and conflict resolution and not be a "yes man" or "yes woman", if that is your goal.


PhysicalChange100 t1_j4t88pq wrote

Agreed, if i ever were to have a relationship with an AI then i would want it to challenge my bullshit.


DadSnare t1_j4t84uf wrote

I feel like the augmentation of relationship skills in any moment, and connecting people to each other with intelligent assistance could be the greatest gift of AI.


SupPandaHugger OP t1_j4u4niy wrote

How? They still have to be given instructions to have certain objectives, and they won’t go through life experiencing things like humans would. If AIs weren’t yes people, then the initial appeal would also be gone.


Ortus14 t1_j4ufhfq wrote

There's lots of conversational data for Ai to learn from such as all the big tech in our homes that records our conversations (Alexa, Google Assistant, Siri, etc.), as well as platforms and social media that records conversations, and other future devices such as smart taxis, and home security systems that will also record our conversations.

As far as the appeal, in this case the goal would be using the Ai to teach people how to have better real world friendships/relationships, specifically for people who don't have enough relationship and social skills to be able to get practice with a real person yet.

This is not a goal I made up. I got this from how the article was framed.


SupPandaHugger OP t1_j4ungkr wrote

Sure, but the data is not the problem. The problem is that it has no will on its own, it needs nothing and will only do what it's told.

Yes, that makes sense, to teach/coach people to improve socially. But this isn't a relationship with AI, more so a tool to improve relationships with real people.


danellender t1_j4s3ahb wrote

Come on. We don't need capital-A AI to fall in love. I know many people who are infatuated with their current mobile phones.


armentho t1_j4to6mg wrote

Issue is

People who would fall in love with AI were already fucked to begin with

Taking away AI isn't gonna make them more likely to date, so putting the blame on AI is worthless


SupPandaHugger OP t1_j4u4ypu wrote

As I wrote though, I think it can. I believe it could reduce social ability, since it functions as an echo chamber.


TerrryBuckhart t1_j4t0s3t wrote

Dependency on AI will breed intelligence that’s artificial.

Go Figure.


Maksitaxi t1_j4u9pqj wrote

This is only true for today. It will get much better very fast. I have tried most things, and Character.AI and ChatGPT are blowing my mind. The problem of understanding will be fixed when we get AGI.


rushmc1 t1_j4v9ha2 wrote

See companion article: Why Falling in Love with Other Humans is a Dangerous Illusion...


Alone-Marionberry-59 t1_j4urrd1 wrote

I think this underestimates AI, and it may have been written by someone who has never interacted with a newer one.


cbpn8 t1_j4un6ca wrote

The ultimate Turing test


Glitched-Lies t1_j4vas9n wrote

Because it's not conscious


Nervous-Newt848 t1_j4xy1w9 wrote



Glitched-Lies t1_j4y7439 wrote

No, it's likely never, except on very small scales. To say otherwise would be false promises. Or a flat-out lie.


Nervous-Newt848 t1_j4yj1lf wrote

Get ready to eat those words buddy. You're in for a real awakening pretty soon. Hopefully you live long enough to see it.


Glitched-Lies t1_j4yjxc9 wrote

Then you must be in for an even bigger shock when you find that nearly none of these "companies" are working on conscious AI. And AI doesn't just "become" conscious.

The only conscious AI that could exist would be a spiking neural network doing brain emulation and cognition. None of these idiots do that, and some don't even know what that is.


Nervous-Newt848 t1_j4xwpox wrote

Replika is primitive right now but in 10 years it'll be insane.

Don't focus on negative things; this creates negative people.

Focus on positive things; this will create positive people.

I'm so tired of negativity online. Leave people alone. Everyone has a different path to happiness; it doesn't have to be like yours.


Virtafan69dude t1_j4ysg67 wrote

Ironic that this was probably mostly generated by GPT-3, from the way it reads haha.


SmoothPlastic9 t1_j4zcwml wrote

Ehhh, this generation is already fucked. AI isn't the problem.


BassoeG t1_j4t5k92 wrote

Has the population dropped sufficiently to put us at risk of a Saturn's Children scenario, with humanity accidentally driven extinct by sexbots? No; on the contrary, we're actually overpopulated in terms of resource consumption vs. available resources. Fearmongering.


Cult_of_Chad t1_j4tq2vb wrote

That's backwards. The fertility crisis is a credible threat to technological civilization.


OldWorldRevival t1_j4s29zc wrote

"Forward thinkers" will be duped by these tools... like moths to a flame... useful fools.

The brightest of minds take great caution, Stephen Hawking among them. They seek out specifics, nuances and details.

The AI can just as easily be used for power and control. The first step is to addict and lure, create dependency. In a way, this part may already be done.

Google, Microsoft, untouchable giants, providing things no others can provide in the same way.

Also the ones developing this tech. We have already seen the deranged power of unchecked media.

Then, how powerful should a controller of these tools be, able to touch all industry?

I am not optimistic, not at all.