Comments


Tidus1117 t1_j5jpmqk wrote

I'm a male and I got at least 20 shirtless photos in my results... so I guess I was objectified too?

66

IG_Rapahango t1_j5jicin wrote

well, it's not a surprise; AI is fed data provided by humanity

54

kevin379721 t1_j5jtzqf wrote

Yea I mean idk how this isn’t the top response. Where else do people think it’s getting the data? Lol

5

Sardonislamir t1_j5mo2do wrote

Not the top result for sure, more like the topless results...

0

Hagisman t1_j5jkqjb wrote

Seems like the data being fed to it is biased. Got to put in more realistic photos and artwork, not just stuff that’s “supermodels”.

I wonder where they got their data set from.

23

Cakeking7878 t1_j5jstgd wrote

I think to make the AI model less biased, you really need photos of everyone from every angle. People aren't making hyperrealistic artwork or photos of people who aren't conventionally attractive. In a way, you need bad/ugly photos of people, which most people aren't taking and posting to the internet.

15

scotchdouble t1_j5kjcup wrote

AI is almost always biased because the data sets are made by humans. An old colleague of mine has been working on AI for hiring, and it is a classic example of "feed in the CVs/resumes of the hires you want." The issue is that those traits, that education, etc. are predominantly white, so they have had to work backwards to adjust the data set and remove anything that could be biased... and this is virtually impossible.

7
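The hiring example above is worth making concrete: stripping the protected field out of the data rarely removes the bias, because correlated features leak it back in. Here is a toy sketch with synthetic data (everything here is hypothetical, not anyone's actual pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic "protected" attribute; it is never shown to the model.
group = rng.integers(0, 2, n)

# A proxy feature correlated with the group (think zip code or
# alma mater encoded as a number).
proxy = group + rng.normal(0, 0.5, n)

# Historical labels are biased: group 1 was hired far more often.
hired = (rng.random(n) < 0.3 + 0.4 * group).astype(int)

# Train without the protected attribute, using only the proxy.
model = LogisticRegression().fit(proxy.reshape(-1, 1), hired)

# The model still rates the historically favored group higher.
for g in (0, 1):
    p = model.predict_proba(proxy[group == g].reshape(-1, 1))[:, 1]
    print(f"group {g}: mean predicted hire probability = {p.mean():.2f}")
```

The protected column was never in the training set, yet the predictions still split along it, which is why "working backwards" to scrub the data is so hard.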

sis-n-pups t1_j5lo8kt wrote

"bad/ugly " ... i think the word you're looking for there is realistic

2

Hagisman t1_j5jwj59 wrote

It's less that you need ugly photos and more that you need normal ones. If you only use pictures of people on runways, you aren't getting the people who can't afford designer clothes.

Similarly, pictures taken only on vacation don't show the gloom of a funeral.

Diversity helps keep the results diverse.

If you only flooded it with pictures of people wearing sweaters, the system would never know what swimsuits and t-shirts are. 😅

1

hectorgarabit t1_j5jxswl wrote

I think the goal of this model is to create an avatar: basically, a drawing that makes you look better. If you add "average" or non-"conventionally attractive" people, the AI will create an avatar based on you plus some average- or ugly-looking person. Does anyone really want an avatar that makes them look uglier? Or just as average? Would you rather have an avatar of you as a cosmonaut or as an accountant?

EDIT: just thought about this:

Also, if you look at Instagram or Facebook, many if not most of the most popular profiles/photos are very sexualized. Monkey see, monkey do... AI see, AI do!

8

Hagisman t1_j5l5lmm wrote

This is where a biased data set doesn't help. In seminars on AI bias, the usual advice is that you want a diverse data set in order to draw plausible conclusions.

If you are poisoning the data set by overusing one type of data, such as Instagram models, you'll want to offset it. Unless, of course, your app has a "turn my picture into an Instagram model" feature, in which case you filter your data set down to just that specific tag.

−3
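"Offsetting" an over-represented slice like that usually means down-weighting it rather than deleting it. A minimal sketch of inverse-frequency weighting (the tags and counts are made up for illustration):

```python
from collections import Counter

# Hypothetical tag counts for a skewed training set.
tag_counts = Counter(instagram_model=800, candid_photo=150, formal_portrait=50)
total = sum(tag_counts.values())

# Inverse-frequency weights: every tag ends up with equal total weight,
# so no single slice of the data dominates training.
weights = {tag: total / (len(tag_counts) * n) for tag, n in tag_counts.items()}

for tag, n in tag_counts.items():
    print(f"{tag}: {n} samples, per-sample weight {weights[tag]:.2f}")
```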

Sin3Killer t1_j5jofvk wrote

The article states that the Lensa app uses Stable Diffusion, which is trained on the open-source LAION-5B data set.

5
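For anyone curious what's actually in that data set, the LAION metadata (captions, image URLs, filter scores) can be streamed rather than downloaded. A rough sketch using Hugging Face's datasets library, assuming the laion/laion2B-en English subset is still hosted under that name (hosting and schema have changed over time):

```python
from datasets import load_dataset

# Stream metadata rows instead of downloading billions of entries.
ds = load_dataset("laion/laion2B-en", split="train", streaming=True)

# Peek at a few rows: each is the caption, image URL, and filter
# scores for one scraped image (exact column names may differ).
for i, row in enumerate(ds):
    print(row)
    if i >= 2:
        break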

typesett t1_j5pyibv wrote

rofl if AI put dad bods on people, i wonder what those articles would be like

or would people just never use it lol

0

Kafke t1_j5jiw8e wrote

Never had this issue with Stable Diffusion. Perhaps stop using apps made by corporations? I just finetuned a model on my own pics, then was able to generate stylized pics just fine.

It's not AI's fault. It's the company's fault.

16
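Generating from a personally finetuned checkpoint really is only a few lines with Hugging Face's diffusers library. A minimal sketch, assuming a Stable Diffusion model already finetuned on your own photos (e.g. via DreamBooth); the local path and the "sks person" token are placeholders for whatever the finetune actually used:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load a checkpoint finetuned on your own photos (path is hypothetical).
pipe = StableDiffusionPipeline.from_pretrained(
    "./my-finetuned-model", torch_dtype=torch.float16
).to("cuda")

# "sks person" stands in for the rare token the finetune bound to
# your likeness during training.
image = pipe("portrait of sks person as an astronaut, digital art").images[0]
image.save("avatar.png")
```

Since you control the training images and prompts, you also control how the outputs come out dressed.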

Words_Are_Hrad t1_j5ks18d wrote

Breaking news: AI trained on human data objectifies humans the same way humans do! Clearly this is a problem with the AI!

15

LilShaver t1_j5jmthb wrote

I notice she didn't supply the breakdown of the generated avatars NOT used by her colleagues.

So this is not exactly an objective comparison.

Having said that, let me just add the "The internet is for porn" song

13

kiwibutter088 t1_j5jqqms wrote

I used this app to make AI photos of myself and of my husband. It was an obvious difference.

−1

LilShaver t1_j5kromy wrote

Would you be willing to provide numeric values, as the woman in the article did? E.g., "16/100 images generated involved nudity."

1

kiwibutter088 t1_j5l9wwf wrote

Sure, but what criteria do I count with? Pictures that appear as though there is no shirt? Cleavage? Form-fitting clothing? There's one where it looks like the body is painted silver; do I count that? It's pretty subjective, and I want to make sure I give you the answer you are looking for.

3

kiwibutter088 t1_j5lb0yb wrote

Out of 100:
18 pictures appear to have no shirt
25 additional pictures with cleavage
1 photo with no shirt and cleavage (hair covering nipples)
1 silver "painted body"

For my husband, out of 50:
1 deep-V robe
2 pairs of tight pants (one is armor with a questionable codpiece, the other skinny jeans)

3

EmmaNoir12 t1_j5jry02 wrote

A lot of free time to think up all of this BS, right?

5

Carioca1970 t1_j5js2v3 wrote

Is this a tutorial? Because try as I might, the AI refuses to sexually objectify me.

4

Jawnze5 t1_j5kcoav wrote

Can you really be objectified if it's pulling from a preselected collection of data? It's basing its output on what already exists out in the world. So maybe the problem is what is common in society, not the AI. AI isn't biased, it just uses what is available to it.

2

km89 t1_j5kjd17 wrote

>AI isn't biased, it just uses what is available to it.

I mean, that's a little convoluted.

The bias in the data set creates a bias in the AI. Remember that AIs aren't necessarily looking at the data set every time they need to create something: they're trained on it, but the model itself doesn't reference the data set once training is done.

So yeah--the initial bias definitely comes from the people who select the training data. But the bias persists in the absence of that data, too.

0
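A toy demonstration of that persistence: fit a model to skewed data, throw the data away, and the skew is still sitting in the fitted parameters. All numbers here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Skewed "training set": the label is 1 far more often when the feature is 1.
x = rng.integers(0, 2, 10_000)
y = (rng.random(10_000) < np.where(x == 1, 0.9, 0.1)).astype(int)

# "Training" here just memorizes two conditional rates; that is the model.
params = {v: y[x == v].mean() for v in (0, 1)}

del x, y  # the training data is gone...

# ...but the learned association survives in the parameters.
print(params)  # roughly {0: 0.1, 1: 0.9}
```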

SvenTropics t1_j5kxaor wrote

This is just clickbait. No reasonable person who chooses to send in a bunch of photos and pay an AI service to generate images of them, delivered privately to post, delete, or do whatever with, genuinely has a problem with a couple of provocative photos coming back. Yes, maybe a few unreasonable people do, but we don't need to bend reality for unreasonable people.

2

typesett t1_j5q11ma wrote

i think you described a new job that will exist later this year

AI art jockey that makes you look good for ego

___

as i typed this i realized it already exists, but i suppose the infrastructure of the industry is being built as we speak

1

SvenTropics t1_j5qavz0 wrote

I mean, if I ever end up on a dating app again, I'll probably use one or two of my Lensa photos.

0

zdakat t1_j5jo896 wrote

stimulate your senses

1

Prudent_Possession64 t1_j5k5i0s wrote

ABS = Anti-Lock Brakes (likely not working)
Car symbol with lines = traction control out
Owl eye symbol = means an owl is watching you

1

FluxChiller t1_j5k9fis wrote

THEY ARE TAKING OUR JOBS!! /s

1

AstroBoy26_ t1_j5mkayx wrote

Oh my god just live your life who gives a shit

1

littleMAS t1_j5n6c0d wrote

What happens when you sexually objectify AI? Does AI call DoNotPay to sue you?

1

whatsupdude0211 t1_j5nvazc wrote

What’s wrong with only wanting to date Asian women?

1

kerkula t1_j5jir1c wrote

Just a guess, but maybe it was a bunch of males who developed this AI. As my wife always says: you hire a turkey farmer, you get a bunch of turkeys. In this case, you hire a lot of young men with computer skills, raging hormones, and poor social skills, and you get porn.

−27

dumb_password_loser t1_j5jmrld wrote

The model was trained on images scraped from the internet.
Go on any art site and type in "woman" or "man" and you will see a difference.

And I'd argue that it isn't 100% males who are the cause of this.
I stopped using Facebook and similar social media a while ago, but from what I remember, except for one or two gym bros, it was mostly women who posted scantily clad pictures of themselves.
They won't upload pictures of themselves that they don't like for their friends and family to see.

People post what they like or find interesting on the internet, and that's what AIs learn.

5

Erriis t1_j5jlqd6 wrote

me when only antisocial adolescent male programmers are horny

4