Submitted by Soft-Flamingo6003 t3_10rvrb7 in singularity

We’re witnessing the rise of deepfakes, with people affected by AI tech that is increasingly easy to use and abuse. With influencers like QTCinderella being the latest victims of deepfaked porn videos, it won’t be long before this spreads further and everyone, public figure or not, is targeted. The implications of this are insane. As a woman, I now fear posting any single picture of myself online. I fear for my little sister, who’s in high school. I fear interaction with each man in my life: maybe he is nice to my face, but what if he turns around and deepfakes my selfies onto something asinine like that?

What is a way to combat this without going off the grid and living like a hermit? Is there any software that can be used to manipulate the pixels in pictures in a way that would make this difficult, or at least leave a watermark or a trail?

“Software that inserts specifically designed digital “artifacts” into videos & pictures that conceal the patterns that face detection software uses.” I found this quote online about protecting against deepfakes and would like to know if such tech and software are currently available.

9

Comments


jeffkeeg t1_j6yfjov wrote

Why is everyone pretending like deepfakes are an epidemic all of a sudden? We've had this tech for five years and the world has yet to catch on fire.

7

Spire_Citron t1_j70r0nt wrote

Just because it's been around for that long doesn't mean it hasn't improved, become more accessible, started being used more, etc.

2

inquiringatheist t1_j6zogwf wrote

Sometimes it takes time for the world to catch on fire, especially when you're starting with a baseline where increasingly abusive conventional porn is normalized.

1

BigZaddyZ3 t1_j6ye367 wrote

Deepfakes will have to be handled at the legal/government level. They’ll have to be made illegal to possess or distribute, similar to how we currently treat ch*ld-abuse images. There’s not much individuals can do at the moment. (Other than going “hermit-mode” like you said. But who wants that, right?)

If it makes you feel better tho, this will most likely happen fairly quickly once powerful people start to comprehend the true scope of issues that this tech could cause if left unregulated. It’s all fun and games until some congressman finds out that creeps are making deepfakes of his 11-year-old daughter. Then you’ll see both the left and the right unite on a warpath to get this type of tech under control. So we’ll just be patient for now. Deepfakes will most likely be a temporary issue before society wakes the fuck up and comes to its senses.

6

Redditing-Dutchman t1_j6ziv41 wrote

Realistically, nothing can be done, except perhaps harsh punishments for people spreading deepfake stuff. But on a personal level it's impossible to regulate.

4

Mrkvitko t1_j6zqz7j wrote

There's no chance we'll get a ban on this before these algorithms get open sourced. Not to mention it has legitimate uses. Really, what is going to happen is that it will be even harder to get to truthful, real content.

What is going to happen is we won't be able to trust any recording we see or hear, much like we already can't trust text. That's it.

2

BigZaddyZ3 t1_j6zrxwf wrote

They don’t even need to ban the tech, just certain uses of it. All they’d have to do is make it highly illegal to possess or distribute deepfake images without the express consent of the people being depicted.

2

Mrkvitko t1_j6zvc8o wrote

Because it certainly worked and CSAM is no longer a thing, right?

−1

BigZaddyZ3 t1_j6zvtzj wrote

Do you not think that the proliferation of it has been significantly reduced to the point where it’s extremely rare and unusual to meet someone that possesses it now? Be logical. If they can get deepfakes to that point, that’s a W. No matter how you try to spin it.

3

Iffykindofguy t1_j6xwild wrote

Negative, Ghost Rider. Time to stop posting pics except privately to your trusted circle until something comes along. Also vote blue so that we can get more laws on the books in more states about this sort of thing. Sorry this is on your mind. It's a reasonable fear (that someone you know may do this, not that you'll just get caught up in some massive web of deepfakes), but unfortunately there doesn't appear to be a reasonable solution yet.

5

Soft-Flamingo6003 OP t1_j71a8ot wrote

Alright, but how far is one willing to take this logic? Don’t want to get harassed on the street? Don’t go out into the street. Don’t want to get catcalled? Wear a burqa. Putting the burden of the abuse onto the victim rather than the perverts isn’t the way to go.

P.S. I’m not in America (for the “vote blue” comment). I’m from a third-world society in a still-patriarchal country. I do identify as left.

1

Iffykindofguy t1_j71u6l0 wrote

I mean, I agree with you that it shouldn't be happening, but it has nothing to do with willingness? It's just fact. I'm not telling you to shut up and put up with it, just that what you're looking for doesn't exist yet, so either do something about it or adapt. I'm just answering the question you asked.

1

inquiringatheist t1_j6zo4n2 wrote

Here are some ideas off the top of my head:

  • Lawyer up. Prosecute deepfake porn creators. Build class action lawsuits. Form organizations that pay for prosecution.

  • Legislate. Get the attention of communities, government, law makers. Get bills drafted and passed. Ensure that the laws are enforced.

  • Educate. Spread awareness of the issues. There is a clear target demographic for people who might end up using deepfake porn. Reach them and have them understand why it's so disturbing to have this happen. Guide men and boys towards feminism. Teach girls and women about the technology, give them resources to educate themselves.

  • Protect. Build tools to find deepfake porn so that it can be reported and deleted. Make these tools cheap and easy to use. For potential targets (streamers, influencers, youtubers) who would rather be faceless, give them options like virtual avatars (vtubers) and make those options cheap and easy. Work with social media platforms to spread these tools.

4

Mrkvitko t1_j6zqbm6 wrote

I don't want to sound rude, but maybe you should try to relax, or if it gets bad, therapy.

Yes, there is a chance someone will create some deepfake porn of you. And guess what, it doesn't have to be a man :)

No, there isn't any way to prevent this, other than not having photos/videos/audio of you available at all.

Yes, the implications are insane... But not in a way you think they are.

Nothing will be trustworthy anymore. Politics will turn into a much bigger shitshow than it is now. News reporting will be plagued by fakes. But everyone's privacy will be much better protected than before.

Your ex leaked your nudes? Nobody will be able to say if they're legit or just a deepfake. A video of you jerking off got online because you forgot to put tape over your camera? Surely that must be a deepfake. Audio of you drunkenly confessing to infidelity? Deepfake.

If someone digitally stitches your face onto someone else's body in a porn movie... Why should you even care? There's no reason to worry.

2

Phoenix042 t1_j70melw wrote

An app could embed a unique signature as an invisible watermark in any picture captured by a user. The signed payload could contain various metadata, such as an ID of the app that created it, the date and time of capture, a unique user ID, and a hash of the image data itself. Signing that payload with a private key guarantees that no other software could forge the watermark or tamper with the data it contains.

The app vendor (or anyone holding the corresponding public key) could then verify the signature to confirm the authenticity of an image.

This could prove that a picture used as evidence in a court case is not a deepfake, though it does nothing to prevent people from creating deepfakes that do not carry such watermarks.

But if this became ubiquitous, it could effectively stop deepfakes from being mistaken for authentic pictures. If every camera app automatically watermarked every photo it takes with a signed key, then any photo could be verified against the public key published by the app's vendor.
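A minimal Python sketch of that signing flow (my illustration, not an existing product; the key handling, metadata fields, and use of the `cryptography` package are all assumptions for demonstration):

```python
import hashlib
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical: in practice the private key would live in secure
# hardware inside the camera app, not in application code.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

def sign_capture(image_bytes: bytes, app_id: str, user_id: str) -> dict:
    # Bundle the metadata with a hash of the raw image bytes...
    metadata = {
        "app_id": app_id,
        "user_id": user_id,
        "captured_at": int(time.time()),
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    # ...and sign the bundle, so neither the image nor the metadata
    # can be altered without breaking the signature.
    payload = json.dumps(metadata, sort_keys=True).encode()
    return {"metadata": metadata, "signature": private_key.sign(payload).hex()}

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    # The image must match the hash that was signed...
    if hashlib.sha256(image_bytes).hexdigest() != record["metadata"]["image_sha256"]:
        return False
    # ...and the signature must check out against the vendor's public key.
    payload = json.dumps(record["metadata"], sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(record["signature"]), payload)
        return True
    except InvalidSignature:
        return False
```

The catch is that this only authenticates images that carry a signature; it does nothing about unsigned fakes, and any edit or re-encode of the file breaks the hash match by design.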

2

Ragondux t1_j6xxdwg wrote

There can be temporary protections or ways to detect deepfakes, but ultimately that battle will be lost. Tools like StyleGAN can produce faces of people that don't exist, so even if there is no photograph of you anywhere online, it can produce your face or something close to it.

I'd say our only hope is that we will lose interest in sex tapes when we're at the point we could assume they're all fake anyway.

Another way would be to use something like a blockchain to prove videos are real. You wouldn't be able to prove that a video is fake, but you could argue it's fake if it wasn't recorded on the blockchain at the appropriate time.

1

Spire_Citron t1_j70r8qy wrote

The thing is, the possibility that the sex tapes are fake won't make people lose interest, because people were already paying for those fakes to be made. They don't care that it's fake; they just want custom porn of a particular person.

2

Ragondux t1_j70t26x wrote

People will still want them, but hopefully they won't destroy someone's life.

2

Spire_Citron t1_j70vb88 wrote

It's not just people believing they're real that causes harm, though. Having pornography made of you against your will victimises people in and of itself, even if everyone knows it's not real. Everyone knew the deepfake porn videos in the latest incident weren't real, but it was still deeply upsetting for the women whose likenesses were used when they found out it had been going on.

2

Mrkvitko t1_j6zrg7t wrote

There's no way to prove a video is real with something like a blockchain or anything like that. In the absolute worst case, you can always emulate the sensor in a real camera...

1

Ragondux t1_j70tjs4 wrote

Sure, there will be a need to trust the source, but you can still have some level of proof. Imagine that some politician says something horrible on camera. Then he decides to claim it's a deepfake. Using a blockchain you could prove that the video existed at a given date and had been claimed true by MSNBC, for example.

It's not great but I think it's the best we can do in the near future.
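A rough sketch of the record that would get anchored (file name and publisher are illustrative; actually writing the digest to a chain or timestamping service is left out):

```python
import hashlib
import json
import time

def video_fingerprint(path: str, publisher: str) -> dict:
    """Hash the exact file bytes; the digest, not the video itself,
    is what would be published to the chain."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large videos don't fill memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return {
        "sha256": digest.hexdigest(),
        "publisher": publisher,            # whoever vouches for the footage
        "published_at": int(time.time()),  # when it was anchored
    }

# Illustrative usage with hypothetical names.
print(json.dumps(video_fingerprint("interview.mp4", "MSNBC"), indent=2))
```

Since the hash covers the exact file bytes, any re-encode produces a different digest, so only the originally published copy can be matched.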

1

Mrkvitko t1_j710hws wrote

Which will give you nothing. It won't prove whether the video is a deepfake or not. Not to mention you're unlikely to do the verification yourself (generally, each time a video is uploaded somewhere it gets recompressed, which changes its checksum). So you're relying on some "trustworthy institution" anyway.

In that case, you can drop the blockchain and just check what the institution says.

1

scientist_rony t1_j6xy1pe wrote

Yes, there are tricks that can fool deep-learning-based face detection software. Examples include wearing clothes with certain patterns and introducing digital noise (e.g. Gaussian) into the images. Adding digital noise will not visibly change the image for humans, but it can leave the AI unable to recognize faces, or fool it into wrong detections. This can confuse GAN-based deepfake software. However, this is a rapidly changing field and models are getting better every day, so something that works today might not work tomorrow. Also, newer generative models based on diffusion processes (e.g. DALL-E, Stable Diffusion) may not be fooled by these tricks.
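The plain-noise version is trivial to sketch (file names are illustrative; this is the naive approach, not an optimized adversarial perturbation, and modern detectors may shrug it off):

```python
import numpy as np
from PIL import Image

def add_gaussian_noise(path_in: str, path_out: str, sigma: float = 6.0) -> None:
    """Add low-amplitude Gaussian noise: barely visible to a human,
    but it can degrade the features a face detector relies on."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.float32)
    noise = np.random.normal(0.0, sigma, size=img.shape)
    Image.fromarray(np.clip(img + noise, 0, 255).astype(np.uint8)).save(path_out)

# Hypothetical file names for illustration.
add_gaussian_noise("selfie.jpg", "selfie_noisy.jpg")
```

Dedicated cloaking tools instead optimize the perturbation against particular recognition models, which holds up better than random noise, though as said above, nothing in this space stays effective for long.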

1

XGatsbyX t1_j6z4lhr wrote

The thing about deepfakes is that they also create “plausible deniability.” The first wave will damage lives, because the public will mostly believe they are real; then they will become commonplace and most will assume they are fake until proven real, like a Rolex or a fake LV handbag. So in the porn example, just assume everyone is going to have deepfake porn made using their likeness, and the social effects will be greatly diminished and easily understood. Like catfishing or identity theft today, it’s likely you have been a victim or know someone who has.

1

Spire_Citron t1_j70rcr9 wrote

It doesn't stop the harm, though. Being put in porn against your will is horrifying even if it isn't real.

1

ImoJenny t1_j6znwsw wrote

Oh you think that's bad? The right-wing in the US loves fabricating evidence of imagined crimes supposedly committed by their favorite punching bag minorities. I fully expect to see deepfakes of myself saying or doing fucked up shit within the decade based on how many people accost me at the bar and accuse me of being a sexual predator just because I'm trans.

1

Soft-Flamingo6003 OP t1_j70jcln wrote

Yes, this is definitely a concern as well. I am sorry that you’ve been through that, but I can only foresee danger as this tech becomes more widespread 😫

3

ImoJenny t1_j70jhdi wrote

Hey, we'll all get through it, and honestly I think I probably came off as dismissive of your concerns. Sorry about that.

2

Kinexity t1_j6zz4ur wrote

Post your pictures only in places where only your friends can see them, or don't post them at all. Ask yourself if you even need to post them in the first place; maybe choose to show your photos to your friends during meetings instead. The only way to make a photo unusable by some future AI is to turn it into random noise, which defeats the purpose of posting it.

> I fear interaction with each man in my life

Paranoia speedrun 101. A creep will get your photo anyway. Projecting your fears onto random men will only be detrimental to your mental health and social relations.

1

nicka163 t1_j70dd5g wrote

Deepfake pornography has been around since the creation of Photoshop. If you are not a wannabe celeb or a public figure, it’s not something you really need to worry about. This is classic alarmism.

1

Spire_Citron t1_j70rpr4 wrote

Until it becomes something anyone can do with no effort. Sure, it's been possible for a long time, but the barrier to entry was so high that who would bother?

1

Nmanga90 t1_j742853 wrote

No, it’s too late lol. It’s advanced to the point where very accurate 3D models can be made from 2 pictures. Look up NVIDIA NeRF.

1

GenoHuman t1_j742u8d wrote

I don't think there is much to do about it, and even then, the entire internet will be filled with AI-generated pornography of both real and fictional people soon enough.

1

socialkaosx t1_j6xxwon wrote

I think that when computers and photo-editing tools came along, people said the same thing.

"As a woman, now I fear to post any single picture of myself online." Why do you publish your photos online at all? You give them to the whole world, and suddenly you have a problem with it?

Besides, what you are writing about is an extreme case, and we deal with such cases through the courts. I don't think anything will change in this matter.

0

SentientBread420 t1_j6zw83w wrote

AI is far more powerful, efficient, and effective than Photoshop, and these tools will eventually become even more effective and accessible.

3

Redditing-Dutchman t1_j6zihvo wrote

Yes, you are correct. There are already many websites that offer deepfake options, and since you can upload any picture and any video you want to replace the faces in, people can upload whatever, and whoever, they want.

I'm quite happy that I'm not in school anymore. Cyberbullying will be so easy this way. School is such a fast-moving social environment that even if you had detectors for deepfakes, nobody would care. The damage would already have been done.

0

Smellz_Of_Elderberry t1_j70fo0o wrote

Part of the problem is that we view sex as some taboo that needs to be hidden. In the future, it's going to be even more open. Video porn has already largely desensitized people. Long gone are the days of showing an ankle being sexually risqué.

DeepNude is a thing, and what do you think that technology is more likely to do? Be used as a weapon? Or be used to the point that people stop being so worried about something as inane as nudity? Large parts of the world don't care about it: they have communal baths, or outright just don't view nudity as sexual.

This stuff will result in culture shock and a radical change in generational norms. Your generation freaked out when Jill had her nudes leaked. The ones coming after deepfakes just won't care, and will think you a weird old person with weird cultural beliefs when you do. They will look at you and me the way we looked at our great-grandparents demanding we finish everything on our plates, because they were raised by a generation alive before the invention of refrigeration.

−2

Spire_Citron t1_j70rlcz wrote

I don't think people will ever just get used to and be cool with people putting them in pornography against their will. Maybe the people consuming it won't think anything of it, but the people who are victimised by it sure will.

2

Smellz_Of_Elderberry t1_j71svc1 wrote

I don't think you understand what happens when something becomes as easy and quick as photoshopping a duck's head onto someone.

>I don't think people will ever just get used to and be cool with people putting them in pornography.

Also, of course you don't. You weren't born into a time where it was as easy to do as flipping a light switch. Maybe they won't "be cool" with it. But it won't have any of the societal effects or mentally abusive effects anymore, which is the real harm of such things. Do you think people will ever get used to realistic violence in movies? Lol. Of course we know they did.

I'm not saying this is the world I want, btw. Just the world that is coming. Those born into it adapt. When there are millions and millions of these videos, they will cease to have any real meaning anymore.

You not being able to believe it doesn't change that. It just means you will have a hard time adapting.

0

Spire_Citron t1_j73wkkv wrote

Things can be common experiences and still deeply upsetting. Women already experience a lot of sexual exploitation and so far it hasn't become something they're not bothered by.

1