
Ragondux t1_j6xxdwg wrote

There may be temporary protections or ways to detect deepfakes, but ultimately that battle will be lost. Tools like StyleGAN can produce faces of people who don't exist, so even if there is no photograph of you anywhere online, such a tool could produce your face, or something close to it.

I'd say our only hope is that we'll lose interest in sex tapes once we reach the point where we can assume they're all fake anyway.

Another approach would be to use something like a blockchain to prove videos are real. You couldn't prove that a video is fake, but you could credibly claim it is if it wasn't registered on the blockchain at the appropriate time.

1

Spire_Citron t1_j70r8qy wrote

The thing is, the possibility that the sex tapes are fake won't make people lose interest, because people were already paying for those fakes to be made. They don't care that it's fake; they just want custom porn of a particular person.

2

Ragondux t1_j70t26x wrote

People will still want them, but hopefully they won't destroy someone's life.

2

Spire_Citron t1_j70vb88 wrote

It's not just people believing they're real that causes harm, though. Having pornography made of you against your will victimises people in and of itself, even if everyone knows it's not real. Everyone knew the deepfake porn videos in the latest incident weren't real, but it was still deeply upsetting for the women whose likenesses were used to find out it had been going on.

2

Mrkvitko t1_j6zrg7t wrote

There's no way to prove a video is real with a blockchain or anything like that. In the absolute worst case, you can always emulate the sensor of a real camera...

1

Ragondux t1_j70tjs4 wrote

Sure, there will be a need to trust the source, but you can still have some level of proof. Imagine that some politician says something horrible on camera and then claims it's a deepfake. Using a blockchain, you could prove that the video existed at a given date and had been vouched for by MSNBC, for example.
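To make the idea concrete, here's a minimal Python sketch of that kind of timestamped attestation. The "ledger" is just an in-memory list standing in for a real blockchain, and the publisher name is illustrative; the point is only that a hash-plus-timestamp record proves a specific file existed, unaltered, at a given time and was vouched for by a named source.

```python
import hashlib
import time

def fingerprint(video_bytes: bytes) -> str:
    """SHA-256 digest of the raw video file."""
    return hashlib.sha256(video_bytes).hexdigest()

def publish(ledger: list, video_bytes: bytes, publisher: str) -> dict:
    """Append a timestamped attestation to the (stand-in) public ledger."""
    entry = {
        "sha256": fingerprint(video_bytes),
        "publisher": publisher,
        "timestamp": int(time.time()),
    }
    ledger.append(entry)
    return entry

def verify(ledger: list, video_bytes: bytes):
    """Return the earliest attestation matching this exact file, or None."""
    digest = fingerprint(video_bytes)
    matches = [e for e in ledger if e["sha256"] == digest]
    return min(matches, key=lambda e: e["timestamp"]) if matches else None

ledger = []
video = b"...raw video bytes..."  # placeholder for an actual file's contents
publish(ledger, video, "MSNBC")

print(verify(ledger, video) is not None)   # True: this exact file was attested
print(verify(ledger, b"tampered") is None) # True: an altered file has no record
```

Note what this does and doesn't show: a match proves the file existed and who vouched for it at that date, but it says nothing about whether the footage itself is genuine.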

It's not great but I think it's the best we can do in the near future.

1

Mrkvitko t1_j710hws wrote

Which gives you nothing: it won't prove whether the video is a deepfake or not. Not to mention you're unlikely to do the verification yourself (generally, each time a video is uploaded somewhere it is recompressed, which changes its checksum). So you're relying on some "trustworthy institution" anyway.
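The recompression objection is easy to illustrate: a cryptographic hash matches only the exact bytes that were registered, so even a one-byte change from re-encoding breaks the match. The byte strings below are hypothetical stand-ins for a video before and after re-upload.

```python
import hashlib

# Hypothetical stand-ins for a video file before and after a platform re-encodes it;
# recompression rewrites the bytes even when the content looks identical to a viewer.
original = b"\x00\x01video frames..."
reencoded = b"\x00\x02video frames..."  # a single byte differs after recompression

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(reencoded).hexdigest()
print(h1 == h2)  # False: the re-uploaded copy no longer matches the ledger entry
```

So in practice only whoever holds the original file can verify against the ledger, which pushes the trust back onto institutions.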

In that case, you can drop the blockchain and just check what the institution says.

1