Comments

JwSatan t1_jdy9mqx wrote

Police should be required to get a warrant to use Clearview

416

danger_davis t1_jdzj6yf wrote

Clearview is willingly giving police access to the photos. Warrants are for situations where the person possessing the photos doesn't want police to have them.

44

firehawk1115 t1_jdzrn6k wrote

I guess it depends on where Clearview gets the photos. If they scrape them from the internet and someone didn't explicitly give permission, that person probably wouldn't want Clearview to have them, and by extension the police.

34

DigitalArbitrage t1_jdzvvll wrote

They were the lowest bidder on a government contract to provide face scanning for signing in to file federal taxes. Basically every adult in the U.S. would have been required to give them face scans if civil liberty groups hadn't pushed back on it.

For decades police in some states (e.g. Florida) have also been using databases of driver's license photos to search for fugitives.

32

worldofzero t1_je09byu wrote

Clearview scrapes online info and will not let you remove yourself from their database without giving them your government ID. If you or your friends use social media and share pictures, you're in their database and there is nothing you can do about it. They're ethically a black hole.

13

Iohet t1_je0lbwo wrote

If they're publicly accessible on the internet, I don't think it really matters. You don't need a warrant to look at someone's public Facebook, but you do to get to their private messages.

8

bshepp t1_je13eqk wrote

Posting something that is accessible to the public doesn't automatically make it public, and posting something publicly doesn't necessarily give anyone the right to copy it, use it, or make derivative works from it.

4

Iohet t1_je1ycpv wrote

If you're trying to argue that you should be able to issue a DMCA Takedown, then by all means try that route, but as far as complying with the 4th amendment, public is public, and if you give something to an intermediary, the 4th amendment rights are the intermediary's rights, not your rights.

7

bshepp t1_je2187x wrote

If you leave your house unlocked, does that mean your house is now a public space? That is the point I am trying to make. Uploading something to the web does not necessarily make it public. It's entirely possible they used only public resources in this instance. Again, I'm just saying accessible to the public doesn't necessarily mean it is public.

4

Iohet t1_je2210o wrote

Yea, but the door is open. Plain view doctrine. As far as Facebook goes, it's based on reasonable expectation of privacy. The courts have found that private messages have a reasonable expectation of privacy, but not public posts. They can't get into items locked behind your login without a warrant, but publicly posted things are fair game.

4

Plenty_Branch_516 t1_je238ip wrote

Yeah, you're right. The law is clear here. Is it just? Probably not, but the law often strays from that course.

1

Iohet t1_je27ow8 wrote

I'm not even sure it's not just. If you publicly post something that ties you to an illegal activity, that's on you. If you privately post something, you're afforded some level of privacy, but, again, once you give it to Facebook, they're the ones served the warrant, not you, and they don't give a shit about you enough to fight it. So, really, just don't do it.

Absolutely push your legislators to ban this type of data collection, but, in the absence of that, just because new methods of accessing old public data are better doesn't mean the concept is no longer just, and one should be aware of that before saying anything that could hurt them.

3

bananafobe t1_je3bchy wrote

>If you publicly post something that ties you to an illegal activity, that's on you.

I think this is part of the issue. It's not necessarily posting photos of yourself committing crimes, but rather a potentially flawed program using a database of unrelated photos to link you to a crime that you may have had nothing to do with.

1

Plenty_Branch_516 t1_je29c0e wrote

"Just" short for justice is a finicky concept. As technology improves what can be considered the boundary for privacy begins to falter.

In a photo someone else posted? Your privacy is gone. Connect to public wifi? Your privacy is gone. Ping off a cell tower? Your privacy is gone. One of your relatives use ancestry.com? They can trace you to any DNA found even if you aren't in a criminal database.

Point being, you are not in control of your privacy. Far from it.

One doesn't need to give permission to join the pool of public data. In fact it requires significant effort to avoid it and in rare cases retract information added to it. A right to privacy is usually considered just. However, as our technology improves and it becomes easier to link disparate sources of data together, that right to privacy is eroded. Potentially even sold for a pitiful sum.

Don't get me wrong though. It's all a net good. For most of us, privacy is an incredibly cheap price to pay for the boons we are getting in science and technology.

0

JwSatan t1_je08b77 wrote

I did not give the police approval to use my photos

>The right of the people to be secure in their persons, houses, papers, and effects,[a] against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.[2]

10

__Arty__ t1_je0m915 wrote

You gave permission to whatever app you posted the photos to.

12

Socialistpiggy t1_je0pooa wrote

If your photos are publicly available, or you give them to someone else (Facebook, Instagram, etc) you no longer have a privacy interest in them. When you willingly give something to someone else, it is no longer yours. You can have an agreement (civil contract) that you expect that company to keep things private, but they can still voluntarily give them to the police. Your remedy would be to civilly sue them for breach of your contract, but said photos can still be used against you.

4

danger_davis t1_je0y5yy wrote

You don't have to once your photos are publicly accessible. If the photos are in your home and not publicly available on the web then the 4th amendment would be applicable.

3

bananafobe t1_je3ahll wrote

I don't think the issue is necessarily that they're not allowed to provide access (though we shouldn't just assume they are), but rather, if they are allowed, whether that is something we want as citizens, and whether they have an obligation to let us opt out, at the very least.

1

danger_davis t1_je3bvib wrote

The problem is that I want criminals to be caught, but I don't want a police state hampering freedoms. Traffic cameras and pictures taken from open internet sources don't bother me. If they were snooping into my Google Drive or Apple account without a warrant, that is where I would be livid. But if I create a Facebook page with my face as the front photo, I am not going to be upset when the government is able to see that photo. There is no expectation of privacy there.

2

ytaqebidg t1_jdzeii3 wrote

My favorite part:

"There are a handful of documented cases of mistaken identity using facial recognition by the police. However, the lack of data and transparency around police use means the true figure is likely far higher."

289

Artanthos t1_je31qmn wrote

The important question is whether it is more accurate than human witnesses identifying suspects from photos.

Pretty certain this software is going to be more accurate. Human identification has always had a substantial error rate.

3

bananafobe t1_je39ig8 wrote

At the same time, that perceived accuracy can mean a false positive is less likely to be questioned, compared to an eyewitness whose testimony can be interrogated.

A defense attorney asking a jury to consider whether a witness's recollection seems trustworthy can appeal to a juror's experience with their own memory being unreliable. A defense attorney trying to explain a statistical probability resulting from AI coding has an uphill battle, given how many of us basically assume computers are magic.

21

Artanthos t1_je5phv5 wrote

No photograph is blindly accepted. A lot of human eyes will be on both the images and the person between the moment they're named as a suspect and any conviction.

That includes the defense attorney.

2

bananafobe t1_je64ry9 wrote

And when the AI generates a face from a partially obscured or low-resolution photograph, and presents that as a scientifically accurate representation with 99.9% validity in clinical studies (or whatever), how easy is it going to be for the average public defender to explain to a jury that what they're seeing is basically a computer drawing, even though it looks like a photograph, and that 99.9% actually refers to a statistical probability about some obscure metric, and not that it's 99.9% likely that this is the right person?

1

Artanthos t1_je67as6 wrote

That’s not how facial recognition works, and it’s not how the technology is used.

All this does is compare images from a camera connected to a crime with a database of publicly accessible photos. When it finds matches, it provides the match locations, e.g. Facebook.

Police investigators then use those leads to identify potential suspects.

You still have the rest of the investigation, and human eyes on the images and the potential suspects.

2

Rstrofdth t1_je51lvh wrote

Then why won't the CEO testify in court to its accuracy?

"Mr Ton-That told the BBC he does not want to testify in court to its accuracy. "

5

Artanthos t1_je5o19d wrote

Very few people are going to willingly testify in court unless they have an ax to grind.

If the court had a real reason for his testimony, it could compel his appearance.

1

PicklerOfTheSwamp t1_jdy4fde wrote

Wtf is this, some Minority Report shit?!?!

250

jetbag513 t1_jdy4n28 wrote

Seems more like the Majority at this point.

103

fiftyeleventimes t1_jdybm2y wrote

Sadly true.

21

supercyberlurker t1_jdyktnc wrote

We all fear what's coming.

That thing where there are cameras everywhere, hooked up to machine learning systems like hall monitors, microticketing us into complete compliance with every regulation everywhere we go, digitizing our every public moment.

What's worse is, there won't even be the three seashells.

68

gothfreak90 t1_jdyltyo wrote

Can we at least have rat burgers and beer?

14

StruggleSnuggled t1_jdz1bf5 wrote

Rat burgers and beer or Taco Bell.

5

Magatha_Grimtotem t1_jdz5zys wrote

I'll take the rat burger. At least you know what's in that.

7

Same_Lengthiness9413 t1_jdzk7lp wrote

Having eaten rat, it’s not that bad… could always be worse. Could’ve been no rat…

1

mobileagnes t1_je0jr6v wrote

China already has this, in a way, with their social credit score system. 😱

2

empfindsamkeit t1_je1l1q4 wrote

You know we have control over all those regulations, right? If they're law and it's okay for a relative few to be randomly caught and punished, it should be okay for 100% to be punished, or it should never have been law in the first place. And the proper avenue is repeal rather than just trying to ensure most laws aren't enforced most of the time.

1

Indercarnive t1_je4s1o0 wrote

No no, I'm pretty sure this stuff is mostly being used to make reports on minorities.

2

RonBourbondi t1_jdzsrvn wrote

Sure, your tune would change if this helped identify someone who kidnapped a child.

−37

Mobely t1_jdzvrtd wrote

And if it identified the wrong person? Who could not produce the child they don’t have? And was sentenced to death for presumably murdering said child, since the child is nowhere to be found?

32

An_best_seller t1_je0dfi6 wrote

Trigger Warning: Mass-shooting, rape.

I think that this technology shouldn't be used as evidence, but just as a tool to find potential (but not definitive) perpetrators and to find suspects way faster. However, a person should only be sentenced by a judge if investigators find evidence of the crime that is not based on their face.

Here are some examples:

  • There is a mass shooting. A camera captures video of the mass shooter's face. Police don't know who or where the mass shooter is. Police use Artificial Intelligence to find people who have a similar face to the mass shooter. They find 7 people with the same face in the USA. They start investigating each person and find that 1 of the 7 face-suspects bought a gun that takes the same type of bullets as the ones found at the crime scene. They also find that this one suspect has shoes that match the shape of the bloody footprints at the crime scene. And they find that the suspect had been searching Google Maps for the location of the crime scene before the mass shooting happened. They arrest this one suspect, keep looking for more evidence, and finally go to trial, where the overwhelming evidence shows they are guilty, so they are sentenced to life in prison or the death penalty (I'm not going to argue right now whether the death penalty is wrong or right. That's off-topic).
  • A woman is raped by a man. A camera at a bar captures video of the rapist's face. Police don't know who or where the rapist is. Police use Artificial Intelligence to find people who have a similar face to the rapist. They find 9 people with the same face in the USA. They take DNA samples from each of these 9 people and compare them to the DNA from the semen found on the victim's body. It happens that 1 of these 9 people has the exact same DNA. They start investigating this man and find that his friends were at the same bar as the crime scene on the same day as the crime. His friends tell police that they were with the suspect at that bar on that day. The man goes to trial, more evidence of the crime is found, and he is sentenced.

As you can see, I don't support using Artificial Intelligence as definitive evidence to sentence someone to prison time or the death penalty. But I think it can make the process of finding possible perpetrators much easier and much faster, and then allow police to start looking for evidence on each of the suspects. If police don't find evidence on one, several, or all of the suspects, they should let those people go. A suspect should only be sentenced if police find more evidence than the Artificial Intelligence match alone. Of course, when I say they should be sentenced if police find "more evidence" against the suspect, I mean solid and important evidence. I don't mean evidence such as "The suspect lives in the same city as the victim, therefore they are guilty". I mean high-quality evidence.

By the way, I don't know much about crimes, types of evidence, or police protocols, so take what I say with a grain of salt. I'm just guessing at what the process could look like.

−2
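To make the "lead, not evidence" workflow in the comment above concrete, here is a minimal sketch. The Candidate type, the evidence labels, and the two-item threshold are all hypothetical choices for illustration, not anything from the article:

```python
from dataclasses import dataclass, field

# Toy sketch of the "lead, not evidence" workflow described above:
# a face match only narrows the candidate pool, and a case proceeds
# only on evidence gathered independently of the match itself.

@dataclass
class Candidate:
    name: str
    face_match_score: float                 # a lead, never proof by itself
    independent_evidence: list[str] = field(default_factory=list)

def can_proceed(c: Candidate, min_items: int = 2) -> bool:
    # The match score is deliberately ignored here: only corroborating
    # evidence (DNA, purchases, location history, witnesses) counts.
    return len(c.independent_evidence) >= min_items

suspects = [
    Candidate("face-suspect-1", 0.99),      # strong match, no corroboration
    Candidate("face-suspect-2", 0.91, ["dna_match", "witness_statement"]),
]

for s in suspects:
    verdict = "investigate further" if can_proceed(s) else "let go"
    print(f"{s.name}: {verdict}")
```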

RonBourbondi t1_jdzwb2z wrote

So don't use all your tools before they kill the child?

−10

Mobely t1_je038qm wrote

Well, if we're going down that road: almost all child kidnappings are done by the kid's other parent, and it's not to kill them.

So if we are looking to stop all child kidnappings that result in the child's murder, we would have an insanely high false positive rate. You'd be jailing thousands of people, leaving their kids orphaned and vulnerable to violence. So yeah, don't use the shitty tools to cause more harm than good.

False positive rate: FPR = FP/N = FP/(FP + TN), where N is the number of people who are actually innocent.

12
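A toy calculation to make the base-rate point above concrete; every number here is invented purely for illustration:

```python
# Toy base-rate arithmetic: even an accurate matcher mostly flags
# innocent people when the thing it searches for is rare.
# Every number below is invented for illustration.

population = 100_000_000    # people whose photos are in the database
actual_perps = 100          # actual perpetrators among them
tpr = 0.99                  # true positive rate: TP / (TP + FN)
fpr = 0.001                 # false positive rate: FP / (FP + TN)

true_positives = actual_perps * tpr
false_positives = (population - actual_perps) * fpr

precision = true_positives / (true_positives + false_positives)
print(f"Innocent people flagged: {false_positives:,.0f}")
print(f"Chance a flagged person is guilty: {precision:.2%}")
# Roughly 100,000 innocent people flagged, and a flagged person is
# guilty only about 0.1% of the time.
```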

RonBourbondi t1_je065xw wrote

Nah, because this goes off of pictures to identify people.

Not only that, cops post pictures of suspects on the news all the time. AI is no different from crowdsourcing, except it's more accurate.

−10

Joe-Schmeaux t1_je0j9l0 wrote

So trust the police with even more powerful tools?

4

RonBourbondi t1_je0x8jl wrote

So hinder an investigation and have a child not be saved from murder?

−1

Joe-Schmeaux t1_je15p0r wrote

So trade one set of murders for another?

2

RonBourbondi t1_je16edc wrote

Who's getting murdered from this?

0

Joe-Schmeaux t1_je170nr wrote

From the police misusing identification software and apprehending innocent people who end up in prison? Any such person would be at risk of murder or suicide. It's a shitty situation as is; let's not add things that can make it worse and give the already powerful, corrupt police forces of the world even more power. Trusting them not to misuse this can make things even worse, and we'll still have people being kidnapped.

3

RonBourbondi t1_je17iu7 wrote

So has there been a single case where they used the AI software and this happened?

−1

Joe-Schmeaux t1_je18w64 wrote

I just googled it, and this was the first article to come up. He spent ten days in jail and $5,000 on legal defense. This was three years ago. He may not have suffered physical harm, but the potential for misuse and abuse of this kind of power is concerning.

3

usalsfyre t1_je37ndv wrote

You’re not supposed to deep throat the boot….

1

Caster-Hammer t1_je04ib5 wrote

Let's play "find the fascist."

11

RonBourbondi t1_je069c7 wrote

So you're against cops posting pictures of criminals on the news in search of tips about who they are?

−4

Caster-Hammer t1_je1zs0p wrote

So you're for moving the goalposts to defend an encroaching police state?

3

RonBourbondi t1_je20tvg wrote

If you want to call it that, go ahead.

Nothing wrong with using tools to track down criminals.

1

piTehT_tsuJ t1_je0162o wrote

It would be great if it did. The problem is that facial recognition isn't anywhere near 100% accurate, and this could lead to false arrests and the real kidnapper getting away.

7

RonBourbondi t1_je01dk0 wrote

Yeah, I will gladly take a false arrest that can easily be cleared up over a dead child.

−3

Antnee83 t1_je0brsg wrote

"easily cleared up"

Tell me you have no experience with this without telling me...

Have fun getting a job when every background check shows "ARRESTED FOR KIDNAPPING A CHILD." Good fucking luck "clearing that up easily"

12

RonBourbondi t1_je0c8i6 wrote

So are you also against releasing a suspect's pictures on the news, which can lead to tips and visits from the police even on incorrect identifications?

Because this is no different.

1

TogepiMain t1_je0e2zc wrote

I sure am! You know how many lives are ruined by being thrown up on the "suspect" wall? No one cares that they didn't do it; all that matters is that their photo was in the news with the words "probably did a crime??" underneath.

7

RonBourbondi t1_je0etka wrote

Yet countless people have been caught this way, and posting pictures of the actual perpetrators to crowdsource who they are has saved lives.

0

piTehT_tsuJ t1_je0ky6t wrote

Crowdsource? Like Reddit's hunt for the Boston bombers...

5

TogepiMain t1_je0fd8k wrote

Gonna need some actual numbers on that, or else who can say.

They're probably not worth the damage, long term.

4

Paizzu t1_je0xlzu wrote

> "Think of the children" (also "What about the children?") is a cliché that evolved into a rhetorical tactic.

https://en.m.wikipedia.org/wiki/Think_of_the_children

4

RonBourbondi t1_je0z8a0 wrote

Think of the kidnapped, then. Lol.

If you have footage and pictures of the perp, run them against an AI database to help narrow down suspects and save lives.

Not particularly controversial.

−1

Treadcc t1_jdz2vi3 wrote

The problem is that dumb cops who are bad at their job will use this as a crutch to skip steps and build cases around the wrong people, just like they have done before. So as much as I'd like to have better tools to catch criminals, our unchecked and bad police processes will cause innocent people to get swept up.

168

Art-Zuron t1_jdzvypi wrote

AI is very susceptible to biases, so I'm concerned that police will turn it into a racial profiler as well.

77

MhaloTov t1_je001ag wrote

I’m sure you won’t be surprised to hear that exact thing has happened before.

42

Art-Zuron t1_je01a0j wrote

Amazon's hiring algorithm once had to be turned off because it became racist and sexist. It was automatically excluding non-white- and female-sounding names.

IIRC, it was because the majority of applicants were white males. As a result, most of the hires were white males. The AI took this to mean that white males were the ideal candidates and recommended them more, which caused a feedback loop.

55
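A minimal sketch of that kind of feedback loop, with invented data: a model that just imitates biased historical decisions learns group membership as if it were a qualification.

```python
import random

# Toy sketch of the feedback loop described above. Historical hires
# skew heavily toward group "A", so a model that imitates past
# decisions learns group membership as if it were a qualification.
# All data here is invented for illustration.
random.seed(0)

history = [{"group": random.choice("AB")} for _ in range(10_000)]

def historical_hire(candidate):
    # The past (biased) process: group A was hired far more often.
    rate = 0.8 if candidate["group"] == "A" else 0.2
    return random.random() < rate

hired = [c for c in history if historical_hire(c)]

def learned_score(candidate):
    # "Model": rate candidates by how often similar past candidates
    # were hired, i.e. imitate the historical decisions.
    group = candidate["group"]
    same_hired = sum(1 for c in hired if c["group"] == group)
    same_total = sum(1 for c in history if c["group"] == group)
    return same_hired / same_total

print(learned_score({"group": "A"}))  # ~0.8
print(learned_score({"group": "B"}))  # ~0.2
# Feeding these scores back in as new "hiring decisions" only widens
# the gap on the next round of training.
```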

PMme10DollarPSNcode t1_je0anus wrote

There's also Google's infamous "Gorilla" Blunder: https://www.google.com/amp/s/www.bbc.com/news/technology-33347866.amp

24

Artanthos t1_je32oli wrote

8 years is an eternity in AI development.

Comparing a modern AI to something from that long ago would be like taking someone with a master's degree and judging them by the work they did in kindergarten.

1

Prcrstntr t1_je0ygz0 wrote

The gorilla problem is not a simple one to solve.

−5

RevengencerAlf t1_je0ep4l wrote

>The problem is the dumb cops who are bad at their job

You say that like there are cops this doesn't describe...

11

Treadcc t1_je1onpd wrote

It's like "a few good apples" continues to be the excuse in America to do nothing to improve our policing

6

No-Significance2113 t1_je17ni8 wrote

It's why the law is "innocent until proven guilty": it can be so easy to convict an innocent person that the law is supposed to be biased towards letting people go. God knows how many innocent people have been thrown in jail; it must be atrocious.

5

Artanthos t1_je32amj wrote

Cops cannot visually identify everyone. No human can.

Before this, all the cops had for identifying a picture from a camera was files full of mug shots. That presents several problems that facial recognition addresses:

  1. Human identification from mug shots is notoriously error-prone.
  2. It only works if there are mug shots or other evidence that identifies the individual.
  3. Fingerprints and DNA are only going to be available if the person was previously arrested, the same problem that mug shots present.

−2

Mikethebest78 t1_jdylyv0 wrote

And if that doesn't frighten the hell out of you, it should.

160

akurra_dev t1_jdz13nm wrote

It should frighten everyone, and yet like 30% of Americans are clamoring for this shit. It's so ironic how Republican voters claim to want small government yet excitedly vote for a police state.

79

Reidroshdy t1_jdze4by wrote

They want small government for them and the people like them. If you aren't like them, they want the government monitoring everything you do.

57

akurra_dev t1_jdzgd9h wrote

Yes, I know, but the irony is they end up getting shit on too, because their fucking dumb asses don't realize it's actually rich vs. poor lol.

21

krichuvisz t1_jdzg0mr wrote

They shout freedom but put millions in private jails. They don't live in cities but like to destroy the environment. They claim to be pro-life but destroy lives wherever they can, and so on. The whole conservative movement is a huge paradox, racing towards Armageddon.

16

MhaloTov t1_je0068q wrote

I remember writing a term paper about this kind of stuff freshman year in college. It really scares you when you look closer into it.

2

Ragnr99 t1_jdzvdz6 wrote

I’m not worried. Surveil me and my boring ass life.

−16

[deleted] t1_jdy9zir wrote

Probably digging up dirt on ex-wives.

95

canzicrans t1_je1dpwv wrote

The staff and contractors at the NSA were spying on people they knew or wanted to know; why wouldn't the police?

5

Watcher0363 t1_jdyq95z wrote

Coming to your local PD soon. I know what you did last minute.

39

ArugulaZ t1_jdyw8b8 wrote

To make up photographic evidence...?

11

kstinfo t1_jdz7djk wrote

Fact:

If we lock everyone up there will be no more crime.

6

Slinghshots t1_je09ese wrote

Question: Couldn't an AI just generate a couple million faces to throw off this facial recognition system?

4

[deleted] t1_je2dwxy wrote

Yes. AI could actually reverse the most positive aspect of the internet (freedom of information) by filling it with bullshit and giving that bullshit bulletproof SEO.

2

tallbartender t1_je39oij wrote

From the article "Clearview allows the customer to upload a photo of a face and find matches in a database of billions of images it has collected.

It then provides links to where matching images appear online. It is considered one of the most powerful and accurate facial recognition companies in the world."

For people (like me) who don't want to read the article, but just want to know what's going on.

3
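For anyone curious what "find matches in a database of billions of images" typically means mechanically, here is a minimal sketch of face search by embedding similarity. The embedding function, the database, and the URLs below are stand-ins for illustration; Clearview's actual pipeline is proprietary.

```python
import numpy as np

# Minimal sketch of face search by embedding similarity. A real system
# uses a trained face-embedding network and an approximate
# nearest-neighbor index over billions of vectors; here the
# "embeddings" are random stand-ins and the database is tiny.
rng = np.random.default_rng(0)
EMBED_DIM = 128

def embed(image) -> np.ndarray:
    # Stand-in for a face-embedding model: image -> unit vector.
    v = rng.normal(size=EMBED_DIM)
    return v / np.linalg.norm(v)

# "Scraped" database: one embedding per photo, plus its source URL.
db_vectors = np.stack([embed(None) for _ in range(1_000)])
db_urls = [f"https://example.com/photo/{i}" for i in range(1_000)]

def search(query_image, top_k=5):
    q = embed(query_image)
    scores = db_vectors @ q            # cosine similarity (unit vectors)
    best = np.argsort(scores)[::-1][:top_k]
    return [(db_urls[i], float(scores[i])) for i in best]

# A query returns links to where similar faces appear, as the
# article describes.
for url, score in search(query_image=None):
    print(f"{score:.3f}  {url}")
```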

GlocalBridge t1_je3tycn wrote

How is this different from what China is doing? Where are our privacy laws?

3

neutralityparty t1_je3megk wrote

The case should automatically be dismissed if AI is involved. AI violates the 6th Amendment.

2

DazedinDenver t1_je1d2x5 wrote

Perhaps we should all declare that our images in the Clearview database are obscene in our judgment and must be removed immediately. Hey, if Michelangelo's "David" is obscene, then certainly any image of my ugly mug must qualify. And since we're apparently now enabling the tyranny of the minority in this country we might as well take advantage of that. And no, to stifle any comments to the contrary, I do not have a dick nose or scrotum lips...

1

[deleted] t1_jdyqes3 wrote

[removed]

−98

akurra_dev t1_jdz1duu wrote

Whoever is paying you to leave dystopian bootlicker comments like this should fire you; you are so painfully obvious that it's comical.

39