
Facial recognition software lands three innocent Black men in jail. When will enough be enough?

Brexiter

Active member
Nijeer Parks did the work. He served time in prison after being convicted on drug-related charges, but he changed his life and held down a job as a carpenter, he told CNN Business in an interview last month. His dedication, however, didn’t protect him from landing right back in jail, this time for a crime of which he was innocent. Parks, of Paterson, New Jersey, was arrested in February of 2019 on a slew of charges, from using a fake ID and possession of marijuana to shoplifting and aggravated assault, CNN Business reported. He went to a local police station with documents in hand, prepared to right what had to be some kind of error.

"Four or five minutes later as me and [the clerk] were talking, two other officers walk up and tell me to put my hands behind my back," Parks told CNN Business. "He's like, 'Put your hands behind your back. You're under arrest.'" As it turns out, facial recognition software had incorrectly led officers to Parks, and he isn’t alone.

Parks is the third known Black man to be wrongfully accused of a crime based on facial recognition software, and all three have filed lawsuits against the police departments involved, The New York Times reported in January. Michael Oliver, of Detroit, was 25 when he was wrongfully accused of stealing a teacher's cellphone from the teacher's vehicle and throwing it in May of 2019, the Detroit Free Press reported. He was charged with felony larceny after facial recognition software led police to Oliver as a suspect, and the teacher also identified Oliver in a photo lineup, the Detroit Free Press reported. Detroit police commissioner Evette Griffie questioned police officials about "proper checks and balances to make sure that what's supposed to happen is actually happening." "One mistake is too many," she said. Oliver’s case was only the second known case of its kind.

Robert Williams, of Detroit, thought he was being pranked last January when Detroit police called him at his office at an automotive supply company to inform him he would be arrested. He learned it was no joke when a police car met him at his home and officers handcuffed him in front of his wife and two daughters in the first known case of a false positive from facial recognition software, the Times reported. Officers wouldn't tell him why he was being taken in and only showed him the words "felony warrant" and "larceny" on a piece of paper, according to the newspaper.

Last June, Amazon, IBM, and Microsoft announced they would halt, or at least temporarily pause, letting law enforcement use their facial recognition technology. Amazon implemented a one-year moratorium. “We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology, and in recent days, Congress appears ready to take on this challenge,” the company said. “We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”

Although the Detroit Police Department has changed its policy to allow facial recognition software only in violent crime cases, experts argue that is not enough. There is still little federal guidance on how police should use the technology; FBI Deputy Assistant Director Kimberly Del Greco testified to Congress in 2017 that facial recognition "remains an investigative lead only" and is "not to be considered a positive identification of a subject."

"The candidates must be further reviewed by specialized face examiners and/or the relevant investigators," she said. Problem is, police officials don't even tell defense attorneys that facial recognition software (FRS) was part of their investigative work, according to a report CNN obtained by the National Association of Criminal Defense Lawyers. "The first hurdle to an effective challenge is recognizing the cases in which FRS was used,” the association wrote in the report. “Because police use FRS exclusively as an investigative tool in the face identification context, the state might not disclose its use to the defense.”

Clare Garvie, a lawyer at Georgetown University’s Center on Privacy and Technology, wrote for the university in May of 2019 that "there are no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads," and that police feed grainy, low-quality surveillance camera photos into those algorithms even though such images produce far less reliable matches. “The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs,” Garvie said.
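
To see why Garvie calls low-quality photos unreliable inputs, here is a rough sketch, again assuming the face_recognition library plus Pillow, that simulates a grainy surveillance still by downscaling a probe photo before matching it. The file names and the scale factor are made up for illustration, and the actual numbers depend entirely on the images and model used.

```python
# Rough sketch of how degrading a probe image can change a face match,
# assuming face_recognition and Pillow. File names and the scale factor
# are illustrative assumptions; real results vary with the images used.
import face_recognition
import numpy as np
from PIL import Image

def encoding_from(pil_image):
    """Return the first 128-d face encoding found in a PIL image, or None."""
    encodings = face_recognition.face_encodings(np.array(pil_image.convert("RGB")))
    return encodings[0] if encodings else None

reference = encoding_from(Image.open("known_person.jpg"))
probe_full = Image.open("probe.jpg")

# Simulate grainy surveillance footage by aggressively downscaling, then
# upscaling, the probe image before encoding it.
w, h = probe_full.size
degraded = probe_full.resize((w // 8, h // 8)).resize((w, h))

for label, img in [("original probe", probe_full), ("degraded probe", degraded)]:
    enc = encoding_from(img)
    if reference is None or enc is None:
        print(f"{label}: no face detected")
        continue
    dist = face_recognition.face_distance([reference], enc)[0]
    print(f"{label}: distance to reference = {dist:.3f}")
# In practice the degraded image tends to yield a noisier, less trustworthy
# distance, which is the substance of Garvie's warning about bad inputs.
```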

Of Williams’ arrest she said: “I strongly suspect this is not the first case to misidentify someone to arrest them for a crime they didn’t commit. This is just the first time we know about it.”