The Brexit And Political discussion Forum


'Dangerous snake oil': DNA phenotyping in Canada threatens to end in even more racial profiling

Yet another sketch of a Black man identified by a police department as a suspect in a violent crime went viral. This time, it wasn’t even a sketch based on a witness’ memory. In the horrific sexual assault case being investigated by the Edmonton Police Service in Canada, the victim was said to have gone unconscious in the 2019 attack.

The police force therefore relied on DNA phenotyping, a process that attempts to predict what someone could look like from their genetic material. But as geneticist Adam Rutherford and countless other commenters pointed out on social media, there's a fundamental flaw in using the technology this way.

“You can’t make facial profiles or accurate pigmentation predictions from DNA, and this is dangerous snake oil,” Rutherford said in a tweet on Wednesday. His words were shared more than 11,000 times, and by Thursday morning, Edmonton Police Service Chief Operating Officer Enyinnah Okere was reversing the decision.

RELATED STORY: Facial recognition software lands three innocent Black men in jail. When will enough be enough?

Read Okere’s full statement:

“My name is Enyinnah Okere and as Chief Operating Officer for the Community Safety and Well-being Bureau of EPS, I am responsible for overseeing our sexual assault section - it was my team that put out a release two days ago about the unsolved sexual assault of a young woman in 2019.

This was a horrific sexual assault, one that very nearly caused the death of the young woman who was left unconscious and almost fully unclothed on a minus 27-degree morning in March.

It is the type of case from which a victim may never fully recover, made worse by the fact that after two years, she has not received justice. The violent nature of the assault, the fact that the victim lost consciousness, and that the suspect was wearing bulky winter clothes and a face mask, meant that we had very little to work with - only that the suspect was Black and about 5'4 with a black toque, pants, and sweater or hoodie and that he had an accent. In addition, we had no witnesses, no tips, no CCTV and, after two years, no leads.

To move this stalled case forward, our team members sought the advice of colleagues in other jurisdictions who had previously used DNA phenotyping and saw potential for it here. They commissioned a profile which we released on Tuesday.

I have nothing but respect for my team for being willing to try every conceivable tactic to bring this case to a satisfactory conclusion - the victim deserves nothing less. I want to thank our people who do everything they can to pursue this work with rigor. They are relentless and I will not ask them to be anything less.

But we were not and are not oblivious to the legitimate questions raised about the suitability of this type of technology. The potential that a visual profile can provide far too broad a characterization from within a racialized community and in this case, Edmonton's Black community, was not something I adequately considered. There is an important need to balance the potential investigative value of a practice with the all too real risks and unintended consequences to marginalized communities.

In our release, we did try to qualify the benefits and limits of the technique we used here. We felt we were clear on its limit. We indicated we saw it as a last resort. And we thank the media who attended our briefing for producing careful and balanced stories that similarly noted the intent of this work and the very fair criticisms that need to be considered.

Any time we use a new technology - especially one that does raise concerns about profiling of a marginalized group - we cannot be careful enough in how we validate these efforts and fully, transparently consider the risks.

We have heard legitimate external criticism and we have done our own gut checks internally to determine whether we got the balance right – and, as a leader, I don't think I did.

While the tension I felt over this was very real, I prioritized the investigation – which in this case involved the pursuit of justice for the victim, herself a member of a racialized community, over the potential harm to the Black community. This was not an acceptable trade-off and I apologize for this.

For this reason, EPS will be taking the following steps today.

We are going to remove the visuals provided with this release from our web site and will remove our social media images altogether, effective this morning.

We will be reviewing our internal processes to better ensure the appropriate, robust and stress-tested tools are in place to better inform our decisions on such matters going forward.

And we will continue to prioritize and explore every conceivable and appropriate means to find justice for the victim in this case - she deserves our continued efforts and focus, and we will not give up on our efforts for her.”

I don’t say this often enough about police officials who are doing their jobs honorably, or even those who are trying and make mistakes along the way, but I respect Okere’s explanation. He admitted fault, explained the steps he would take to rectify the situation, and reaffirmed his commitment to achieving justice for the victim. When he realized the error of his ways—and social media users made it pretty difficult for him not to—he didn’t continue along a path that could have produced more victims than justice.

There is a lesson in that which police departments in America could stand to learn.

Nijeer Parks, a Black man who had already served time in prison, was arrested again in 2019 after facial recognition software wrongly identified him as a match for multiple crimes, from marijuana possession to shoplifting and aggravated assault, in Paterson, New Jersey. The state of New Jersey has yet to put restrictions in place to protect its citizens from this technology; the New Jersey Attorney General sought public feedback on the issue in March.

In response, the ACLU of New Jersey called for a total ban on the use of the technology by law enforcement. "Facial recognition technology, like many kinds of artificial intelligence or automated decision systems used by the government, can worsen racial inequity, limit our civil rights and liberties – like the right to privacy and freedom of speech – and deprive people of fundamental fairness," the organization said in a news release. "If we want to combat those harms in the Garden State, facial recognition cannot escape our scrutiny, especially as New Jersey law enforcement continues to use the technology despite its dangers."

Black Lives Matter Paterson released a similar statement on its Facebook page:

“Numerous studies point to the inherent biases programmed into technologies, where people of color have significantly higher chances of being misidentified and subsequently criminalized. More so, without proper forms of accountability, expanding the surveillance and policing apparatus in our communities will only lead to a broader misuse of power.”

House Democrats Ted Lieu, Sheila Jackson Lee, Yvette Clarke, and Jimmy Gomez proposed a facial recognition bill to "place strong limits and prohibitions on law enforcement use of facial recognition technology."

Jackson Lee said in a news release about the proposed legislation:

“Facial recognition technology must not be used as an invasive, intrusive surveillance tool because, if unrestrained, this powerful technology can be misused for racial profiling, infringing on personal privacy, and vilifying people who exercise their Constitutionally protected rights, such as the right to participate in peaceful protests.”