It was in the 1960 film Inherit the Wind that someone said, “Mister, you may conquer the air but the birds will lose their wonder and the clouds will smell of gasoline.” Though scripted some 60 years ago, those words are powerfully relevant today. How are we to temper the expanding world of technology in order to maintain the dignity of nature? Or more pessimistically, what of nature are we willing to sacrifice in order to maintain our current pace of technological advancement?
While I’m sure the screenwriters of Inherit the Wind were not thinking about artificial intelligence when they wrote that particular line, it came to mind when the Madison city council recently discussed a proposed ban on the purchase and use of facial recognition software.
In a decisive 17-2 vote Dec. 1, the council approved an ordinance that prohibits city agencies, their departments and divisions, from acquiring and using facial recognition technology or any information pulled from facial surveillance systems. The ban does include exceptions, allowing the use of the technology to identify and locate victims of human trafficking or children being sexually exploited or who are missing. Those exemptions were added during council debate after the Madison Police Department raised concerns about the ordinance. Interim Chief Vic Wahl explained that it might halt cases already in progress: “The main circumstance where we use it is dealing with child victims. Whether it’s human trafficking investigations or child pornography investigations.”
The ordinance affects the police department the most, and it perhaps paves the way for more legislation that seeks to mediate between technological advancement and effective governance.
So why has the ban come about now? There is an increased awareness around the country about how artificial intelligence can affect policing.
Some academics and politicians argue that surveillance through tech is galvanizing an even more refined and dangerous discrimination. And there seem to be few rules governing its use: The Center on Privacy and Technology at Georgetown Law published a comprehensive survey that found that “law enforcement face recognition is unregulated and in many instances out of control.”
Police use facial recognition to cross-reference suspects’ photos against mugshots and driver’s license photos. And as of 2016, nearly half of American adults had photos in a facial recognition database used by law enforcement, without their prior consent or knowledge. For scale, that is the biometric data of over 117 million Americans being used by law enforcement with little to no governmental oversight — and that is terrifying.
Admittedly, the debate around privacy is a nuanced one, and in today’s society it might seem worthwhile to give up some personal freedoms for the sake of the greater good. “If I’ve done nothing wrong, I shouldn’t be worried that my picture is being run through an algorithm to bring justice to someone else” is a common argument.
However, the inherent problem with facial recognition software is the unreliability of the algorithms that power it. Debates on privacy aside, facial recognition technology carries significant racial bias, documented over the years in an extensive and growing body of scholarship.
Proponents argue that the algorithms are more than 90 percent accurate; however, research shows divergent error rates across demographic groups. The National Institute of Standards and Technology, testing 189 algorithms, found that facial recognition technology is least accurate on women of color. Additionally, research conducted at MIT and Microsoft found that the technology performed worst on darker-skinned women, with error rates up to 34 percentage points higher than for lighter-skinned men across three different commercial algorithms. In general, error rates are higher for women, Black people and those 18-30 years old.
For Ald. Max Prestigiacomo, a co-sponsor of the ordinance, these findings were reason enough to vote for the ban. “In a time when we are discussing the role of policing and its intersection with racial bias, it makes no sense to further entrench our law enforcement systems in processes that further racial discrimination,” he told me over the phone.
And he is right; the differing error rates across demographic groups mean there is a higher risk of misidentifying people of color for crimes they did not commit. And in a world where we have seen Black people criminalized at disproportionate rates, we cannot run the risk of perpetuating racial bias in law enforcement through technology. Human error is one thing; for it to be compounded by a controversial technology is another.
There are benefits to facial recognition technology, as the exceptions to the ban show. However, it must not be used to disenfranchise those who are already overly targeted by law enforcement. Other cities have recently passed similar bans, including Portland, Boston, Oakland and San Francisco. Until Silicon Valley can develop recognition technology that is not inherently discriminatory, municipalities across the country should follow their lead and prohibit the use of facial recognition technology by their city agencies.
Nada Elmikashfi recently ran for state Senate and is chief of staff to state Rep.-elect Francesca Hong.