Facial Recognition: The Worst AI Has To Offer
Technology Policy Brief #107 | By: Mindy Spatt | January 24, 2024
Photo taken from: www.amnesty.org
__________________________________
Facial Recognition Technology uses software to evaluate similarities between face images. Computer-generated filters transform images into numbers and symbols that can be compared. It is what enables you to sign into your phone by just looking at it, and it is also used to identify an individual in a database of photos or to determine the similarity between two facial images.
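The comparison step described above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual method: it assumes a face has already been converted into a short vector of numbers (an "embedding"), and compares two such vectors with cosine similarity. All the numbers, names, and the threshold below are illustrative assumptions; real systems use embeddings with hundreds of dimensions produced by a neural network.

```python
import math

def cosine_similarity(a, b):
    # Score how alike two face templates (numeric vectors) are,
    # based on the angle between them: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings, hand-written for illustration only.
enrolled = [0.9, 0.1, 0.4]     # template stored when the owner enrolls
probe = [0.88, 0.12, 0.41]     # template captured at unlock time
stranger = [0.1, 0.9, 0.2]     # template from a different face

THRESHOLD = 0.95  # assumed decision cutoff; real thresholds are tuned

print(cosine_similarity(enrolled, probe) > THRESHOLD)     # owner matches
print(cosine_similarity(enrolled, stranger) > THRESHOLD)  # stranger rejected
```

The choice of threshold is exactly where the bias problems discussed below enter: if the embeddings for some demographic groups are less distinct (for example, due to underrepresentation in training data), the same cutoff produces more false matches for those groups.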
There’s no question that Facial Recognition Technology reflects and even enhances racial bias. Still, Vermont is the only state in the U.S. that has instituted a near total ban on it and only a handful of states limit its use. The technology’s earliest and most criticized applications have been in law enforcement, but other uses may be in the offing. The General Services Administration is carrying out a study to test potential racial bias in facial recognition technology systems that it is considering using as identifiers for accessing federal benefits.
Policy Analysis
The GSA’s announcement, in August 2023, describes current methods of facial recognition identity verification as “often inequitable.” These inequities have been documented by numerous advocates, academics, and critics. In an article in Scientific American titled ‘Police Facial Recognition Technology Can’t Tell Black People Apart,’ published in May of the same year, authors Thaddeus and Natasha Johnson said their research confirmed that inequities in policing can actually be exacerbated by the use of facial recognition, and concluded that law enforcement agencies that deploy automated facial recognition technologies over-arrest Black people. “We believe this results from factors that include the lack of Black faces in the algorithms’ training data sets, a belief that these programs are infallible and a tendency of officers’ own biases to magnify these issues.” The criticisms aren’t new, but the authors note that despite efforts to reduce the bias in facial recognition algorithms, the technology still mostly fails at identifying anyone other than white men.
Reading this, one might wonder: why bother? Shouldn’t the GSA and law enforcement be seeking technologies that promote rather than deny equity? How useful will it be in other fields, like healthcare? The Johnsons’ research indicates that a bigger data set alone won’t cure the problem, since the ways facial recognition interacts with other racist policies and biases are the problem.
Amnesty International has criticized the way geographic racism and facial recognition technology interact, calling the technology “digital stop and frisk”. In a study of New York City, Amnesty found that the boroughs that were subject to high rates of discriminatory stop-and-frisks by law enforcement are the same areas where facial recognition is being most heavily deployed. Amnesty also correlated higher proportions of people of color with higher numbers of CCTV cameras with facial recognition components.
Gideon Christian, PhD, is a Canadian researcher who studies the racial and gender impacts of facial recognition technology. According to Christian, “There is this false notion that technology unlike humans is not biased.” In fact, he said, “technology has been shown (to) have the capacity to replicate human bias.” Christian, whose research is funded by the Canadian government, found wildly disparate rates of facial recognition accuracy between white men (99%) and Black women (35%).
While states have been slow to take action, many cities have limited or banned the use of facial recognition by law enforcement, including Portland (in both Maine and Oregon), San Francisco, and Boston; New York City has yet to do so. Amnesty’s Matt Mahmoudi, an Artificial Intelligence and Human Rights Researcher, concluded that “Banning facial recognition for mass surveillance is a much-needed first step towards dismantling racist policing,” and Amnesty is urging New York to join the ban bandwagon.
Engagement Resources
- How Black Americans view the use of face recognition technology by police, By Emily A. Vogels and Andrew Perrin.
- Ban Facial Recognition, an interactive map that shows where facial recognition surveillance is happening, where it’s spreading to next, and where there are local and state efforts to rein it in.
- Petition: Halt Dangerous Face Recognition Technologies.
Check out usrenewnews.org/AI for more news on AI technologies and trends. Get the latest updates from our reporters by subscribing to the U.S. Resist Democracy Weekly Newsletter, and please consider contributing to Keeping Democracy Alive by donating today! We depend on support from readers like you to aid in protecting fearless independent journalism.