Facial Recognition: The Technological Innovation Turned Racial Profiling Weapon

JHTL’s mission, as written in our constitution, is to promote the education, research, and publication of ideas and issues regarding technology. Given the inextricable link between technology and social justice, it is beyond time that we use our status as an honor board to provide a platform for discussion of the inequalities that remain persistent in our communities. This Black Lives Matter blog series seeks to highlight the deliberate disparate treatment of Black people and to provide a space for discourse surrounding technology and social justice issues.

By Michaela Hinckley-Gordon, Content Editor

Facial recognition technology is especially vulnerable to the negative effects of implicit bias because it relies on biased information collected by humans to identify an individual by matching the individual’s facial characteristics to an image. Humans’ incapacity to perceive the world neutrally, coupled with racial stereotypes, creates the possibility that the identification matches produced by facial recognition technology are erroneous. That possibility became reality for Robert Williams, a Black man who was wrongfully arrested because of a false face recognition match.[1]

In January 2020, Williams was arrested in front of his wife and two young daughters at his home in Farmington Hills, Michigan. Williams was transferred to the Detroit Detention Center, where he was patted down approximately seven times and forced to spend 18 hours in a dirty cell. The following morning, two police officers showed him a blurry surveillance camera photo of a Black man and asked if it was him. When Williams denied that the man in the photo was him, he heard one of the officers say “the computer must have gotten it wrong.”[2] Hours later, Williams was released without any explanation of the process that had led to his identification as the suspect.

Although Williams’s case is the first reported incident of someone being wrongfully arrested based on a false positive face recognition match, such an occurrence was foreseeable. A study by the National Institute of Standards and Technology evaluated the accuracy of face recognition algorithms across demographic groups and found that the false match rate of various facial recognition software programs is much higher for people of color.[3] The study found that false positive rates were significantly higher for Asian, African American, Native American, American Indian, Alaskan Indian, and Pacific Islander populations than for Caucasians.[4] More specifically, false positives occur at an alarmingly high frequency when matching images of Black women.[5]

The potential harms of false positive matches, as shown in Robert Williams’s case, include infringement on an individual’s privacy, civil rights, and civil liberties. False facial recognition matches could lead to additional questioning by law enforcement, surveillance, errors in benefit adjudication, or loss of liberty.[6] The urgency of recognizing the harms of higher false positive rates for people of color becomes evident when considered alongside the fact that Black people in the United States are not only arrested at much higher rates, but also killed by police at more than twice the rate of white people.[7] Several tech companies have recognized the harms of law enforcement relying on facial recognition technology.

In June 2020, IBM, Amazon, and Microsoft all announced that they would stop selling facial recognition technology to law enforcement.[8] Now it is time for the law to catch up. A group of senators recently introduced a bill to ban law enforcement’s use of facial recognition technology. If enacted, the Facial Recognition and Biometric Technology Moratorium Act of 2020 (“Facial Recognition Act”) would prohibit biometric surveillance by the Federal Government and withhold certain federal public safety grants from state and local governments that conduct biometric surveillance.[9] The Facial Recognition Act defines biometric surveillance as “any computer software that performs facial recognition or other remote biometric recognition in real time or on a recording or photograph.”[10]

The Facial Recognition Act should be enacted in order to prevent the further proliferation of racial profiling by law enforcement. Implicit racial bias within the justice system is a systemic result of institutional racism upheld by centuries of criminalizing people of color. Facial recognition technology should not be used in police investigations because the racial biases woven into the technology disparately impact people of color. To achieve true justice, we must do more to regulate our own implicit bias and ensure that technological innovations do not become policing tools used to further harm people of color.

 

[1] See Man Wrongfully Arrested Because Face Recognition Can’t Tell Black People Apart, ACLU (June 24, 2020), archived at https://perma.cc/GUF5-RQUM (explaining that Williams was arrested on suspicion of stealing watches from a store in Detroit, Michigan). Detroit police sent an image of the suspect captured by the shop’s surveillance camera to the Michigan State Police, which ran the image through its database of driver’s license photos. Id. The image was analyzed by facial recognition software that the department had purchased from a private data brokerage company, and the software falsely identified Robert Williams as the suspect pictured in the photo. Id.

[2] See Robert Williams, I Was Wrongfully Arrested Because of Facial Recognition. Why Are Police Allowed to Use It?, The Washington Post (June 24, 2020), archived at https://perma.cc/WYD7-S9JQ (detailing Williams’s account of his wrongful arrest).

[3] See Facial Recognition Technology (FRT), Nat’l Institute of Standards & Tech. (Feb. 6, 2020), archived at https://perma.cc/8YAL-V3WC (reporting the accuracy of face matching technology “for demographic groups defined by sex, age, and race or country of birth”).

[4] See id. (noting that “false positives rates often vary by factors of 10 to beyond 100 times”).

[5] See id. See also Kade Crockford, How is Face Recognition Surveillance Technology Racist?, ACLU (June 16, 2020), archived at https://perma.cc/5NDV-GBM7 (noting that facial recognition algorithms misclassified Black women 35% of the time).

[6] See id. (listing the potential harms resulting from false positive facial recognition matches).

[7] See Malkia Devich-Cyril, Defund Facial Recognition, The Atlantic (July 5, 2020), archived at https://perma.cc/PYY5-CXR8 (highlighting the severity of police brutality against people of color). See also Kade Crockford, How is Face Recognition Surveillance Technology Racist?, ACLU (June 16, 2020), archived at https://perma.cc/5NDV-GBM7 (emphasizing that Black people already face a higher rate of arrest, which will likely be exacerbated by false positive facial recognition matches).

[8] See id. (stressing that facial recognition technology is a tool which drives police brutality by enforcing racial stereotypes of Black criminality).

[9] See Facial Recognition and Biometric Technology Moratorium Act of 2020, S. 4084, 116th Cong. (2020) (introducing the primary goal of the Facial Recognition Act).

[10] See id. at § 3 (defining key terms of the Facial Recognition Act).

[A]n automated or semi-automated process that — (A) assists in identifying an individual, capturing information about an individual, or otherwise generating or assisting in generating surveillance information about an individual based on the physical characteristics of the individual’s face; or (B) logs characteristics of an individual’s face, head, or body to infer emotion, associations, activities, or the location of an individual.

Id. at § 3.

 

Student Bio: Michaela is a third-year student at Suffolk University Law School and a Content Editor of the Journal of High Technology Law. She is also a student attorney for the Suffolk Law Human Rights and Indigenous People’s Rights Clinic and the Vice President of the Suffolk Student Peace and Reconciliation Coalition.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.

 

 
