Lamya Robinson was misidentified by facial recognition tech (Screengrab/Fox 2 Detroit)

A Black teenager near Detroit was removed from a skating rink after facial recognition software wrongly identified her. Lamya Robinson was dropped off for an afternoon with friends when she was scanned by skating rink employees and informed she could not enter the facility.

The teen had been misidentified as someone who had gotten into a fight at the rink earlier in the year and had been banned from the property. Lamya Robinson, however, had never been to the skating rink before.

Now her family is considering legal action against the Riverside Arena Skating Rink, located in Livonia, Michigan. “To me, it’s basically racial profiling,” said Lamya’s mother, Juliea Robinson. “You’re just saying every young Black, brown girl with glasses fits the profile and that’s not right.”

Facial recognition software inaccurate

This is not the first time facial recognition software has wrongly identified a Black person or other person of Color as a troublemaker, or even a criminal. The largely unregulated technology, often employed by law enforcement agencies, is significantly less accurate at distinguishing the features of people of Color, creating serious — and potentially deadly — problems when used on Black and Brown faces.

Facial recognition software has erroneously identified Oprah Winfrey as male, and falsely flagged an East Coast university student as someone who took part in a bombing in Sri Lanka, which led to the student receiving death threats.

Facial recognition technology is particularly dangerous when used against Black women. An MIT study of three commercial gender-recognition systems found they had error rates of up to 34% for dark-skinned women — a rate nearly 49 times that for White men.

Getting it wrong can be deadly for Black people

Meanwhile, the Robinson family wants answers. “You all put my daughter out of the establishment by herself, not knowing what could have happened,” Lamya’s father, Derrick Robinson, said of the skating rink.

While the Riverside Arena Skating Rink apologized for the error, it also wrongly claimed that facial recognition software is 97% accurate in detecting facial expressions and features. The actual rate of inaccuracies is so high, in fact, that major technology companies such as IBM, Amazon, and Microsoft have paused or ceased selling facial recognition programs.

Those mistakes in facial detection software often prove deadly for Black people in the United States. According to the American Civil Liberties Union (ACLU), when law enforcement uses facial recognition, such an error in detection can lead to wrongful detainment, arrest, and even police brutality.

ACLU seeks ban on software

The ACLU is calling for a ban on facial recognition software in law enforcement. “If government agencies like police departments and the FBI are authorized to deploy invasive face surveillance technologies against our communities, these technologies will unquestionably be used to target Black and Brown people merely for existing.”

Such is the experience of young Lamya Robinson, whose afternoon of fun ended with being kicked out of the Riverside Arena Skating Rink. When told that she matched the profile of someone who had gotten into a fight at the facility, she said, “I was like, that is not me. Who is that?” The teen continued, “I was so confused because I’ve never even been there.”

To learn more about the use of facial recognition systems against Black people, including protests against outlets that use facial detection programs, go to Data 4 Black Lives.

Erika Stone is a graduate student in the Master of Social Work program at the University of Oklahoma, and a graduate assistant at Schusterman Library. A Chess Memorial Scholar, she has a B.A. in Psychology...