On January 21, Drake performed at the Apollo Theater in Harlem, New York, where exiting attendees were shocked to find an NYPD camera pointed at their faces.
New York Times music writer Jon Caramanica tweeted Saturday night that the NYPD was filming people leaving the show at the Apollo Theater.
He included a video showing an officer recording those coming out.
The NYPD said Monday the video would be used only for a social media post, but privacy advocates online immediately questioned the intended purpose of the intrusive footage and called for the material to be deleted.
The five-second clip went viral, racking up 20 million views and counting.
In a statement, the department said the officer was part of the social media team for the local precinct and was getting video for a social media post about local events.
“The video will not be utilized for any other reason,” the department said.
Yet according to CBS News, the Surveillance Technology Oversight Project (STOP), a New York City-based group that focuses on privacy and civil rights, called the videotaping "highly concerning" because attendees were being surveilled without their consent.
It demanded that the video be destroyed and that the NYPD reveal “if it had been used for facial recognition.”
The group also reiterated its call for state and city bans on police use of facial recognition software, as well as on the technology's use at sporting events, concerts, and other public events.
In a statement to Eyewitness News, the NYPD said the officer in the blue jacket seen holding the camera is part of the 28th Precinct's social media team and was taking video for an upcoming Twitter post highlighting local community events.
According to ABC 7 New York, the 28th Precinct has been posting highlights from local events in an effort to promote a positive relationship between the community and the NYPD.
Mayor Adams backs NYPD, dismisses privacy concerns from those “sitting at home”
“When you have those that are sitting at home in the corner of the room, trying to find a reason to divide NYPD from everyday New Yorkers, then they are going to say that,” said Mayor Eric Adams. “Thumbs up to that great captain up in the 28 Precinct. I know that precinct. I know that captain. He’s very community-minded and community-centered and I commend him for doing so.”
While Mayor Adams may applaud the well-intentioned effort, critics continue to argue that arbitrary police surveillance sets a dangerous precedent.
“Experts believe that facial recognition is so uniquely dangerous, and is something more akin to nuclear or biological weapons, where it’s so profoundly harmful, it has such an enormous potential for harm to our basic human rights, [and] to people’s safety,” says Evan Greer, the director of Fight for the Future, a digital rights organization.
Some versions of the technology have been shown to be less accurate at distinguishing between people with darker complexions.
And Greer says that traditional law enforcement surveillance has historically led to the over-policing of communities of color; combining the two, Greer fears, could amplify that effect.
Facial recognition technology is criminally flawed against darker skin
“Facial recognition technology tends to misidentify people of color, and in particular, women of color,” says Hannah Bloch-Wehba, an associate law professor at Texas A&M who specializes in privacy, technology, and democratic governance. “And so I could see a serious concern about the sort of racial and gender bias implications of this kind of tech being used to screen people.”
Over the past few years, a number of Black men have been falsely identified as suspects in criminal investigations that used facial recognition software, in some cases resulting in wrongful arrests and charges.
According to NPR, facial recognition technology is currently legal in New York City, and no federal law specifically addresses it.
"You can change your name, you can change your social security number, you can change almost anything, but you can't change your face," said Albert Fox Cahn, STOP's executive director. "So if your biometric data is compromised once, it's compromised for life."