Facial Recognition Software Prompts Privacy, Racism Concerns in Cities and States

Fabian Rogers was none too pleased when the landlord of his rent-stabilized Brooklyn high-rise announced plans to swap out key fobs for a facial recognition system.


He had so many questions: What would happen if he didn’t comply? Would he be evicted? And as a young black man, he worried that his biometric data would end up in a police lineup without him ever being arrested. Most of the building’s tenants are people of color, he said, and they are already concerned about overpolicing in their New York neighborhood.


“There’s a lot of scariness that comes with this,” said Rogers, 24, who along with other tenants is trying to legally block his management company from installing the technology.


“You feel like a guinea pig,” Rogers said. “A test subject for this technology.” 


Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.


Law enforcement officials say facial recognition software can be an effective crime-fighting tool, and some landlords say it could enhance security in their buildings. But civil liberties activists worry that vulnerable populations, such as residents of public housing or rent-stabilized apartments, are at risk of law enforcement overreach.


“This is a very dangerous technology,” said Neema Singh Guliani, senior legislative counsel for the American Civil Liberties Union. “Facial recognition is different from other technologies. You can identify someone from afar. They may never know. And you can do it on a massive scale.”


The earliest forms of facial recognition technology originated in the 1990s, and local law enforcement began using it in 2009. Today, its use has expanded to companies such as Facebook and Apple.


Such software uses biometric …
