The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
Amazon, the company synonymous with online shopping, is supplying facial recognition technology to government and law enforcement agencies over its web services platform. Branded Rekognition, the technology is every bit as dystopian as it sounds.
Co-authored with Evan Selinger.
Until recently, concerns over facial recognition technologies were largely theoretical. Only a few companies could create databases of names and faces large enough to identify significant portions of the population by sight. These companies had little motivation to widely exploit this technology in invasive ways.
Perceiving whether someone is sad, happy, or angry by the way they turn up their nose or knit their brow comes naturally to humans. Most of us are good at reading faces. Really good, it turns out.
So what happens when computers catch up to us? Recent advances in facial recognition technology could give anyone sporting a future iteration of Google Glass the ability to detect inconsistencies between what someone says (in words) and what that person says (with a facial expression). Technology is surpassing our ability to discern such nuances.
Axon’s widespread reach in police tech means that if the company decides to implement face recognition, it could have sweeping, rapid effects. “If and when they want to do it, that could happen quite fast,” cautions Harlan Yu, executive director of Upturn. “It would really be a software update away to add that capability.”
Jennifer King, director of consumer privacy at Stanford Law School’s Center for Internet and Society, said false identification is among her biggest concerns. She likened it to the use of license plate readers that aim to catch people breaking traffic laws but also identify the wrong cars, making it difficult for the innocent to appeal.
Perhaps, or perhaps not, said Woodrow Hartzog, who teaches law and computer science at Northeastern University. "The idea that this is simply neutral technology that can be used for good or evil and Amazon shouldn't be responsible, I think is purely wrong," he said.
"It's not unreasonable to say if you build a product that is capable of harm, then you should be responsible for the design choices you make for enabling the harm," he said, "and when you release it out into the world, you're doing so in a safe and sustainable way."
“This is an example of the growing trend of authoritarian use of technology to track and stalk immigrant communities,” said Malkia Cyril, the executive director of the Center for Media Justice. “It’s absolutely a violation of our democratic rights, and we are definitely going to fight back.”
“We have it being used in unaccountable ways and with no regulation,” said Malkia Cyril, executive director of the Center for Media Justice, a nonprofit civil rights organization that signed the A.C.L.U.’s letter to Amazon.
Harlan Yu, executive director of Upturn, which monitors police agencies' body camera policies and is one of the letter's signatories, said the groups felt it important to "draw a bright ethical line" around real-time facial recognition body cameras. He also said the groups were troubled by the lack of representation on the ethics board of communities, particularly those made up of racial minorities, that are subject to intense police scrutiny.
"Just because real-time face recognition might be technologically feasible to do doesn’t mean they should," Yu said.
Peter Asaro, a philosopher of technology at the New School in New York, said it may be best to see how authorities use the tools before getting too specific with regulations.
“I think we need transparency and improved privacy regulations now, and will likely need more as new applications emerge,” Asaro said.