It's a disturbing development that combines the most worrisome aspects of algorithmic and big data technology with the chilling and dangerous threats inherent in facial recognition.
A Chicago tech company is advertising its "predictive video" to anticipate behavior "based on the emotional state and personality style of any person in a video." In Russia, the app FindFace gives users "the power to identify total strangers on the street," according to The Atlantic.
It's not just the tech fringe, either. Google's new chat app Allo has a "smart reply" feature that apparently analyzes photos from contacts and offers suggested responses to them.
But most troubling is the Israeli startup Faception. It offers a product that combines machine learning with facial recognition to "identify everything from great poker players to extroverts, pedophiles, geniuses, and white collar criminals." A Department of Homeland Security contractor has hired the firm to "help identify terrorists."
That's a problem. The government should not use people's faces as a way of tagging them with life-altering labels. The technology isn't even accurate: Faception's own estimate for certain traits is a 20 percent error rate. Even if that optimistic figure holds, it means that for every 100 people screened, the best-case scenario is that 20 get wrongly branded as terrorists.
Read the full piece at The Christian Science Monitor.