The Christian Science Monitor

Modern field guide to security and privacy

Opinion: The ugliest side of facial recognition technology

The emergence of technologies that falsely promise to predict someone's behavior based on their facial features and expressions is a deeply troubling development.

A surveillance camera in front of a poster in London. (Toby Melville/Reuters/File)

It's no mystery that big data presents a challenge to privacy. But perhaps more alarming is the emergence of technology that combines facial recognition and data analytics to create a powerful surveillance tool.

It's a disturbing development that combines the most worrisome aspects of algorithmic and big data technology with the chilling and dangerous threats inherent in facial recognition.

A company is advertising its "predictive video" to anticipate behavior "based on the emotional state and personality style of any person in a video." In Russia, the app FindFace gives users "the power to identify total strangers on the street."

It's not just the tech fringe, either. Google's new messaging app has a "smart reply" feature that apparently analyzes photos from contacts and offers suggested responses to them.

But most troubling is the Israeli startup Faception. It offers a product that combines machine learning with facial recognition to "identify everything from great poker players to extroverts, pedophiles, geniuses, and white collar criminals." A Department of Homeland Security contractor has hired the firm to "help identify terrorists."

That's a problem. The government should not use people's faces as a way of tagging them with life-altering labels. The technology isn't even accurate. Faception's own figure for certain traits is a 20 percent error rate. Even if those optimistic numbers hold, that means for every 100 people, the best-case scenario is that 20 get wrongly branded as a terrorist.
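To make that arithmetic concrete, here is a minimal sketch in Python. Only the 20 percent error rate comes from the figure cited above; the screening population and the number of genuine threats are hypothetical numbers chosen purely for illustration, not a model of any actual system.

```python
# Illustrative arithmetic only: the error rate is the best-case figure cited
# above; the population size and number of real threats are assumptions.
population = 100_000       # hypothetical number of people screened
error_rate = 0.20          # best-case error rate cited above
actual_threats = 10        # hypothetical: genuine threats are extremely rare

innocent = population - actual_threats
false_positives = innocent * error_rate   # innocent people wrongly flagged

print(f"Wrongly flagged: {false_positives:,.0f} of {population:,} screened")
# With these assumptions, roughly 20,000 innocent people get branded as
# threats in order to catch, at best, a handful of real ones.
```

The particular numbers are beside the point; the shape of the problem is that even a "good" error rate, applied to a large population of ordinary people, produces an enormous pool of the falsely accused.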

And yet, according to the company, powerful profiling is possible due to two alleged facts: personalities are "affected by genes" and our faces are a "reflection of our DNA."

The first premise doesn't inspire confidence. It presumes nature affects our personalities more than nurture, a conclusion that experts constantly debate. For the sake of argument, though, let's say this is true. Even then, we're not dealing with a robust causal claim. Saying that personalities are "affected" by genes is a much weaker assertion than maintaining that our genes determine them.

As to the face being a reflection of DNA, the folks at Faception admit that the evidence supporting that claim comes from animal studies, not psychological inquiry into human beings. Conveniently, they dismiss the differences by accepting an unnamed source's claim that "the human face was likely to develop in the same way."

These assumptions completely ignore many established psychological theories, such as "situationism," the idea that environmental features can dispose any of us to behave in new ways, including ways that can lead a person who habitually does good things to commit evil and atrocious acts.

Physical images certainly have some revelatory power. A snapshot of body language, for instance, can reveal confidence or nervousness. But it would be a serious mistake to view the face alone as a portal into deep character traits and future behavior.

Advocates of this kind of data analysis might argue that algorithms will get better as the data science advances and computers can make more decisions more quickly. But dubious inferences are already plaguing big data. Economist Ronald Coase famously quipped years ago that if you torture the data long enough, it will confess anything.
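Coase's quip is easy to demonstrate with a tiny simulation, sketched below under assumed parameters: correlate completely random "facial measurements" with a completely random label, and with enough features some will look statistically significant purely by chance.

```python
# Minimal simulation: pure noise still yields "significant" predictors.
# Sample size, feature count, and the 0.05 threshold are illustrative choices.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_people, n_features = 500, 200

labels = rng.integers(0, 2, size=n_people)            # random "risk" label
features = rng.normal(size=(n_people, n_features))    # random "measurements"

spurious = sum(
    1 for i in range(n_features)
    if pearsonr(features[:, i], labels)[1] < 0.05      # p-value below 0.05
)
print(f"'Predictive' features found in pure noise: {spurious} of {n_features}")
# Expect roughly 5 percent of the features (about 10 here) to "confess"
# a relationship that does not exist.
```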

But, unfortunately, people tend to place far too much confidence in anything a computer spits out. This phenomenon, known as automation bias, looms large with the implementation of predictive facial recognition technologies. A prime example of this was documented in a recent investigation that exposed racial bias embedded in predictive criminal risk assessment software.

Even if technology can guess correctly, do we really want to live in a society in which machines try to suss out deep truths based on our facial features? If that were the case, our "faceprints" would serve as beacons for unwanted attention, threatening our obscurity, the idea that when information about us is hard to find, it's safer. And you can't ditch the beacon unless you want to wear a mask in public for the rest of your life.

Our faces are indeed exceptional. But predictive facial recognition technology and companies like Faception exacerbate the most dangerous aspects of both big data and facial recognition.

Evan Selinger is a professor of philosophy at the Rochester Institute of Technology.

Woodrow Hartzog is the W. Stancil Starnes Professor of Law at Samford University's Cumberland School of Law.
