Obscuring facial recognition

Everyone wants to know how to break a face recognition system. Common guesses include sunglasses, a change in facial hair, or a large hat, but there is a much simpler way to defeat most systems. Accuracy in biometrics is a complicated subject involving conditional probability and some advanced math; if you want to know more about that, our CTO wrote a blog article on the topic.
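The conditional-probability point deserves a concrete illustration. The sketch below applies Bayes' rule to show why a face recognition system's raw accuracy figures can be misleading when the base rate is low; all the numbers (false-match rate, true-match rate, and the prior) are illustrative assumptions, not figures from any real deployment.

```python
# Illustrative sketch: why raw accuracy is misleading in biometrics.
# Assumed numbers: a watchlist system with a 99% true-match rate and a
# 0.1% false-match rate, scanning a crowd where only 1 in 10,000 people
# is actually on the watchlist.

def posterior_match_probability(prior, true_match_rate, false_match_rate):
    """P(person is on the watchlist | system flags a match), via Bayes' rule."""
    p_flag = true_match_rate * prior + false_match_rate * (1 - prior)
    return true_match_rate * prior / p_flag

prob = posterior_match_probability(
    prior=1 / 10_000,        # base rate of watchlisted people in the crowd
    true_match_rate=0.99,    # sensitivity (assumed)
    false_match_rate=0.001,  # false-match rate (assumed)
)
print(f"{prob:.1%}")  # prints "9.0%": most flags are false alarms
```

Even with seemingly excellent per-comparison accuracy, roughly nine out of ten flagged matches in this scenario are false alarms, which is why biometric accuracy claims have to be read against the base rate they assume.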

DHS biometric privacy test of face-obscuring AI is more of a pop quiz | Biometric Update

Mass surveillance has become a part of life. And one of the most common ways we are surveilled is through facial recognition surveillance. Once a passive, often unnoticed intrusion of our privacy, facial recognition surveillance is increasingly under focus from fundamental and digital rights groups and the general public. While legislation against facial recognition surveillance has been slow in coming, action has been taken by rights groups, citizens, and even companies.

How to Thwart Facial Recognition

But last spring, I found myself wandering around D.C. It was a sunny Saturday, the capital swamp neither frigid nor oppressively muggy, perfect for walking. It took me 45 minutes to get all the makeup on, to get the pencil right and the hair dangled just so. I spent the day hanging out with some friends around Adams Morgan, a neighborhood seemingly developed by former hippies who had gone into nonprofit C-suites or opened boutique restaurant-bars. I should step back.
To enable people to obfuscate facial-recognition software, Selvaggio, who is 34 and white, made available 3-D, photorealistic prosthetic masks of his own face to anyone who wants one. Selvaggio thought up the project, which he calls URME Surveillance, when he was living in Chicago, where law-enforcement officials have access to more than 30,000 interlinked video cameras across the city. He wanted to start conversations about surveillance and what technology does to our identity. He knew that researchers have found that facial-recognition software exhibits racial biases: the programs are often best at identifying white and male faces, because they have been trained on data sets that include disproportionate numbers of them, and particularly bad at identifying Black faces.