New Fashion May Defeat Facial Recognition Tech

With the faces of more than half of American adults now logged in law-enforcement databases, police can use facial recognition to track people across the country. Activists, fashion designers, and academics have pushed back by developing fashion that makes the human face mimic something non-human. The aim is to restore a measure of privacy to ordinary people, so they can go about their daily routines without being picked up and tracked by facial recognition software.

How It Works

The facial recognition software currently in operation throughout the US uses artificial intelligence to pick out people's faces from images captured by cameras. Designers counter this with clothing created expressly to "dazzle" the AI, tricking it into concluding that the face it sees isn't human. Alternative implementations create decoy faces for the AI to track that look nothing like the person wearing them.

Current iterations of anti-facial-recognition technology range from a mask that alters the apparent angles of the human face, to a transparent covering that leaves the wearer's expressions visible while denying the software enough information to determine whose face it is. More sophisticated implementations project an alternate look onto the user's real face, a shifting image that changes periodically to push the recognition algorithm into false detections.

A Necessity in a Surveilled World

With surveillance techniques like facial recognition entering the mainstream, ordinary people need a way to take back their faces. These fashions were created to help them reclaim their privacy by making it nearly impossible for facial recognition software to lock onto their actual features. As more governments and private businesses deploy the technology, the average person is left with little choice but to rely on the designs of those who would rather keep their faces out of databases where the data may be misused.