The Politicization of Faces: Remaining Faceless in the Mass Surveillance Era

Thursday 2 March 13:55

Facial Weaponization Suite and Face Cages by Zach Blas
by Carly Sheridan

Going through airport security, artist Zach Blas steps into the full-body scanner, feet hip-width apart, arms held high above his head. As his body is scanned, he considers the vulnerability of his physical stance, questions the methods of the machine and thinks of those who are not afforded the luxury of continuing their journey without intrusion.

Facial recognition technology and biometric data collection have been inserted into our daily lives, often in ways entirely unbeknownst to the surveilled. We've willfully traded fingerprints for convenience. Our phones track our every move, while our favorite tools and apps store our faces and those of our loved ones. Corporations can now successfully predict our behavior, political leanings and even major life decisions - like changing careers or building a family - before we've consciously made them ourselves. Our environments have become equipped to surveil us and offer tailored versions of themselves based on who the algorithms think we are.

Bad data, biased algorithms and the weaponization of faces and gender were common themes in the lectures at this year's Sonic Acts Festival. Maryam Monalisa Gharavi, who spoke on the "Unbinding Mediatisation" panel, pointed to historical and present-day examples of the fetishized covering and revealing of women's bodies and faces. "The face is not covered or uncovered, it is by definition a mask," she argued. She spoke of false positives in facial recognition software, the implications these can hold and the threats born out of this aggregated and accumulated data. These technologies and software are systems designed to manipulate, discriminate, monitor and control the population. "It is not that the face itself becomes inhuman when turned from noise to signal to algorithmic code," she explained.
"The inhumanity is the failure of the truth-telling device itself, and I come to define the inhuman as the place where the object is revealed, rightly or wrongly, while the revealing subject remains covered."

Resistance to mass surveillance in an era obsessed with selfies, geolocation and self-tracking seems contradictory at best, impossible at worst. Faces are sometimes collected and stored with the intention of finding criminals, terrorists or undocumented immigrants, but what happens when you're misclassified? Countries are passing anti-mask and anti-protest legislation. Hooded sweatshirts have become directly linked to racial profiling and have been used as a symbol in the Black Lives Matter movement to represent black masculinity and as an ode to Trayvon Martin. This raises the question of what happens should you be identified at a protest or, gasp, wearing a hoodie.

Biometric technology analyzes and calculates the body to identify, authenticate and verify. In Zach Blas' exhibition "Facial Weaponization Suite, Face Cages", which ran concurrently with the festival, he summarizes this tactic and its irony, describing how "the captured data is treated as more accurate than the actual embodiment." For any of this technology to work, there needs to be a language in place that defines and dictates what should equal what. People are demonized, or protected, based on biometric data and algorithms that adhere to a one-size-fits-all theory, relying on categories calibrated for whiteness. He gave an example of a photo in which a black man's face was not detected while the image of a white woman's face on his t-shirt was. These generic simplifications are deemed universal. Discrimination against race, gender, class and disability is literally written into code.
Again to quote Blas, "It is political, not philosophical." Research into biometric abstraction and the political violence of the face inspired his "Face Cages" work, on which Blas collaborated with three other artists. Each artist took a biometric scan of their own face and fabricated the resulting mesh into a metal mask, then wore the mask until it became unbearable, as an endurance performance. They discovered that while these biometric diagrams are meant to be perfect representations of their faces, when made physical they were awkward and uncomfortable. A mask designed specifically for their own faces didn't quite fit. A perfect replica was the promise, yet not the reality.

One of the biggest problems with the weaponization and politicization of our faces and bodies is that our definitions and identities are constantly evolving. Gender and sexuality, once defined by a binary, are now acknowledged as something more complex. In an increasing state of globalization and mobilization, citizenship and the cultures we closely identify with can be all-encompassing. Even if the algorithms don't target you specifically today, that all changes should your definition ever change, or the state's.

As writer and theorist Wendy Hui Kyong Chun so eloquently put it during the "Updates Available?" panel: "Even the most liberal societies have been lush hosts for the weeds of fascism." Chun highlighted the notion that more data does not mean better data, with the caveat that better data is not a solution either: "If networks segregate, it's because society is still segregated." While the ability to detect emotions is on the rise, empathy cannot be written into code. Any systemic change in a system that proliferates false positives must come from resistance. "What if we got the least read articles rather than the most read?" Chun asked the audience. "What if we built networks based on mutual indifference? What if we embedded history in these models?"
"Let's exploit and explore the noisiness of being to realize that the gap between model and reality is the space for political agency. Space to create new habits, new worlds and new networks."