Katrina Sluis: In Facial Weaponization Suite (2011-14) you explore facial recognition technologies and their relationship to neoliberal governance, dataveillance, and biopower. Can you explain how your concept of “informatic opacity” emerged from this project, and how it departs from conventional narratives of individual privacy and surveillance?
Zach Blas: Facial Weaponization Suite is a series of mask-making workshops, in which I aggregated 3D scans of participants’ faces and then used that data to create “collective masks.” As a result, the masks are not identifiable as human faces by biometric facial recognition technologies. The masks are worn in performances, actions, and interventions that comment on the politics of biometrics and also experiment with other modes of recognition.
Installation at “transmediale: CAPTURE ALL,” curated by Daphne Dragona and Robert Sakrowski, Haus der Kulturen der Welt, Berlin, Germany, 2015. Photo: Paco Neumann.
I began this project in 2011, but interest in the work came in 2013, after the Snowden leaks. Most people wanted to engage Facial Weaponization Suite around discussions of surveillance and privacy; yet, these terms were not the focal points for me. In fact, I find surveillance and privacy, as conceptions of power and resistance, lacking in complexity and transformative potential. Surveillance etymologically means to oversee or watch from above; I find this emphasis on vision insufficient to characterize the informatic nature of “surveillance” today. The same goes for the overuse of the panopticon as an accurate diagram of contemporary power relations. The computer is not a panopticon. We need other concepts, new terms, and politics invested in radical change.
That said, I was drawn to approach biometrics through “capture,” a term that describes how bodies and identities become algorithmically standardized in order to be informatically legible. In particular, I was influenced by Philip Agre’s 1994 essay “Surveillance and Capture: Two Models of Privacy,” in which he differentiates between surveillance and capture, explaining that what we think of as contemporary surveillance could be more aptly termed capture. Agre argues that capture is more linguistic than visual, emphasizing computation, algorithms, and the need to develop a method for informatically standardizing the assessment of bodies, identities, behavior, and movement. Broadly, capture is the technical precondition for something like global surveillance to emerge, as algorithmic evaluation of persons must be compatible between various governments, militaries, private security companies, etc. Capture is the basis for biometric governance. From a minoritarian perspective, capture is compelling to consider because it means attending to how technical norms—or protocols—get produced and then sedimented into digital logics and machines (for biometrics, these are technical norms of identity and identification). When a machine identifies a face, this might seem technically objective, but when researching capture, one learns that there are always certain norms and statistical averages that constitute what a face is for the machine (like being white or cisgender). Humans write capture algorithms, and that means that human bias is often found in the very technical architectures of capture. I became interested in exploring who is negatively impacted by biometric capture. As it turns out, unsurprisingly, a broad set of minoritarian persons suffer the structural violence of biometric governance, such as people of color, transgender persons, and immigrants.