Required Reading: Empathy & Disgust


Distaste or disgust involves a rejection of an idea that has been offered for enjoyment.

—Immanuel Kant, Anthropology from a Pragmatic Point of View, 1798

For the first time, this year's Seven on Seven will have an overarching theme offered to participants as a provocation: Empathy & Disgust.

Scene from Her

We chose this theme partly because of recent discussions about "affective computing," a field that aims to detect and respond appropriately to users' emotions. The field gained some visibility after the release of Spike Jonze's Her; writing for Rhizome, Martine Syms argued that the film could be read as "an elaborate product spec" for intelligent agents that can replace human relationships. More recently, a new crop of apps functioning as "Intelligent Personal Agents" brings us a step closer to this future, while a more speculative app from Blast Theory offers a fully fledged emotional relationship with a virtual character who gradually reveals herself to be "needy, sloppy, piteous, and desperate."

Some of the real-world research underpinning emotional analysis was discussed in a New Yorker piece earlier this year, which focused on the work of Affectiva and scientist Rana el Kaliouby. The company is developing a tool called Affdex that can "make reliable inferences about people's emotions" based on video monitoring:

During the 2012 Presidential elections, Kaliouby’s team used Affdex to track more than two hundred people watching clips of the Obama-Romney debates, and concluded that the software was able to predict voting preference with seventy-three-per-cent accuracy.

Outside of the lab, algorithms and networks have already become deeply involved in our emotional lives. In summer 2014, Facebook made headlines for having altered the content of users' News Feeds in order to manipulate their moods for a scientific study. Kate Crawford, writing in The Atlantic, argued that the project was an obvious breach of ethics, with clear potential for harm and conducted without participants' knowledge. Later in the year, Facebook was accused of "algorithmic cruelty" for non-scientific reasons when its "Year in Review" app paired often painful images from the past year with the words, "Here's what your year looked like!"

What such projects aim for, in part, is the automation of affective labor: the work of managing our emotions and those of others around us. Affective labor has long been discussed in Marxist theory because it serves an important economic purpose, is often marginalized, and is generally un- or underpaid. In the digital era, affective labor has been captured by the market in new ways: by social media platforms, which can convert our baby pictures into advertising and market-research dollars, and by "sharing economy" services. As Rob Horning wrote following a 2014 Rhizome panel:

The sharing economy's rise is a reflection of capitalism’s need to find new profit opportunities in aspects of social life once shielded from the market, in leisure time once withdrawn from waged labor, in spaces and affective resources once withheld from becoming a kind of capital.

Image: Reuters

When affective labor is automated, who benefits? Tom Cutterham reflected in The New Inquiry:

If only we could revolutionize human relationships without touching political-economic structure! If only we could all be more friendly! Those are capital’s desires, not ours. In How to Win Friends and Influence People, Carnegie quoted John D. Rockefeller: "The ability to deal with people," he said, "is as purchasable a commodity as sugar or coffee … and I will pay more for that ability than for any other under the sun." What Carnegie knew is that friendliness doesn’t come naturally under capitalism. He supplied that valuable commodity not so much to his readers as to their bosses. A century later, capital has managed to reduce the price of friendliness to nearly nothing. If you want to keep your job in an age of affective labor, you’ll serve that coffee and sugar with a smile.

Affective computing, of course, has implications outside the service industry. Health care is a major focus. Robot nurses have long been used to deliver medicine; data from social networks and fitness trackers is increasingly used to understand users' emotional well-being and mental health. Syms hinted at this in her short fiction piece "Her Quantified Self," a first-person account of a FuelBand user, writing "She slept for 12 hours and worries for a moment that it means she is depressed." What do the data say about us? And what does it mean when such conclusions may be drawn without our knowledge by governments, employers, and insurers?
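To make concrete how thin the evidence behind such conclusions can be, here is a purely hypothetical sketch in Python. It is not Affectiva's, Nike's, or anyone else's actual method; the function name, the sleep threshold, and the data are all invented for illustration. It simply flags a stretch of unusually long sleep in tracker logs as a possible sign of low mood:

# A purely hypothetical sketch -- not any real product's method.
# It flags unusually long sleep in fitness-tracker data as a possible
# sign of low mood, using a made-up threshold and window.
from statistics import mean

def flag_possible_low_mood(nightly_sleep_hours, threshold_hours=11.0, window=3):
    """Return True if average sleep over the last `window` nights exceeds
    `threshold_hours`. Both numbers are arbitrary, chosen for illustration."""
    recent = nightly_sleep_hours[-window:]
    return mean(recent) > threshold_hours

week = [7.5, 8.0, 6.5, 9.0, 12.0, 11.5, 12.5]  # hours slept each night
if flag_possible_low_mood(week):
    print("Flagged: unusually long sleep. Depressed, or just tired? The data cannot say.")

The point of the sketch is its unwarranted confidence: a handful of numbers and an arbitrary cutoff become a claim about someone's inner life.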

Of course, affective computing engages with the entire spectrum of human emotion. Why, then, did we choose "empathy and disgust" as our theme?

Bill Gates drinking recycled water

One reason is that these are unusual kinds of emotions. They have as much to do with aesthetics and ethics as they do with intuitive response. Disgust can be a gut reaction, a strong dislike of a smell or the sight of moldy food. But if we feel disgust in order to avoid contamination, that contamination may be moral, rather than biological. Disgust is a necessary response to behaviors that we find repugnant, and there are times when it must be registered. At the same time, disgusting objects hold great fascination (as Christopher Turner pointed out in Cabinet Magazine), and therefore play an important role in art: 

Kant puritanically turned his head away from the paradoxical, hedonistic, and formless intensity of disgust’s pleasures, which threatened to smother him.

Empathy carries with it an obvious moral imperative: we must be willing to imagine the subject positions of others, and to regard them as valid, as the basis for any meaningful collective existence or social life. Strangely, technology can facilitate empathy or its opposite. Yesterday, it allowed a baseball game to be played to television cameras in an empty stadium while we all retreated to our separate corners, full of mutual suspicion. But the network, even with its constant data-gathering and value extraction, can also be a place where we encounter and imagine difference, where we can listen to marginalized voices from Toxic Twitter to the Objectum-Sexual Internationale.


Illustration by Fotolia/wormig

For this auspicious seventh edition of Seven on Seven, we will convene wonderful thinkers and makers in art and technology tomorrow at the offices of NEW INC on the Bowery for one-day collaborations. The results, which may include new objects for aesthetic contemplation and social function, will be revealed at a live-streamed conference on Saturday. The collaborations will go in a wide range of directions, but we offer these two modes, empathy and disgust, as a starting point for conversation.