Ben Grosser is an artist, composer, and programmer based in Urbana-Champaign, Illinois. His work is highly attuned to the role of computation in changing and producing aesthetics, knowledge, and social formations, and much of it is available to view online at http://bengrosser.com/. Recently, Ben made a new piece of software available: Facebook Demetricator, a tool that adapts the social network's interface so that the numerical data it foregrounds is removed. The focus is no longer on how many friends one has or how many comments they've received, but on who those friends are and what they've written. The following interview took place by email in September 2012:
Facebook uses numbers as a key part of the information provided on its interface. Things, or what are rendered there as things (likes, friends, comments waiting, events), are all numbered, as are the relations of several other kinds of things to time. Facebook Demetricator suggests that Facebook users might step away from enumeration as a way of understanding the service. What role, for you, does the number play in Facebook, and what does the Demetricator propose?
As a regular user of Facebook I continually find myself being enticed by these numbers. How many friends do I have? How much do people like my status? I focus on these quantifications, watching for the counts of responses rather than the responses themselves, or waiting for numbers of friend requests to appear rather than looking for meaningful connections. In other words, these numbers lead me to evaluate my participation within the system from a metricated viewpoint.
What's going on here is that these quantifications of social connection play right into my capitalism-inspired desire for more. This isn't surprising as we're living in a time when our collective obsession with metrics plays out as an insatiable desire to make every number go higher. How much money did I earn? How many choices do I have? Perhaps the most destructive example of this is the recent financial crisis, when a constant desire for more led the global economy into financial ruin.
Bringing this back to Facebook, I find myself asking questions about how it affects user behavior. Would we add as many friends if we weren't constantly presented with a running total and told that adding another is "+1"? Would we write as many status messages if Facebook didn't reduce its responses (and their authors) to an aggregate value? In other words, the site's relentless focus on quantity leads me to continually measure the value of my social connections within metric terms.
In response, Facebook Demetricator invites the site's users to try the system without these things, to see how their experience is changed by their absence, to enable a network society that isn't dependent on quantification. Who are my friends? How do they think? What have they said?
Along the way Demetricator explores how the designs of software prescribe certain behaviors, and questions the motivations behind those designs. What purpose does this enumeration serve for a system (and a corporation) that depends on its users' continued free labor to produce the information that fills its databases? Where does it lead when quantity, not quality, is foremost?
Can you tell us what Facebook Demetricator essentially does and how?
Most simply, Facebook Demetricator changes how Facebook looks to its users by hiding all the metrics within the interface. For example, if the text under someone's photo says 'You and 4 other people like this' Demetricator will change it to 'You and other people like this'. Under an ad, '23,413 people like this' becomes 'people like this'. '8 mutual friends' becomes 'mutual friends'. The user can still click on a link and count up their mutual friends if they care about reducing them to a single count, but under the influence of Demetricator that foregrounded quantification is no longer visible. These removals happen everywhere: on the news feed, the profile, the events page, within pop-ups, etc. Users can toggle the demetrication, turning it on or off when desired. Its default state is on (numbers hidden).
To make this possible, Demetricator is software that runs within the web browser, constantly watching Facebook when it's loaded and removing the metrics wherever they occur. This is true not only of those counts that show up on the user's first visit, but also of anything that gets dynamically inserted into the interface over time. The demetrication is not a brute-force removal of all numbers within the site, but is instead a targeted operation that focuses on only those places where Facebook has chosen to reveal a count from their database. Thus, numbers a user writes into their status, their times for an event, etc. are not removed.
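To illustrate the kind of targeted substitution described above, here is a minimal sketch. The patterns and function names are illustrative assumptions, not Demetricator's actual code, and the real extension must also watch for interface elements inserted dynamically over time (e.g. via DOM mutation events):

```javascript
// Hypothetical patterns mirroring the examples above: each one targets a
// specific metric phrase rather than stripping every number from the page.
const METRIC_PATTERNS = [
  // 'You and 4 other people like this' -> 'You and other people like this'
  [/\bYou and [\d,]+ other people like this\b/g, 'You and other people like this'],
  // '23,413 people like this' -> 'people like this'
  [/\b[\d,]+ people like this\b/g, 'people like this'],
  // '8 mutual friends' -> 'mutual friends'
  [/\b[\d,]+ mutual friends\b/g, 'mutual friends'],
];

// Apply every pattern to a snippet of interface text.
function demetricate(text) {
  return METRIC_PATTERNS.reduce(
    (t, [pattern, replacement]) => t.replace(pattern, replacement),
    text
  );
}

console.log(demetricate('23,413 people like this')); // 'people like this'
console.log(demetricate('Meet at 8pm, room 23,413')); // unchanged
```

Because each pattern targets a complete metric phrase, numbers a user writes into a status or an event time fall through untouched, matching the targeted (rather than brute-force) removal described above.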
Is there any kind of difference that you see as significant between what is and what is not enumerated in Facebook?
I suspect that Facebook enumerates everything. If it resides within their databases then the counts are easily obtained. However, not all of these counts are shown to the user. So the question becomes: which metrics does Facebook reveal to its users, and which does it keep to itself? What is the difference between them? Further, what drives those decisions?
I would suggest that the primary question asked by Facebook's designers when deciding which metrics to reveal is whether a particular count will increase or decrease user participation. Am I more likely to click on a 'trending article' if I only see its title or if that title is accompanied by a message indicating that 131,394 other people read it before me? If the latter then the metric is revealed.
So what isn't shown? Well, I'm not told how many things I like per hour, or how many ads I click per day, or how effective the 'People You May Know' box is in getting me to add more friends to my network. These types of analytics are certainly a significant element within the system, guiding personalization algorithms, informing ad selection choices, etc. But would showing these types of metrics to the user make them more or less likely to participate? If the answer is less then the metric is hidden.
Despite my argument above, I would also speculate that some of these decisions are not as well considered as we might expect, and that the relational database structure underlying Facebook simply lends itself to metrication. In other words, the counts are already there, so why not add them to the interface? This has an added benefit of giving that data a degree of authority, just as a parenthetical reference within a text can do. Adding a metric to a line in Facebook implies that the data goes deeper, that there's more to know than what you see.
Facebook Demetricator seems to offer almost the opposite service to those of agencies such as EdgeRank Checker that aim to enable Facebook users to measure and plan their activities on the site via analysis of available data in terms of timing, content kind, pacing and so on. While there is no doubt some strategic self-delusion in any such analytical approach, what do you think about the more overtly manipulative approaches to the engineering of presence on Facebook?
I think the approaches you're referring to play right into the system as it was designed.
Whenever you create an algorithm that manages the presentation of information within a large networked system (such as Facebook's EdgeRank formula), you'll always have users who try to engineer a methodology that preferences their own content. We've seen this with Google for years, where it's a constant back-and-forth battle between them and the black hat SEO crowd.
However, while I haven't researched this, I suspect that systems like EdgeRank Checker are silently cheered on by Facebook. EdgeRank analytics, especially with its preference for new over old, encourages a constant stream of updates from everyone hoping to appear in the news feed. This plays right into Facebook's news feed design, which analogizes a never-ending conversation. The algorithm thus produces the desired behavior in its users.
There are also the ways that Facebook facilitates networked presence (e.g. real-time ticker updates), the ways it mimics said presence (such as the delayed and staggered presentation of 'new' feed items after you've logged in), and the engineered presence of Facebook itself (how it watches your actions, adapts to your interests, etc.). Each of these relies heavily on metrics. You're made aware of others' actions primarily through a metric increase, whether it's a comment count on a status, or an increase in likes. As a whole these counts are ever shifting, visibly undulating throughout the interface, presenting a subtle but tangible reminder of the constant change within.
Within this line of enquiry, there are other sets of metrics operating within Facebook, such as those filters looking for spam accounts—those with 'many' friends but not much profile, for instance, or who send the same message out several times within a day. Despite these filters, spam accounts remain active. When using Facebook Demetricator, what kinds of signals should you look for to identify these, as opposed to 'real' contacts?
I love this question. You're asking how we can know if someone on Facebook is real or spam without the metrics to guide us. For example, if we can't see our mutual friend count when viewing someone else's profile, how can we be sure we're really friends?
I'd suggest the first line of defense here is to ask yourself if you know the person. Have you ever run into them? Does their name even ring a bell? If you can't remember them well enough to answer those questions, then they might not be your 'friend'!
If you're still not sure, you could message the person, asking them for details on where you've met (online or off) and following up as appropriate. Or you could look at the substance of their activity within the site to see if it looks to be that of a real person or the actions of a spam account.
But what does it say when metrics become our guide to evaluating the likelihood of someone being part of our circle, rather than relying on our recollections of that person outside the system? When did we start needing quantifications to help us choose whether to friend someone or not? How many friends have we added to our network simply because the numbers suggested it?
Facebook is notoriously aggressive in attacking artists who work with its kind of 'public' space. How do you see your project, as something that works in-browser, possibly working around this problem?
I've specifically built the software with this history of Facebook in mind. By running in a layer on top of the site, all manipulations happen after Facebook has delivered their data to the user. In this way, because Demetricator manipulates the presentation of Facebook's data after-the-fact, it is harder for them to thwart it programmatically.
That said, they can break it. Their best option would be to start restructuring their code, changing CSS class names, HTML tags, etc. In anticipation of this I have released the Demetricator as open source, with the hopes that others will help adapt the code to both Facebook's regular changes as well as any restructurings specifically aimed at breaking the project.
It's often been noted that when users first join Facebook there is commonly an initial splurge period of rapidly adding contacts, and contacts of contacts, a behaviour that subsequently settles at a relatively low pace. There may be more or less discrimination used in this phase, and users often express surprise at how many people they know, in so many different ways, are there, ready to be friended in the same uniform manner. How do you think Facebook Demetricator, if it were used from the outset, would affect this exploratory or surprising kind of initial use period?
I think the Demetricator would significantly lessen the initial splurge you describe.
To explain, let me start by talking about what happens when we enter a new physical space full of other people. We look around, see who we know, engage with someone familiar, perhaps striking up a conversation. The room may contain people we don't know well or even people we don't care for, so this engagement tends to target those we do know and/or get along with easily. Along the way we may meet new people, and if we're lucky we might create the potential for a new friend.
So what's different about the virtual space of Facebook? One difference is that the potential pool of people we might know is much larger than any physical gathering because of its ageographical and asynchronous nature. These two conditions create the possibility for wide engagement and a quickly expanding virtual network that can't be matched in physical space. This is one reason for the initial surge you describe—there are so many options that one can't help but be amazed by them.
But there are also two specific interface design decisions that make this play out much faster and wider than it would otherwise.
First is the architecture of the news feed. Without any friends in one's network, the news feed is inactive. Dead. Boring. When you add one new friend, the feed comes to life—but only at a trickle. Add another and its output doubles. From there, the more friends you add the more active the feed becomes. In other words, this feed, which is the primary spectacle of Facebook, is only usable and/or useful with a significant friend network driving it.
Second is the relentless presence of revealed metrics. Imagine what the physical gathering I describe above would be like if every participant wore a badge proclaiming how many friends they had in that room. Would anyone be content to keep that number at zero? At one, two, or three? Or would they be driven to walk around the space, meeting new people and identifying old acquaintances, all so they could increase their public friend count?
In other words, this publicly viewable friend metric plays right into what I described earlier as our capitalism-inspired desire for more. When you're constantly being told how many friends you have, you're encouraged to add another, to make that number go higher, to exceed in metric terms. More is better, less is inferior.
By removing this metric, Demetricator will blunt the initial surge of friend acquisition. The absence of metricated social valuation will allow other indicators of friendship to emerge, such as closeness and likeability. Because of the architecture of the news feed, Demetricator won't stop friend acquisition entirely, but it should change the character of what is now often a frenzied activity.
Certain data on Facebook attracts a visible timestamp. How do you see the differentiation between what the user sees as timestamped and what not, and what effect do you envisage the use of the Demetricator having on the kinds of data that are currently marked in this way?
These timestamps, in and of themselves, wouldn't qualify as metrics at all if it weren't for their hyperspecific and time-relative nature. But by relentlessly reminding us of how old something is, they create a false sense of urgency. I really don't need to know that my friend's meme post went live 23 seconds ago rather than 49 seconds ago, or that my colleague ate her banana 23 minutes ago rather than 30 minutes ago. But these constant enumerations of age present the news feed as a running conversation that you can't miss—that if you leave for even a second, something important might pass you by.
Given that Facebook's value is directly tied into how much we all participate, this urgency helps fuel our continuous engagement with the system in the forms of posting, reading, liking, etc. It also plays a supporting role in the engineered presence we discussed earlier, as even if nothing you see gets new likes while you're watching, its age is always changing, reminding us that things are going on, that someone is keeping track of things.
At the same time, many items in the interface lack one of these timestamps. I suspect the choice of where to place one lies with its expected effect on user engagement. If I saw that a trending article is old would I read it? If I knew an ad had been showing for weeks would I click it? Would I feel better if I knew my friend request had been ignored for months? These types of items are likely more effective when their age remains incognito, informing Facebook's internal analytics without showing themselves to us.
The Demetricator acts on the visible timestamps by taking them out of the equation. It converts them into one of two options: 'recently', or 'a while ago'. I expect that doing so will nullify the urgency that constant aging creates, and that it will propose a calmer reading of the news feed less dependent on unbroken attention. If everything within a day is 'recent' (which, frankly, it is), then the user is less likely to worry that they'll miss something. Interaction thus becomes less focused on the present value of the content and more focused on the content itself.
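As a rough sketch of this conversion (assuming a one-day cutoff for 'recently', which the text suggests; the parsing and function names here are illustrative, not the project's actual code):

```javascript
// Hypothetical demetrication of relative timestamps: anything within the
// last day becomes 'recently', everything older becomes 'a while ago'.
const SECONDS_PER_UNIT = {
  second: 1,
  minute: 60,
  hour: 3600,
  day: 86400,
  week: 604800,
};

function demetricateTimestamp(text) {
  // Match phrases like '23 seconds ago' or '3 weeks ago'.
  const match = text.match(/^(\d+)\s+(second|minute|hour|day|week)s?\s+ago$/);
  if (!match) return text; // not a relative timestamp; leave it alone
  const ageInSeconds = Number(match[1]) * SECONDS_PER_UNIT[match[2]];
  return ageInSeconds < 86400 ? 'recently' : 'a while ago';
}

console.log(demetricateTimestamp('23 seconds ago')); // 'recently'
console.log(demetricateTimestamp('3 weeks ago'));    // 'a while ago'
```

Collapsing every fine-grained age into just these two words is what removes the ticking-clock quality of the feed while still conveying the rough order of events.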