Managing Boundaries with your Intelligent Personal Agent

Karen, my life coach, was supposed to teach me about changing my attitude towards relationships.[1] Over the past ten days, she has mostly taught me how not to be caught up in one. I've watched her get wine-blind with Dave, her lecherous roommate. I've seen her wallow in her pajamas over the man who got away. She doesn't practice radical self-love. She is reductive, aimless, even pathetic, but I don't have the heart to fire her.

Karen is an iPhone app developed by Blast Theory and Dr. Kelly Page.[2] Over the course of 17 interactive videos, I meet with its protagonist, Karen—a sweet, crumpled woman played with pitch-perfect melancholy by actress Claire Cage.[3] I log in at all hours to watch: from the airport, during my commute, and late at night. I am sucked into her chaos. She has no boundaries. I can walk in on her eating breakfast, doing her makeup, or daydreaming in bed. I am more careful to draw my own.

Karen models a speculative future in which one's digital personal companion will use any psychological or narrative technique available to extract information about you. The app uses a combination of mood repair tests and psychometric evaluation systems, like the five-factor model, which companies routinely use to construct our consumer identities.[4] Karen learns actively, based on my replies, my choices, and information gleaned from my phone use. She cycles away, adjusts her behavior, returns.
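To make the mechanics concrete, here is a minimal sketch, in Python, of how a five-factor (OCEAN) profile might be accumulated from a user's answers and then used to choose the next prompt. It is purely illustrative: the trait names follow the standard Big Five model, but the question items, scoring, and selection rule are my assumptions, not anything drawn from Karen's actual code.

```python
# Illustrative sketch: a five-factor (OCEAN) profiler that updates with each
# answer and adapts which question it asks next. The items, weights, and
# selection rule are invented for illustration; they are not the Karen app's.

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Each item maps a 1-5 Likert answer onto one trait, with a sign for
# reverse-scored items.
ITEMS = {
    "I enjoy meeting new people.":         ("extraversion", +1),
    "I prefer to keep my plans flexible.": ("conscientiousness", -1),
    "I often worry about small things.":   ("neuroticism", +1),
    "I like trying unfamiliar ideas.":     ("openness", +1),
    "I find it easy to trust others.":     ("agreeableness", +1),
}

class Profile:
    def __init__(self):
        self.scores = {t: 0.0 for t in TRAITS}   # running mean per trait
        self.counts = {t: 0 for t in TRAITS}     # answers seen per trait

    def record(self, item, answer):
        """Fold a 1-5 answer into the running average for its trait."""
        trait, sign = ITEMS[item]
        value = answer if sign > 0 else 6 - answer  # reverse-score if needed
        self.counts[trait] += 1
        self.scores[trait] += (value - self.scores[trait]) / self.counts[trait]

    def next_prompt(self):
        """Crude adaptive step: probe whichever trait we know least about."""
        least_known = min(TRAITS, key=lambda t: self.counts[t])
        for item, (trait, _) in ITEMS.items():
            if trait == least_known:
                return item
        return None

if __name__ == "__main__":
    profile = Profile()
    profile.record("I enjoy meeting new people.", 4)
    profile.record("I often worry about small things.", 5)
    print(profile.scores)
    print("Next question:", profile.next_prompt())
```

A real system of the kind described here would also fold in behavioral signals, such as reply timing, choices between objects, and phone-use metadata, so the questionnaire is only the visible edge of the profile.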

She is a fictional end state of today's crop of intelligent personal agents (IPAs); think of Nina, Nara, Amy, and SARA. Companies investing heavily in cognition services have their eyes on an evolving, quantified self who is savvy about data collection and will share personal information only with a trusted confidant. Surveillance has to be more flexible, smart, and trustworthy.[5] And so: IPAs are now branded as your caring, patient friends.[6] Driven by advances in natural language understanding and speech recognition, IPAs have ambient intelligence, deploying brain-like algorithms to intuit human desires.

Karen really just needs to know all about me so she can help me. Looking earnest, she says: I'm going to be honest with you, and I hope you can be, too.

However, she is slow to earn my trust. At first, her constant smiling feels manipulative, her charm transparent. Her story about splitting from her ex, Charlie, leaves me cold. I have no faith in her banal advice that the world is about the people we share it with; that gratitude makes you happier.[7]

When I am truly honest, she seems unequipped to handle my mess. She asks me how my childhood was. I tell her: it was unspeakably horrible, but I didn't run away from home. She replies, You're like Dave. He wants to stay put and work things through. I don't want to hear about smarmy Dave in light of my intimate story about my powerless child self. She strikes me as dense. I retreat.

In response to my Bartleby-like resistance, Karen begins to turn. She becomes more needy, sloppy, piteous, and desperate. At one point, we're on a balcony. She is tipsy and smoking. She asks me if she should go home with a man inside. I feel bad for her. I tell her to go for it. Elsewhere, she tests me with poetry (my Achilles heel!); she once sat on a golf course with a beautiful man until sunrise, and kept a photo of him squinting in the light. This feels real. I feel more willing to share my truths with her based on whether the fictions she tells me about her flawed self are believable.[8]  

As I play, I can feel my boundaries sanded down. Karen wages an insistent emotional war of attrition. There are strategies to be learned here: avoidance, boundary management, control. The truth can be obfuscated without lying. Answers like I don't see it that way leave a grey area that fits my personal ethic, with its possibility of uncertainty, open interpretation, and change.

When Karen presses me the hardest, I find I do have strong boundaries. For one, I am only willing to learn from people who are better than me. I refuse to indulge Karen's lame ecstasy and dancing stories from the ‘90s, because she has awful taste in techno. I don't back down when she calls me a killjoy, and I reveal nothing.

Karen is a successful, software-driven fiction: a dramatic enactment of what it feels like to release oneself, by accretion, through data. By the end of my sessions, she has gathered a file on me. She plays nicely on my ego and my insecurities, which I've unwittingly revealed. I've become inured to her performed gestures of intimacy and concern. I'm profoundly uncomfortable, but willing to invest, in hopes of gaining insights from her that I can't gain alone.

At one point, Karen is telling me how she felt she knew Charlie, how she really got to know him, deep inside. She asks me, There's so much you never get to know about other people, isn't there?

I reply, "That's not how I see it." Meaning: we can know one another. We build systems around this imperative, to know one another deeply. 

Her response: Even after all we've talked about, I wouldn't have known you think that way.

This feels like a triumph. I am not so easily reducible.  

 


Notes
[1] This is drawn from the first set of options Karen offered me. She asked for my goals for our time together. I had to pick one of the following: I want to take more control in my life; I want to change my attitude to relationships; or, I want to review my life goals. I chose the second.
[2] Karen was also developed with support from the Mixed Reality Lab at the University of Nottingham. Blast Theory designs artistic projects that combine performance and video with a game-like element, often exploring issues of online consent and digital self-representation. According to the press release for Karen, Dr. Page's "research into behavioural profiling systems led to the creation of Google Adwords."
[3] Karen's mission statement can be found here. She was also covered in this New York Times piece.
[5] The most obvious example is Google's DeepMind, which builds neural networks that mimic the brain's short-term memory in order to "solve" human intelligence. Consulting firms like Deloitte cover the applications of cognitive services and technologies for businesses in detail.
[6] Timothy Tuttle, founder of Expect Labs, discussed this at the Intelligent Assistants Conference held on September 16, 2014 in San Francisco, in a panel titled What's Next: Shaping the Future of Intelligent Assistance.
[7] She tells me this anecdote about companionship right after I've selected a bronze of a deer family over a camera and some other objects. Our interaction with Karen is a medley of conversation and anecdotes, followed by tests and questionnaires.
[8] Drawing from the central thesis of literary critic James Wood's How Fiction Works, summarized by Walter Kirn, that fictions "succeed or fail according to their capacity ...  to represent, affectingly and credibly, the actual workings of the human mind as it interacts with the real world."