Embodying Computation and Borging the Interface

Also from Realityaugmentedblog.com
Perhaps my title is a little hyperbolic, but significant developments in human-computer interaction seem to be underway. I see hints of this all over Kickstarter, but after going to SXSW it became evident: gestural, embodied, and ‘borged’ computing all seem to be emerging in force over the next couple of years. This makes perfect sense to me, as I remember going to the Stanford University archives on the history of computation and reading Douglas Engelbart’s papers (remember, he invented the mouse and gave the ‘Mother of All Demos’). In his notes, he chronicled a visit to IBM’s R&D labs where, in 1958, the mainframe division was trying to avoid the keyboard/display paradigm; remember, Engelbart would not invent the mouse until a few years later, so there was no keyboard/mouse/display triad yet. They wanted to use speech recognition and synthesis to communicate with the computer, on mainframes with 32K of RAM.

The future of embodied computation?

More than half a century later, we’re just starting to crack that nut.

But other interface paradigms have been tried, with more or less success, ranging from Ivan Sutherland’s Sketchpad light pen to Jaron Lanier’s goggles-and-glove VR interface (can we say that the goggles are reemerging with the oddly anachronistic Oculus Rift?). But as Terence McKenna said in the radio program Virtual Paradise in the early ’90s, we need ways to communicate via computers as the object manipulators we are, rather than as keyboard cowboys. The “VR fantasy,” as he put it, would be to step into a space where we could communicate directly through symbolic exchange in virtual space, which reminds me of Lanier’s fascination with South Pacific cuttlefish, which communicate by manipulating their pigmentation.

Are Haptics and Gesture the Key?
There are three technologies that, while operating on very different principles, all point toward the centrality of gesture in computation. The first, of course, is the Kinect, which, once hacked, became one of the most revolutionary technologies for DIY 3D scanning, motion capture, and gestural computing. Before it, programming environments like Max/MSP and Isadora relied on camera-based blob tracking and similar techniques. Now we have self-contained devices that offload the tracking computation and offer a turnkey paradigm for gesture tracking.
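For contrast, here is a minimal sketch of that older blob-tracking approach, assuming OpenCV 4’s Python bindings and an ordinary webcam; the brightness threshold of 200 is an arbitrary illustration, not a recommended value.

    # A crude camera blob tracker, the pre-Kinect approach: threshold each
    # frame and report the centroid of the largest bright region.
    # Assumes OpenCV 4 (pip install opencv-python) and a default webcam.
    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            blob = max(contours, key=cv2.contourArea)  # largest bright blob
            m = cv2.moments(blob)
            if m["m00"] > 0:
                cx = int(m["m10"] / m["m00"])          # centroid x
                cy = int(m["m01"] / m["m00"])          # centroid y
                print("blob centroid:", cx, cy)        # a crude 2D gesture signal
        cv2.imshow("mask", mask)
        if cv2.waitKey(1) & 0xFF == 27:                # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

The Kinect’s appeal is that it moves all of this segmentation work, plus full skeleton fitting, into the device itself, which is what makes it feel turnkey.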

Its little sibling, the Leap Motion, which looks a lot like a small dongle with IR emitters inside, tracks fingertip position and finger orientation with far greater accuracy than the Kinect. What intrigues me about the Leap is that, as opposed to the large gestures that technologies like the Wii or the Kinect demand, we are presented with commands that could be as subtle as a flutter of the fingers above the keyboard, a fist, or a flip of the hand. Intuitive, gestural computing finally makes sense here. I like what I’ve been doing with my developer kit a lot.
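To give a sense of how subtle those commands can be, here is a minimal sketch, assuming the classic Leap Motion SDK v2 Python bindings that ship with the developer kit; the grab-strength and palm-angle thresholds are my own arbitrary values, not SDK constants.

    # Detect a fist, a hand flip, and a raised fingertip from Leap frames.
    # Assumes the Leap Motion SDK v2 Python bindings (the Leap module);
    # the thresholds below are illustrative guesses, not SDK values.
    import sys
    import Leap

    class SubtleGestures(Leap.Listener):
        def on_frame(self, controller):
            frame = controller.frame()
            for hand in frame.hands:
                if hand.grab_strength > 0.9:        # fingers fully curled: a fist
                    print("fist")
                elif hand.palm_normal.y > 0.5:      # palm rotated upward: a hand flip
                    print("hand flip")
                else:
                    for finger in hand.fingers:
                        tip = finger.tip_position   # millimeters above the device
                        if tip.y > 150:             # a finger fluttering up high
                            print("raised fingertip at", tip.x, tip.y, tip.z)

    listener = SubtleGestures()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Listening for gestures; press Enter to quit...")
    sys.stdin.readline()
    controller.remove_listener(listener)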

The Mother of all Demos for Haptics/Embodied Computing?
But what seems closest to McKenna’s “VR fantasy” for gestural/haptic computing is the work done at the MIT Media Lab’s Tangible Media Group and how that technology found its way into the movie Minority Report. In his 2010 TED talk, John Underkoffler describes how the lab’s work was translated to the screen through the Tamper system, which uses motion-capture cameras to track markered gloves, controlling multi-station, sensate media across any number of surfaces in a true 3D infospace, not just a recreation of a room (à la Second Life). Using mere gestures of the glove, Underkoffler is able to alter search criteria, selectively edit media, and navigate 3D interfaces similar to those in the movie.

Is it any wonder that new laptops are beginning to ship with multitouch screens? Could we project that the Leap could take over for the trackpad? Most definitely. In my media theory classes, I state flatly that our use of language is directly related to the fact that we are gesturing object manipulators with opposable thumbs, and it seems unsurprising to me that computational culture has reached this point only now, for two reasons: first, we tend to let go of familiar interfaces slowly; second, Moore’s Law had not yet shrunk the necessary technology enough to keep the boxes from being the size of a refrigerator (and the price of an old Silicon Graphics workstation). We are animals who build with objects, even in language, so it makes sense that our information devices would eventually reflect our cognitive ergonomics of object construction and gesture. I love the idea of a computer reading body gesture.

“Borged” interfacing – the question of Glass-like infodevices.
At the beginning of this text, I mentioned Doug Engelbart’s notes on IBM R&D in 1958 and their attempt to bypass the keyboard and screen in favor of natural, transparent computation. Vuzix, Google, and a host of other manufacturers are hustling to get a transparent, augmented headset to market as soon as possible. Steve Mann pioneered the genre long ago and was even assaulted in France in 2012 over his wearable devices. In some ways, I feel the ‘borg-glass’ interface might be the solution to IBM’s conundrum, but unless I see some sort of sensor on the body, I find devices like Glass still too far down the ‘brain in a vat’ road; they do make us mobile, though, which is intriguing. I’d like to hear more of Amber Case’s ideas on the appropriate use of these devices.

So, ever since I visited Sundance’s New Frontier exhibition in 2009 and played with the Tamper system, I have felt that a sea change was coming, one that finally recognized the body, listened to it, or even sought to merge with it. Kinects, Leaps, Tamper, Glass: these all mark the move of computation from the desktop to the body that I wrote about in 1999 (Towards a Culture of Ubiquity), a step before our entire environments become responsive. But where we are with haptic interfaces represents the beginning of a fundamental shift: computers are starting to become more like us, or perhaps our interfaces will allow us to be more like computers.