Interview with Casey Reas and Ben Fry


Image: Casey Reas, Process 7, 2005

Created by Casey Reas and Ben Fry, Processing is an open source programming language and environment for people who want to program images, animation, and interactions. It is used by students, artists, designers, researchers, and hobbyists for learning, prototyping, and production. It was created to teach the fundamentals of computer programming within a visual context and to serve as a software sketchbook and professional production tool. Processing is an alternative to proprietary software tools in the same domain.

I first discovered Processing in 2003 at ITP while exploring different options for creating a set of tutorials about generative algorithms. We quickly realized that Processing could transform our approach to teaching programming, and we have since adopted it as the language learned by all incoming students. I’m thrilled to have this chance to talk to Casey and Ben a little about the origins of Processing, their philosophy, work, and plans for the future. - Daniel Shiffman

How did you each discover computation? What was the first program you wrote and in what language?

Casey Reas: I was very lucky that my dad brought an Apple II into the house in the 1980s. These early home computers encouraged programming, and there were books on programming in BASIC written for kids. I don't remember if I started with BASIC or Logo, but I learned a little of both. I hit a wall and I wasn't motivated to learn more. (I loved playing video games on the computer more than writing my own small programs.) I was introduced to Lingo when I was in college, but I only wrote simple scripts for moving back and forth in the timeline and turning sprites on and off. When I shifted from working in print to the Web in 1995, I fell in love with the potential for making and writing software. I engaged fully with C in 1998 when I took classes at the NYU extension; something clicked, and I started to really learn for the first time. I quickly moved on to C++, then later to Java and Perl at MIT.

Ben Fry: I started with an Apple II+ and an IBM PC that my Dad brought home from the university, though I can't remember which was first. I learned BASIC on each, and that evolved into other machines (a whole string of Macs starting with the original 128K version) and languages (Pascal, C, C++, PostScript, Perl, Java...). The first program of consequence was a stock market game (ah, the embarrassment) that I sold for $250 when I was in seventh grade.


Image: Ben Fry, On the Origin of Species: The Preservation of Favoured Traces, 2009 (Still)


Tell us a little bit about the origins of Processing. Where and when did you have your first conversation about creating it?

CR: It was sometime in June 2001, as I was finishing up at MIT. We made a list of the basic specs for the environment and drawing functions. It was one 8 ½ x 11 inch typed page. By the fall, Ben had something working, and the first workshop took place in Japan in August 2001.

BF: Yeah, revisions 0003 and 0005 were used for a workshop at Musashino Art University (MUSABI). I spent the first part of the week teaching Design By Numbers, and then some of the students tried “Proce55ing”.

When looking at other programming environments geared towards visuals (Design by Numbers, Logo, etc.) what kinds of things did you want to emulate and what did you want to do differently?

CR: For us, the big idea of Processing is the tight integration of a programming environment, a programming language, a community-minded and open-source mentality, and a focus on learning -- created by artists and designers, for their own community. The focus is on writing software within the context of the visual arts. Many other programming environments embodied some of these aspects, but not all.

John Maeda's Design By Numbers is the direct parent of Processing. Our goal was to emulate its simplicity and focus on making images, animation, and interaction. But we wanted to exceed the limits of DBN: a 100 x 100 pixel canvas, grayscale only, and integer math. John wrote his account of the origin for Technology Review.

Processing has clearly been influenced heavily by PostScript and Java. We feel our ideas are not inherently tied to Java, but the current versions of Processing are reliant on it.

BF: Right, we wanted to connect the simplicity and immediacy of BASIC or Logo or a scripting language with a more sophisticated language like Java. And we wanted to make the syntax and API very simple and terse so that common-use operations had straightforward naming.
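As a rough illustration of that terseness (a minimal sketch written for this edit, not code from the interview), a complete Processing program needs only a setup() and draw() pair, with one short call per common operation:

    // a minimal sketch: setup() runs once, draw() runs every frame
    void setup() {
      size(400, 400);        // open a 400 x 400 pixel window
      background(255);       // white background
    }

    void draw() {
      fill(0, 102, 153);                // blue fill color
      ellipse(mouseX, mouseY, 40, 40);  // circle that follows the mouse
    }

Each name (size, background, fill, ellipse, mouseX) maps directly onto the operation it performs, which is the straightforward naming Ben describes.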

Did you ever think of calling it anything other than Processing?

CR: Not when we started the project, but we've second-guessed the name many times since. First, it was an iterative name with the characters always changing: Pr0ces5ing, Proc3ss1ng, Pr0c355ing, etc. Then we more conservatively started using Proce55ing before we released the alpha version. We made the decision to move to Processing a few years later. I regret not calling it Seal. I really wish we had called it that, with a cute animal balancing a keyboard on its nose as the mascot. I think Ben wanted to call it Bagel or Potato.

BF: Now he's just making fun of me. The Processing name originally came from a journal idea that Casey and I were talking to MIT Press about. We wanted to write about computational work and the process and ideas behind it, melding reprints of older writings by people like Vannevar Bush with work by more contemporary thinkers and creators. Neither of us had the time to make the journal happen, but the Processing name stuck as we began talking about this other project.

In thinking about Processing's 1.0 release, how much of its life (features, community, users, etc.) differs from your original expectations? What's the biggest surprise that Processing’s community has produced so far?

CR: It's certainly grown much larger than was ever expected, much to our joy (and horror). It's been an extremely organic process, with key people such as Florian Jenett and Simon Greenwold contributing at different times. (There are lists at the beginning of our book and on the website.) I think the people who make libraries have made the largest impact. They have pushed and pulled Processing in ways never imagined. The libraries have moved things forward and have opened new possibilities.


Video: Memo Akten, Owen Vallis, Jordan Hochenbaum, Roots, 2008
(From Collection: A curated exhibition of Processing software.)

I'm excited about the ports of the Processing drawing functions to other programming languages, particularly Processing.js (created by John Resig and continued by Al MacDonald) and Ruby-Processing (created by Jeremy Ashkenas). There are others for Scala, Python, and ActionScript, and I hope many more to come.

Entrepreneurial community initiatives have extended the project and built more community. Sinan Ascioglu's OpenProcessing is a wonderful gift to the community, and he's doing an excellent job with it. Tom Carden's Processing Blogs aggregator came online as blogging exploded, and Marius Watz started the Flickr group in spring 2006 and put a lot of energy there.

Ben and I were discussing last week that the original vision of Processing has been fulfilled and this vision is now eight years old. The way you program with Processing is practical and useful, but it's not a radical vision of the future. (It was a pragmatic vision eight years ago.) There's open territory for more radical and visionary projects.

The best surprise has been the Arduino project. Wow! It's really transformed how electronics is learned within design and art programs, and it has an even further reach into a growing hobby community. Arduino was built around the Wiring project of Hernando Barragan. Wiring was Hernando's thesis project at the Interaction Design Institute Ivrea. It was intended to be an electronics version of Processing that used our programming environment and was patterned after the Processing syntax. It was supervised by Massimo Banzi, an Arduino founder, and myself. I don't think Arduino would exist without Wiring, and I don't think Wiring would exist without Processing. And I know Processing would certainly not exist without Design By Numbers and John Maeda. Etc. This is what is exciting to me - the iteration and growth of this community.

Processing has an incredibly active and generous community of users. In developing Processing, what have you learned about building that community? How much do you choose to direct and how much just emerges?

CR: We don't direct anything; things just happen. We strive to create participation systems with the right balance of structure and freedom to encourage people to contribute to the project autonomously. This has worked well with the libraries, and has failed with other aspects. I often feel that I don't know enough about building and leading a community. Why did I fail twice to encourage community members to create a translation system for the reference? Why aren't people writing amazing apps using the new Tools feature for the programming environment?

I think if people really want something, they do it regardless. The trick is to create an ecology that not only supports these unique individuals but also encourages others to participate.

It's clear that Processing itself was born out of a desire to create something that you could use in your own work as artists and designers. How much has Processing fed back into your own work and influenced projects and pieces you've created?

CR: We both use Processing in almost all of our work, sometimes as one component along with other software and sometimes for work made entirely with Processing. We sketch with Processing, which means we try out ideas in code, and we use it for final work. When we need to do something that Processing can't do, we extend Processing to make the idea possible. (This is also how most libraries come to be written. Processing can't do it? Write a library.)
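To sketch what "write a library" means in practice (the class and method names here are hypothetical, invented purely for illustration), a Processing library is essentially a Java class built against processing.core that receives the running sketch as a PApplet and draws through it:

    // Hypothetical library class, compiled to a .jar and placed in the sketchbook's libraries folder.
    package burst.library;

    import processing.core.PApplet;

    public class LineBurst {
      PApplet parent;   // the sketch that created this object

      public LineBurst(PApplet parent) {
        this.parent = parent;
      }

      // draw n random lines across the parent sketch's window
      public void burst(int n) {
        for (int i = 0; i < n; i++) {
          parent.line(parent.random(parent.width), parent.random(parent.height),
                      parent.random(parent.width), parent.random(parent.height));
        }
      }
    }

A sketch would then import burst.library.*, construct a LineBurst with "this" in setup(), and call burst() from draw().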

What are your goals for Processing in the future?

CR/BF: We're only looking as far into the future as 2.0. We're planning a 1.5 release before that, which will have two additional components. First, the current video system using QuickTime will be replaced by GStreamer. Second, Processing will become more integrated with OpenGL, which will improve the speed of apps that use OpenGL. One of Casey's former students, Andres Colubri, is the protagonist for the GStreamer and OpenGL integration. For 2.0, the text editor (and the development environment, to an extent) will be modernized to include useful features for beginners and experts. At least that's the plan; it all depends on how much time we have and the contributions of others.

We also want to focus on supporting other projects that extend Processing in different ways. We'd like to spend more time supporting people who are creating Tools and Libraries for Processing, or those who are developing versions that run with other languages (JavaScript, Python, etc.).


Image: Screengrab of a few submissions to the Tiny Sketch competition.


Considering the 200 character limit in the Tiny Sketch competition, how do constraints play a role in your own work? Do you ever impose them yourself in the creation of an artwork? What constraints emerge in working with visualizing data?

CR: I don't intentionally constrain my work, but I always feel constrained by the limits of my mind. This is one reason I write software, to remove some constraints at the expense of others. I write software to draw millions of lines in a few seconds, to make thousands of calculations and decisions in a fraction of a second, to go beyond what my mind can imagine without its digital extension. Writing software makes it easier to work with systems and to imagine detailed networks - this is my love.

BF: Sometimes constraints are a comfort because they provide structure for a project by reducing the space of possibilities, which also adds more of a problem-solving aspect to it. I tend to enjoy constraints like the Tiny Sketch competition, but despise constraints when I have to spend time on larger projects worrying about CPU speed, particularly when working with data. Optimization can be a relaxing exercise like solving a puzzle, but it's frustrating to know that a faster machine is around the corner, and that it's distracting you from improving the project at hand.
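For a sense of the 200-character scale, here is a throwaway sketch written for this edit in that spirit (not an actual Tiny Sketch entry); stripped of line breaks and comments, it comes to well under 200 characters:

    // a scribble that wanders out from the center of the default 100 x 100 window
    float x = 50, y = 50;

    void draw() {
      x += random(-3, 3);
      y += random(-3, 3);
      line(50, 50, x, y);
    }

Because the background is never cleared, each frame's line accumulates into a dense radial scribble.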

Daniel Shiffman works as an Assistant Arts Professor at the Interactive Telecommunications Program at NYU’s Tisch School of the Arts. Originally from Baltimore, Daniel received a BA in Mathematics and Philosophy from Yale University and a Master's Degree from ITP. He is the author of Learning Processing: A Beginner’s Guide to Programming Images, Animation, and Interaction. For more information, visit www.shiffman.net.