Alexander Galloway
Since the beginning
Works in New York, New York United States of America

BIO
Alex works with RSG. Projects include the surveillance tool "Carnivore," "Low Level All Stars," a DVD collection of C64 intros, and the computer game Kriegspiel.
DISCUSSION

"Protocol"--Excerpt from Chapter 7 "Internet Art"


"Protocol: How Control Exists After Decentralization"
Excerpt from Chapter 7 "Internet Art":

Let me now take a closer look at Internet art by examining some of its
specific aesthetic qualities. The Internet's early autonomous
communities were the first space where pure network aesthetics (Web site
specificity) emerged--email lists like 7-11, Nettime, recode, Rhizome,
and Syndicate.

Primitive signs were seen in early net.art projects, such as Alexei
Shulgin's Refresh, an art project consisting of nothing but links
between Web pages. Refresh involves many different organizations working
together, using many different computers all around the world. In
Refresh a chain of Web pages is created. Each page is programmed to link
automatically (on a 10-second delay) to the next Web page in the chain.
Shulgin describes the project as "A Multi-Nodal Web-Surf-Create-Session
for an Unspecified Number of Players." Anyone can collaborate in the
project by slipping his or her own page into the chain of refreshes. The
user may load any Web page in the chain, and then watch as a new Web
site appears every several seconds like a slide show.
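The mechanism behind such a chain is the ordinary client-side refresh
directive of HTML. What follows is a minimal sketch, not Shulgin's own
code, written in Python for illustration; the next-node URL is a
hypothetical placeholder.

    # One node in a Refresh-style chain: an HTML page that forwards the
    # visitor to the next page in the chain after a 10-second delay.
    NEXT_PAGE = "http://example.org/next-node.html"  # hypothetical next node

    page = f"""<html>
    <head>
      <!-- wait 10 seconds, then load the next node in the chain -->
      <meta http-equiv="refresh" content="10; url={NEXT_PAGE}">
    </head>
    <body>
      <p>One node in the chain. Next page in 10 seconds...</p>
    </body>
    </html>"""

    with open("node.html", "w") as f:
        f.write(page)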

In this way, Refresh was one of the first works to render the network in
an artistic way--as a painter renders a landscape or a sculptor renders
a physical form. The art exists "out there" in the network, not on any
individual Web page in the chain. Refresh made visible a virtual network
of collaboration that was not based on individual content. Shulgin's
work spatializes the Web. It turns the Internet, and protocol with it,
into a sculpture. [...]

While Shulgin's work is highly conceptual, more formal work was also
produced in this period. Perhaps the best example of formal work is from
the European duo Jodi. For several years Jodi has refined a formal style
by making computers both the subject and content of their art making.
Focusing specifically on those places where computers break down, Jodi
derives a positive computer aesthetic by examining its negative, its
point of collapse.

For example, in Jodi's work 404, which alludes to the Web's ubiquitous
"file not found" 404 error code (which is built into Berners-Lee's HTTP
protocol), the artists use the default fonts and simple colors available
to primitive Web browsers. 404 is a collection of pages where users can
post text messages and see what other users have written. But this
simple bulletin board system becomes confused as the input text is
pushed through various distorting filters before being added to the Web
page for general viewing. The result is a rather curious collection of
bathroom-wall scrawl that foregrounds the protocols of the Web page
itself, rather than trying to cover over the technology with pleasing
graphics or a deliberate design.
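Jodi's actual filters are not specified in this excerpt; the Python
sketch below is a hypothetical stand-in that shows only the general
mechanism: posted text is distorted before being appended to the shared
page.

    # A hypothetical distorting filter (Jodi's real filters are not
    # documented here): posted text is mangled before being stored.
    posts = []

    def distort(text: str) -> str:
        # One possible distortion: replace every vowel with an error glyph.
        return "".join("%" if c in "aeiouAEIOU" else c for c in text)

    def post_message(text: str) -> None:
        posts.append(distort(text))

    post_message("hello from the bulletin board")
    print(posts)  # ['h%ll% fr%m th% b%ll%t%n b%%rd']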

The 404 error code has also been used by other artists. Lisa Jevbratt's
"Non-Site Gallery" opens up the dead end of the 404 error page. She
transforms the 404 message into a generative doorway, where the
requested page is generated on the fly, as if it had always existed for
the user and was not the result of a mistake.
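The move can be sketched in a few lines of Python. This is not
Jevbratt's code, only an illustration of the idea: a server that never
answers 404, but instead fabricates a page for whatever path was
requested.

    # A server that never returns 404: any requested path yields a page
    # generated on the fly, as if it had always existed.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class GenerativeHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = (f"<html><body><h1>{self.path}</h1>"
                    "<p>This page was generated the moment you asked for it.</p>"
                    "</body></html>")
            self.send_response(200)  # success, never "file not found"
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body.encode("utf-8"))

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), GenerativeHandler).serve_forever()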

The 404 error code was also used in a more conceptual sense by the
Electronic Disturbance Theater (EDT).
As part of its virtual sit-ins the EDT have created software that sends
out Web requests for nonexistent Web pages on remote servers embedded
with special messages--addresses in the form of
www.server.com/__special_message__. Since the Web pages do not exist on
the remote server (and were never intended to exist), an error message
is immediately generated by the server and returned to the EDT software.

However--and this is the trick--since Web servers record all traffic to
their Web site including errors, the error acts like a Trojan horse and
the "special message" is recorded in the remote server's log book along
with the rest of its Web traffic. This accomplishes the difficult task
of actually uploading a certain specified piece of information to the
server of one's choice (albeit in a rather obscure, unthreatening
location). As the messages pass from the protester to the protested
site, a relationship is created between the local user and the remote
server, like a type of virtual sculpture.
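The technique is easy to reproduce. Below is a minimal sketch in
Python; the target host is a hypothetical placeholder, and the log line
shown in the comment is the common Apache format, given only as an
example.

    # Request a page that does not exist, with the message embedded in
    # the URL path. The server answers 404, but its access log now
    # contains the message.
    import urllib.error
    import urllib.request

    target = "http://www.example.com/__special_message__"  # hypothetical

    try:
        urllib.request.urlopen(target)
    except urllib.error.HTTPError as e:
        # Expected: a 404 "file not found" error. The message has
        # nonetheless been written into the remote server's log, e.g.:
        #   203.0.113.9 - - [27/Feb/2004] "GET /__special_message__ HTTP/1.1" 404 209
        print(e.code)  # 404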

While the artwork may offer little aesthetic gratification, it has
importance as a conceptual artwork. It moves the moment of art making
outside the aesthetic realm and into the invisible space of protocols:
Web addresses and server error messages.

As work from the EDT suggests, Internet conceptualism is often achieved
through a spatialization of the Web. It turns protocol into a sculpture.
As the Internet changes, expanding its complex digital mass, one sees
that the Web itself is a type of art object--a basis for myriad artistic
projects. It is a space in which the distinction between art and not art
becomes harder and harder to see. It is a space that offers itself up as
art. [...]

The Web Stalker is also a good example of the conceptual nature of
Internet art. It is an alternate browser that offers a completely
different interface for moving through pages on the Web. The Web Stalker
takes the idea of the visual browser (e.g., Netscape Navigator or
Internet Explorer) and turns it on its head. Instead of showing the art
on the Web through interpreting HTML and displaying in-line images, it
exhibits the Web itself as art through a making-visible of its latent
structure. The user opens a Web address, then watches as the Stalker
spits back the HTML source for that address. In a parallel window the
Web Stalker exhaustively maps each page linked from that URL,
exponentially enlarging the group of scanned pages and finally pushing
an entire set of interlinked pages to the user. The pages are mapped in
a deep, complex hypertextual relation.
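The Stalker's mapping behavior can be approximated in a few dozen lines
of Python. The sketch below is not the original software (which was a
standalone application), only an illustration of the principle: fetch
raw HTML, extract the links, and recurse, printing the link structure
instead of rendering the page. The starting URL is a hypothetical
placeholder.

    # Map the link structure outward from a starting page, rather than
    # rendering the page itself.
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def links_of(url):
        try:
            html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            return []  # unreachable link: a dead end in the map
        parser = LinkExtractor()
        parser.feed(html)
        return [urljoin(url, href) for href in parser.links]

    def stalk(start, depth=2, seen=None):
        seen = seen if seen is not None else set()
        if depth == 0 or start in seen:
            return
        seen.add(start)
        for link in links_of(start):
            print(start, "->", link)
            stalk(link, depth - 1, seen)

    stalk("http://example.org/")  # hypothetical starting point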

The Web Stalker doesn't produce art but, in Matthew Fuller's words,
"produces a relationship to art." The Stalker slips into a new category,
the "not-just-art" that exists when revolutionary thinking is
supplemented by aesthetic production.

Let me now propose a simple periodization that will help readers
understand Internet art practice from 1995 to the present. Early
Internet art--the highly conceptual phase known as "net.art"--is
concerned primarily with the network, while later Internet art--what can
be called the corporate or commercial phase--has been concerned
primarily with software. This is the consequence of a rather dramatic
change in the nature of art making concurrent with the control societies
and protocological media discussed throughout this book.

The first phase, net.art, is a dirty aesthetic deeply limited, but also
facilitated, by the network. The network's primary limitation is the
limitation on bandwidth (the speed at which data can travel), but other
limitations also exist such as the primitive nature of simple network
protocols like HTML. Because of this, one sees a type of art making that
is a mapping of the network's technological limitations and failures--as
the wasp is a map of the orchid on which it alights, to use Deleuze and
Guattari's expression. Examples include Jodi, Olia Lialina, Heath
Bunting, Alexei Shulgin, Vuk Cosic, and many others. Net.art is a very
exciting aesthetic, full of creativity and interesting conceptual moves.

Yet this first phase may already be coming to an end. Tilman Baumgärtel
recently observed that it is "the end of an era. The first formative
period of net culture seems to be over." He is referring to a series of
years from 1995 to 1999 when the genre of net.art was first developed.
In this period, due to prominent technical constraints such as bandwidth
and computer speed, many artists were forced to turn toward conceptual
uses of the Internet that were not hindered by these technical
constraints, or, in fact, made these constraints the subject of the work.
All art media involve constraints, and through these constraints
creativity is born. Net.art is low bandwidth through and through. This
is visible in ASCII art, form art, HTML conceptualism--anything that can
fit quickly and easily through a modem.

But this primary limitation has now begun to disappear. Today Internet
art is much more influenced by the limitations of certain commercial
contexts. These contexts can take many different forms, from commercial
animation suites such as Flash, to the genre of video gaming (a
fundamentally commercial genre), to the corporate aesthetic seen in the
work of RTMark, Etoy, and others. My argument is aesthetic, not
economic. Thus, it is not a question of "selling out" but rather of
moving to a new artistic playing field. As computers and network
bandwidth improved during the late 1990s, the primary physical reality
that governed the aesthetic space of net.art began to fall away. Taking
its place is the more commercial context of software, what may be seen
as a new phase in Internet art.

[Excerpt reprinted with the permission of The MIT Press.]

----

"Protocol: How Control Exists After Decentralization"
by Alexander R. Galloway
The MIT Press (March, 2004), 248 pages, ISBN 0262072475

book homepage: http://mitpress.mit.edu/protocol
table of contents: http://homepages.nyu.edu/~ag111/Protocol-contents.doc
amazon page: http://www.amazon.com/exec/obidos/ASIN/0262072475

DISCUSSION

"Protocol"--Excerpt from Chapter 6 "Tactial Media"


"Protocol: How Control Exists After Decentralization"
Excerpt from Chapter 6 "Tactical Media":

Arquilla and Ronfeldt coined the term netwar, which they define as "an
emerging mode of conflict (and crime) at societal levels, short of
traditional military warfare, in which the protagonists use network
forms of organization and related doctrines, strategies, and
technologies attuned to the information age."

Throughout the years new diagrams (also called graphs or organizational
designs) have appeared as solutions or threats to existing ones.
Bureaucracy is a diagram. Hierarchy is one too, as is peer-to-peer.
Designs come and go, serving as useful asset managers at one historical
moment, then disappearing, or perhaps fading only to reemerge later as
useful again. The Cold War was synonymous with a specific military
diagram--bilateral symmetry, mutual assured destruction (MAD),
massiveness, might, containment, deterrence, negotiation; the war
against drugs has a different diagram--multiplicity, specificity, law
and criminality, personal fear, public awareness.

This book is largely about one specific diagram, or organizational
design, called distribution, and its approximate relationship to a
larger historical transformation involving digital computers and
ultimately the control mechanism called protocol.

In this diagrammatic narrative it is possible to pick sides and describe
one diagram as the protagonist and another as the antagonist. Thus the
rhizome is thought to be the solution to the tree, the wildcat strike
the solution to the boss's control, Toyotism the solution to
institutional bureaucracy, and so on. Alternately, terrorism is thought
to be the only real threat to state power, the homeless punk rocker a
threat to sedentary domesticity, the guerrilla a threat to the war
machine, the temporary autonomous zone a threat to hegemonic culture,
and so on.

This type of conflict is in fact a conflict between different social
structures, for the terrorist threatens not only through fear and
violence, but specifically through the use of a cellular organizational
structure, a distributed network of secretive combatants, rather than a
centralized organizational structure employed by the police and other
state institutions. Terrorism is a sign that we are in a transitional
moment in history. (Could there ever be anything else?) It signals that
historical actors are not in a relationship of equilibrium, but are
instead grossly mismatched.

It is often observed that, due largely to the original comments of
networking pioneer Paul Baran, the Internet was invented to avoid
certain vulnerabilities of nuclear attack. In Baran's original vision,
the organizational design of the Internet involved a high degree of
redundancy, such that destruction of a part of the network would not
threaten the viability of the network as a whole. After World War II,
strategists called for moving industrial targets outside urban cores in
a direct response to fears of nuclear attack. Peter Galison calls this
dispersion the "constant vigilance against the re-creation of new
centers." These are the same centers that Baran derided as an "Achilles'
heel" and that he longed to purge from the telecommunications network.

"City by city, country by country, the bomb helped drive dispersion,"
Galison continues, highlighting the power of the A-bomb to drive the
push toward distribution in urban planning. Whereas the destruction of a
fleet of Abrams tanks would certainly impinge upon army battlefield
maneuvers, the destruction of a rack of Cisco routers would do little to
slow down broader network communications. Internet traffic would simply
find a new route, thus circumventing the downed machines.

(In this way, destruction must be performed absolutely, or not at all.
"The only way to stop Gnutella," comments WiredPlanet CEO Thomas Hale on
the popular file sharing protocol, "is to turn off the Internet." And
this is shown earlier in my examination of protocol's high penalties
levied against deviation. One is completely compatible with a protocol,
or not at all.)

Thus the Internet can survive attacks not because it is stronger than
the opposition, but precisely because it is weaker. The Internet has a
different diagram than a nuclear attack does; it is in a different
shape. And that new shape happens to be immune to the older one.

All the words used to describe the World Trade Center after the attacks
of September 11, 2001, revealed its design vulnerabilities vis-à-vis
terrorists: It was a tower, a center, an icon, a pillar, a hub.
Conversely, terrorists are always described with a different vocabulary:
They are cellular, networked, modular, and nimble. Groups like Al Qaeda
specifically promote a modular, distributed structure based on small
autonomous groups. They write that new recruits "should not know one
another," and that training sessions should be limited to "7

DISCUSSION

book party tomorrow


hey nyc rhizomers... please come to my book party tomorrow if yer free!
-ag

+ + +

Come celebrate the release of two new books on new media:

"Protocol: How Control Exists After Decentralization"
by Alexander R. Galloway

"First Person: New Media as Story, Performance, and Game"
edited by Noah Wardrip-Fruin (and Pat Harrigan)

when: Friday, Feb. 27, 6:00pm
where: Japanese Room, ITP, 721 Broadway, 4th floor, New York City.

refreshments, book signing, Q&A w/ the authors, the works!

Sponsored by the NYU Department of Culture and Communication, NYU's
Interactive Telecommunications Program, and The MIT Press.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +

BLURBS & BIOS:

"Protocol: How Control Exists After Decentralization"

by Alexander R. Galloway

Is the Internet a vast arena of unrestricted communication and freely
exchanged information or a regulated, highly structured virtual
bureaucracy? In "Protocol" Alexander R. Galloway argues that the
founding principle of the Net is control, not freedom, and that the
controlling power lies in the technical protocols that make network
connections (and disconnections) possible. He does this by treating
computers as a textual medium that is based on a technological language,
code. Code, he argues, can be subject to the same kind of cultural and
literary analysis as any natural language; computer languages have their
own syntax, grammar, communities, and cultures. He doesn't rely on
established theoretical approaches, but finds a new way to write about
digital media, drawing on his background in computer programming and
critical theory. "Discipline-hopping is a necessity when it comes to
complicated socio-technical topics like protocol," he writes in the
preface. Galloway begins by examining the types of protocols that exist,
including TCP/IP, DNS, and HTML. He then looks at examples of resistance
and subversion--hackers, viruses, cyberfeminism, Internet art--which he
views as emblematic of the larger transformations now taking place
within digital culture. Written for a non-technical audience, "Protocol"
serves as a necessary counterpoint to the wildly utopian visions of the
Net that were so widespread in earlier days.

cover image: http://homepages.nyu.edu/~ag111/cover.jpg

bio:

Alex Galloway teaches in the Media Ecology program at New York
University. Galloway previously worked for several years as editor of
Rhizome.org. He is a founding member of RSG, the software development
group behind the data surveillance platform Carnivore. His first book,
Protocol, is published by the MIT Press.

+ + +

"First Person: New Media as Story, Performance, and Game"

edited by Noah Wardrip-Fruin (with Pat Harrigan)

Electronic games have established a huge international market,
significantly outselling non-digital games; people spend more money on
The Sims than on "Monopoly" or even on "Magic: the Gathering." Yet it is
widely believed that the market for electronic literature--predicted by
some to be the future of the written word--languishes. Even bestselling
author Stephen King achieved disappointing results with his online
publication of "Riding the Bullet" and "The Plant." Isn't it possible,
though, that many hugely successful computer games--those that depend
on or at least utilize storytelling conventions of narrative, character,
and theme--can be seen as examples of electronic literature? And isn't
it likely that the truly significant new forms of electronic literature
will prove to be (like games) so deeply interactive and procedural that
it would be impossible to present them as paper-like "e-books"? The
editors of First Person have gathered a remarkably diverse group of new
media theorists and practitioners to consider the relationship between
"story" and "game," as well as the new kinds of artistic creation
(literary, performative, playful) that have become possible in the
digital environment.

cover image: http://hyperfiction.org/graphics/firstPerson-large.jpg

bio:

Noah Wardrip-Fruin writes for and about new media. In addition to First
Person, he is also coeditor of The New Media Reader (with Nick Montfort)
published last year by the MIT Press. His artwork includes The
Impermanence Agent (a storytelling web agent that "customizes" based on
reader browsing habits) and Screen (an immersive VR text that interacts
with the reader's body). He is a Director of the Electronic Literature
Organization.

DISCUSSION

book excerpt: "Protocol: How Control Exists After Decentralization"


rhizomers..

i wanted to post some excerpts from my new book which i'm very excited
about.. The book is about computer networks and the concept of
"protocol" that ties the networks together. i also have chapters on net
art, tactical media, and hackers. more to come in a couple weeks!

best,

-ag

+ + +

"Protocol: How Control Exists After Decentralization"
by Alexander R. Galloway
The MIT Press (March, 2004), 248 pages, ISBN 0262072475

book homepage: http://mitpress.mit.edu/protocol
table of contents: http://homepages.nyu.edu/~ag111/Protocol-contents.doc
amazon page: http://www.amazon.com/exec/obidos/ASIN/0262072475

---

Excerpt from the "Introduction":

This book is about a diagram, a technology, and a management style. The
diagram is the distributed network, a structural form without center
that resembles a web or meshwork. The technology is the digital
computer, an abstract machine able to perform the work of any other
machine (provided it can be described logically). The management style
is protocol, the principle of organization native to computers in
distributed networks. All three come together to define a new apparatus
of control that has achieved importance at the start of the new
millennium.

Much work has been done recently on theorizing the present historical
moment and on offering periodizations to explain its historical
trajectory. I am particularly inspired by five pages from Gilles
Deleuze, "Postscript on Control Societies," which begin to define a
chronological period after the modern age that is founded neither on the
central control of the sovereign nor on the decentralized control of the
prison or the factory. My book aims to flesh out the specificity of this
third historical wave by focusing on the controlling computer
technologies native to it.

How would control exist after decentralization? In former times control
was a little easier to explain. In what Michel Foucault called the
sovereign societies of the classical era, characterized by centralized
power and sovereign fiat, control existed as an extension of the word
and deed of the master, assisted by violence and other coercive factors.
Later, the disciplinary societies of the modern era took hold, replacing
violence with more bureaucratic forms of command and control.

Deleuze has extended this periodization into the present day by
suggesting that after the disciplinary societies come the societies of
control. Deleuze believed that there exist wholly new technologies
concurrent with the societies of control. "The old sovereign societies
worked with simple machines, levers, pulleys, clocks," he writes, "but
recent disciplinary societies were equipped with thermodynamic
machines... control societies operate with a third generation of
machines, with information technology and computers." Just as Marx
rooted his economic theory in a strict analysis of the factory's
productive machinery, Deleuze heralds the coming productive power of
computers to explain the sociopolitical logics of our own age.

According to Critical Art Ensemble (CAE), the shift from disciplinary
societies to control societies goes something like this:

"Before computerized information management, the heart of
institutional command and control was easy to locate. In fact, the
conspicuous appearance of the halls of power was used by regimes to
maintain their hegemony.... Even though the monuments of power still
stand, visibly present in stable locations, the agency that
maintains power is neither visible nor stable. Power no longer
permanently resides in these monuments, and command and control now
move about as desired."

The most extensive "computerized information management" system existing
today is the Internet. The Internet is a global distributed computer
network. It has its roots in the American academic and military culture
of the 1950s and 1960s. In the late 1950s, in response to the Soviet
Sputnik launch and other fears connected to the Cold War, Paul Baran at
the Rand Corporation decided to create a computer network that was
independent of centralized command and control, and would thus be able
to withstand a nuclear attack that targets such centralized hubs. In
August 1964, he published an eleven-volume memorandum for the Rand
Corporation outlining his research.

Baran's network was based on a technology called packet-switching that
allows messages to break themselves apart into small fragments. Each
fragment, or packet, is able to find its own way to its destination.
Once there, the packets reassemble to create the original message. In
1969, the Advanced Research Projects Agency (ARPA) at the U.S.
Department of Defense started the ARPAnet, the first network to use
Baran's packet-switching technology. The ARPAnet allowed academics to
share resources and transfer files. In its early years, the ARPAnet
(later renamed DARPAnet) existed unnoticed by the outside world, with
only a few hundred participating computers, or "hosts."
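Packet switching is simple to simulate. The Python sketch below
illustrates only the principle, not any real routing protocol: the
message is cut into numbered fragments, the fragments travel
independently (simulated here by shuffling them), and the destination
restores the original order.

    import random

    def packetize(message, size=8):
        # Break the message into numbered fragments ("packets").
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # The destination restores the original order from the numbers.
        return "".join(data for _, data in sorted(packets))

    packets = packetize("how control exists after decentralization")
    random.shuffle(packets)     # each packet finds its own way
    print(reassemble(packets))  # the original message, intact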

All addressing for this network was maintained by a single machine
located at the Stanford Research Institute in Menlo Park, California. By
1984 the network had grown larger. Paul Mockapetris invented a new
addressing scheme, this one decentralized, called the Domain Name System
(DNS).
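What DNS provides can be seen with a single call in Python: a name is
resolved to a numeric address by a distributed hierarchy of name
servers rather than looked up in one central table.

    import socket

    # Resolve a human-readable name to a numeric IP address via DNS.
    print(socket.gethostbyname("mitpress.mit.edu"))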

The computers had changed also. By the late 1970s and early 1980s
personal computers were coming to market and appearing in homes and
offices. In 1977, researchers at Berkeley released the highly
influential "BSD" flavor of the UNIX operating system, which was
available to other institutions at virtually no cost. With the help of
BSD, UNIX would become the most important computer operating system of
the 1980s.

In the early 1980s, the suite of protocols known as TCP/IP (Transmission
Control Protocol/Internet Protocol) was also developed and included with
most UNIX servers. TCP/IP allowed for cheap, ubiquitous connectivity. In
1988, the Defense department transferred control of the central
"backbone" of the Internet over to the National Science Foundation, who
in turn transferred control to commercial telecommunications interests
in 1995. In that year, there were 24 million Internet users. Today, the
Internet is a global distributed network connecting billions of people
around the world.

At the core of networked computing is the concept of protocol. A
computer protocol is a set of recommendations and rules that outline
specific technical standards. The protocols that govern much of the
Internet are contained in what are called RFC (Request For Comments)
documents. Called "the primary documentation of the Internet," these
technical memoranda detail the vast majority of standards and protocols
in use on the Internet today.

The RFCs are published by the Internet Engineering Task Force (IETF).
They are freely available and used predominantly by engineers who wish
to build hardware or software that meets common specifications. The IETF
is affiliated with the Internet Society, an altruistic, technocratic
organization that wishes "[t]o assure the open development, evolution
and use of the Internet for the benefit of all people throughout the
world." Other protocols are developed and maintained by other
organizations. For example, many of the protocols used on the World Wide
Web (a network within the Internet) are governed by the World Wide Web
Consortium (W3C). This international consortium was created in October
1994 to develop common protocols such as Hypertext Markup Language
(HTML) and Cascading Style Sheets. Scores of other protocols have been
created for a variety of other purposes by many different professional
societies and organizations. They are covered in more detail in chapter
4 [on "Institutionalization"].

Protocol is not a new word. Prior to its usage in computing, protocol
referred to any type of correct or proper behavior within a specific
system of conventions. It is an important concept in the area of social
etiquette as well as in the fields of diplomacy and international
relations. Etymologically it refers to a fly-leaf glued to the beginning
of a document, but in familiar usage the word came to mean any
introductory paper summarizing the key points of a diplomatic agreement
or treaty.

However, with the advent of digital computing, the term has taken on a
slightly different meaning. Now, protocols refer specifically to
standards governing the implementation of specific technologies. Like
their diplomatic predecessors, computer protocols establish the
essential points necessary to enact an agreed-upon standard of action.
Like their diplomatic predecessors, computer protocols are vetted
between negotiating parties and then materialized in the real world by
large populations of participants (in one case citizens, and in the
other computer users). Yet instead of governing social or political
practices as did their diplomatic predecessors, computer protocols
govern how specific technologies are agreed to, adopted, implemented,
and ultimately used by people around the world. What was once a question
of consideration and sense is now a question of logic and physics.

To help understand the concept of computer protocols, consider the
analogy of the highway system. Many different combinations of roads are
available to a person driving from point A to point B. However, en route
one is compelled to stop at red lights, stay between the white lines,
follow a reasonably direct path, and so on. These conventional rules
that govern the set of possible behavior patterns within a heterogeneous
system are what computer scientists call protocol. Thus, protocol is a
technique for achieving voluntary regulation within a contingent
environment.

These regulations always operate at the level of coding--they encode
packets of information so they may be transported; they code documents
so they may be effectively parsed; they code communication so local
devices may effectively communicate with foreign devices. Protocols are
highly formal; that is, they encapsulate information inside a
technically defined wrapper, while remaining relatively indifferent to
the content of information contained within. Viewed as a whole, protocol
is a distributed management system that allows control to exist within a
heterogeneous material milieu.
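Encapsulation of this kind can be sketched in a few lines of Python.
The wrapper below is a made-up four-byte length header, not any actual
wire protocol, but it shows the formal indifference at issue: the
header records only the payload's size, never its meaning.

    import struct

    def encapsulate(payload: bytes) -> bytes:
        # The wrapper records only the payload's length, not its content.
        return struct.pack("!I", len(payload)) + payload

    def decapsulate(frame: bytes) -> bytes:
        (length,) = struct.unpack("!I", frame[:4])
        return frame[4:4 + length]

    frame = encapsulate(b"any content whatsoever")
    assert decapsulate(frame) == b"any content whatsoever"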

It is common for contemporary critics to describe the Internet as an
unpredictable mass of data--rhizomatic and lacking central organization.
This position states that since new communication technologies are based
on the elimination of centralized command and hierarchical control, it
follows that the world is witnessing a general disappearance of control
as such.

This could not be further from the truth. I argue in this book that
protocol is how technological control exists after decentralization. The
"after" in my title refers to both the historical moment after
decentralization has come into existence, but also--and more
important--the historical phase after decentralization, that is, after
it is dead and gone, replaced as the supreme social management style by
the diagram of distribution.

[Excerpt reprinted with the permission of The MIT Press.]

http://mitpress.mit.edu/protocol
http://homepages.nyu.edu/~ag111/Protocol-contents.doc
http://www.amazon.com/exec/obidos/ASIN/0262072475
