book excerpt: "Protocol: How Control Exists After Decentralization"


i wanted to post some excerpts from my new book, which i'm very excited
about. The book is about computer networks and the concept of
"protocol" that ties the networks together. i also have chapters on net
art, tactical media, and hackers. more to come in a couple weeks!



+ + +

"Protocol: How Control Exists After Decentralization"
by Alexander R. Galloway
The MIT Press (March, 2004), 248 pages, ISBN 0262072475

book homepage:
table of contents:
amazon page:

Excerpt from the "Introduction":

This book is about a diagram, a technology, and a management style. The
diagram is the distributed network, a structural form without center
that resembles a web or meshwork. The technology is the digital
computer, an abstract machine able to perform the work of any other
machine (provided it can be described logically). The management style
is protocol, the principle of organization native to computers in
distributed networks. All three come together to define a new apparatus
of control that has achieved importance at the start of the new
millennium.

Much work has been done recently on theorizing the present historical
moment and on offering periodizations to explain its historical
trajectory. I am particularly inspired by five pages from Gilles
Deleuze, "Postscript on Control Societies," which begin to define a
chronological period after the modern age that is founded neither on the
central control of the sovereign nor on the decentralized control of the
prison or the factory. My book aims to flesh out the specificity of this
third historical wave by focusing on the controlling computer
technologies native to it.

How would control exist after decentralization? In former times control
was a little easier to explain. In what Michel Foucault called the
sovereign societies of the classical era, characterized by centralized
power and sovereign fiat, control existed as an extension of the word
and deed of the master, assisted by violence and other coercive factors.
Later, the disciplinary societies of the modern era took hold, replacing
violence with more bureaucratic forms of command and control.

Deleuze has extended this periodization into the present day by
suggesting that after the disciplinary societies come the societies of
control. Deleuze believed that there exist wholly new technologies
concurrent with the societies of control. "The old sovereign societies
worked with simple machines, levers, pulleys, clocks," he writes, "but
recent disciplinary societies were equipped with thermodynamic
machines… control societies operate with a third generation of
machines, with information technology and computers." Just as Marx
rooted his economic theory in a strict analysis of the factory's
productive machinery, Deleuze heralds the coming productive power of
computers to explain the sociopolitical logics of our own age.

According to Critical Art Ensemble (CAE), the shift from disciplinary
societies to control societies goes something like this:

"Before computerized information management, the heart of
institutional command and control was easy to locate. In fact, the
conspicuous appearance of the halls of power was used by regimes to
maintain their hegemony…. Even though the monuments of power still
stand, visibly present in stable locations, the agency that
maintains power is neither visible nor stable. Power no longer
permanently resides in these monuments, and command and control now
move about as desired."

The most extensive "computerized information management" system existing
today is the Internet. The Internet is a global distributed computer
network. It has its roots in the American academic and military culture
of the 1950s and 1960s. In the late 1950s, in response to the Soviet
Sputnik launch and other fears connected to the Cold War, Paul Baran at
the Rand Corporation decided to create a computer network that was
independent of centralized command and control, and would thus be able
to withstand a nuclear attack that targets such centralized hubs. In
August 1964, he published an eleven-volume memorandum for the Rand
Corporation outlining his research.

Baran's network was based on a technology called packet-switching that
allows messages to break themselves apart into small fragments. Each
fragment, or packet, is able to find its own way to its destination.
Once there, the packets reassemble to create the original message. In
1969, the Advanced Research Projects Agency (ARPA) at the U.S.
Department of Defense started the ARPAnet, the first network to use
Baran's packet-switching technology. The ARPAnet allowed academics to
share resources and transfer files. In its early years, the ARPAnet
(later renamed DARPAnet) existed unnoticed by the outside world, with
only a few hundred participating computers, or "hosts."
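
The mechanism described above can be sketched in a few lines. This is
only a toy illustration of the packet-switching idea, not Baran's
actual design; the packet size and message are arbitrary choices.

```python
import random

MTU = 10  # toy maximum fragment size, in characters (an arbitrary choice)

def packetize(message: str) -> list[tuple[int, str]]:
    """Break a message into small numbered fragments, or "packets"."""
    return [(seq, message[i:i + MTU])
            for seq, i in enumerate(range(0, len(message), MTU))]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the original message, whatever order the packets arrived in."""
    return "".join(frag for _, frag in sorted(packets))

packets = packetize("how control exists after decentralization")
random.shuffle(packets)  # each packet finds its own way, so arrival order varies
assert reassemble(packets) == "how control exists after decentralization"
```

The sequence number carried by each fragment is what lets the
destination restore the whole, no matter which route each piece took.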

All addressing for this network was maintained by a single machine
located at the Stanford Research Institute in Menlo Park, California. By
1984 the network had grown larger. Paul Mockapetris invented a new
addressing scheme, this one decentralized, called the Domain Name
System (DNS).
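
The contrast between the two schemes can be sketched as follows. This
is a drastically simplified model of hierarchical name resolution, not
the real DNS protocol; the zone names and the address are hypothetical.

```python
# Each "zone" knows only its own children; unlike the earlier single host
# table at Stanford, no one machine holds every name on the network.
ZONES = {
    ".":          {"edu": "ns.edu"},          # root delegates "edu"
    "ns.edu":     {"mit": "ns.mit.edu"},      # "edu" delegates "mit"
    "ns.mit.edu": {"press": "18.0.0.1"},      # hypothetical final address
}

def resolve(name: str) -> str:
    """Walk the hierarchy right to left, following one delegation per label."""
    server = "."
    for label in reversed(name.split(".")):
        server = ZONES[server][label]
    return server

assert resolve("press.mit.edu") == "18.0.0.1"
```

Because each zone can be administered independently, the addressing
system as a whole has no central point of maintenance or failure.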

The computers had changed also. By the late 1970s and early 1980s
personal computers were coming to market and appearing in homes and
offices. In 1977, researchers at Berkeley released the highly
influential "BSD" flavor of the UNIX operating system, which was
available to other institutions at virtually no cost. With the help of
BSD, UNIX would become the most important computer operating system of
the 1980s.

In the early 1980s, the suite of protocols known as TCP/IP (Transmission
Control Protocol/Internet Protocol) was also developed and included with
most UNIX servers. TCP/IP allowed for cheap, ubiquitous connectivity. In
1988, the Defense Department transferred control of the central
"backbone" of the Internet over to the National Science Foundation,
which in turn transferred control to commercial telecommunications
interests
in 1995. In that year, there were 24 million Internet users. Today, the
Internet is a global distributed network connecting billions of people
around the world.

At the core of networked computing is the concept of protocol. A
computer protocol is a set of recommendations and rules that outline
specific technical standards. The protocols that govern much of the
Internet are contained in what are called RFC (Request For Comments)
documents. Called "the primary documentation of the Internet," these
technical memoranda detail the vast majority of standards and protocols
in use on the Internet today.

The RFCs are published by the Internet Engineering Task Force (IETF).
They are freely available and used predominantly by engineers who wish
to build hardware or software that meets common specifications. The IETF
is affiliated with the Internet Society, an altruistic, technocratic
organization that wishes "[t]o assure the open development, evolution
and use of the Internet for the benefit of all people throughout the
world." Other protocols are developed and maintained by other
organizations. For example, many of the protocols used on the World Wide
Web (a network within the Internet) are governed by the World Wide Web
Consortium (W3C). This international consortium was created in October
1994 to develop common protocols such as Hypertext Markup Language
(HTML) and Cascading Style Sheets. Scores of other protocols have been
created for a variety of other purposes by many different professional
societies and organizations. They are covered in more detail in chapter
4 [on "Institutionalization"].

Protocol is not a new word. Prior to its usage in computing, protocol
referred to any type of correct or proper behavior within a specific
system of conventions. It is an important concept in the area of social
etiquette as well as in the fields of diplomacy and international
relations. Etymologically it refers to a fly-leaf glued to the beginning
of a document, but in familiar usage the word came to mean any
introductory paper summarizing the key points of a diplomatic agreement
or treaty.

However, with the advent of digital computing, the term has taken on a
slightly different meaning. Now, protocols refer specifically to
standards governing the implementation of specific technologies. Like
their diplomatic predecessors, computer protocols establish the
essential points necessary to enact an agreed-upon standard of action.
Like their diplomatic predecessors, computer protocols are vetted out
between negotiating parties and then materialized in the real world by
large populations of participants (in one case citizens, and in the
other computer users). Yet instead of governing social or political
practices as did their diplomatic predecessors, computer protocols
govern how specific technologies are agreed to, adopted, implemented,
and ultimately used by people around the world. What was once a question
of consideration and sense is now a question of logic and physics.

To help understand the concept of computer protocols, consider the
analogy of the highway system. Many different combinations of roads are
available to a person driving from point A to point B. However, en route
one is compelled to stop at red lights, stay between the white lines,
follow a reasonably direct path, and so on. These conventional rules
that govern the set of possible behavior patterns within a heterogeneous
system are what computer scientists call protocol. Thus, protocol is a
technique for achieving voluntary regulation within a contingent
environment.
These regulations always operate at the level of coding–they encode
packets of information so they may be transported; they code documents
so they may be effectively parsed; they code communication so local
devices may effectively communicate with foreign devices. Protocols are
highly formal; that is, they encapsulate information inside a
technically defined wrapper, while remaining relatively indifferent to
the content of information contained within. Viewed as a whole, protocol
is a distributed management system that allows control to exist within a
heterogeneous material milieu.
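
The formal indifference of the wrapper to its contents can be shown
with a minimal framing sketch. The header layout here (a one-byte type
and a two-byte length) is an invented example, not any standard wire
format.

```python
import struct

def encapsulate(payload: bytes, msg_type: int) -> bytes:
    """Wrap a payload in a fixed header: type and length.
    The wrapper is technically defined; the content inside is opaque to it."""
    return struct.pack("!BH", msg_type, len(payload)) + payload

def parse(frame: bytes) -> tuple[int, bytes]:
    """Read the header and hand back the payload, whatever it contains."""
    msg_type, length = struct.unpack("!BH", frame[:3])
    return msg_type, frame[3:3 + length]

frame = encapsulate(b"any content at all", msg_type=1)
assert parse(frame) == (1, b"any content at all")
```

The parser never inspects the payload itself; it reads only the formal
wrapper, which is what lets heterogeneous devices interoperate.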

It is common for contemporary critics to describe the Internet as an
unpredictable mass of data–rhizomatic and lacking central organization.
This position states that since new communication technologies are based
on the elimination of centralized command and hierarchical control, it
follows that the world is witnessing a general disappearance of control
as such.

This could not be further from the truth. I argue in this book that
protocol is how technological control exists after decentralization. The
"after" in my title refers to both the historical moment after
decentralization has come into existence, but also–and more
important–the historical phase after decentralization, that is, after
it is dead and gone, replaced as the supreme social management style by
the diagram of distribution.

[Excerpt reprinted with the permission of The MIT Press.]