Abstracting the Internet is a crawler program that recursively follows links across the internet, grabbing one byte from each page it visits. It outputs 100 KB files, each constructed from data drawn from 100,000 websites.
Full Description
The goal of Abstracting the Internet is to non-symbolically represent the entire internet in a simple data form. This is accomplished by a process of large-scale pseudo-meaningful data collection: "The random number generators on today's computers are pseudo-random generators. There is always a cycle or a trend, however subtle. Pseudo-random generators cannot simulate natural random processes accurately for some applications, and can not reproduce certain random effects." Given that any data collection based on a large-scale system (i.e. the internet) is at best pseudo-random, we derive particular meaning not from the specific data collected, but from the data set out of which the subset (here, the output of Abstracting the Internet) is drawn: hence pseudo-meaningful data, an abstraction reachable only through classical computer processing. The output of Abstracting the Internet is a unique trope for the architecture of the internet.
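The crawling process described above can be sketched in a few lines of Python. This is only an illustration of the technique, not the artist's original implementation; the link-extraction rule, the choice of the *first* byte of each page, and the pluggable `fetch` function are all assumptions made for the sketch.

```python
import re
from collections import deque

def extract_links(body):
    """Very rough href extraction from raw HTML bytes (an assumption;
    the original crawler's link parsing is not documented)."""
    return [m.decode("ascii", "ignore")
            for m in re.findall(rb'href="(https?://[^"]+)"', body)]

def crawl_bytes(seed_url, fetch, target=100_000):
    """Breadth-first crawl; keep one byte from each page fetched.

    `fetch` maps a URL to the page body as bytes (raising on failure),
    so the network layer stays pluggable. Stops once `target` bytes
    (100 KB by default, per the work's description) are collected.
    """
    queue = deque([seed_url])
    seen = set()
    out = bytearray()
    while queue and len(out) < target:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            body = fetch(url)
        except Exception:
            continue  # unreachable pages are skipped
        if body:
            out.append(body[0])          # one byte per page visited
            queue.extend(extract_links(body))
    return bytes(out)
```

In practice `fetch` could simply be `lambda u: urllib.request.urlopen(u, timeout=5).read()`, and the returned 100,000 bytes written out as one 100 KB file.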
ATI is the first work in the series 'Formal Foundations of Art in the Network', an attempt by the artist to develop a conceptual framework for making art in the net.
Work metadata
- Year Created: 2002
- Submitted to ArtBase: Thursday Mar 14th, 2002
- Original Url: http://www.node99.org/denature/pub/ati/
- Permalink: http://www.node99.org/denature/pub/ati/
Work Credits:
- eidolon, creator