Abstracting the Internet (2002)

by Eidolon

Abstracting the Internet is a crawler program that recurses through links on the internet, grabbing one byte from each page it visits. It outputs 100 KB files, each constructed from data drawn from 100,000 websites.

Full Description

The goal of Abstracting the Internet is to non-symbolically represent the entire internet in a simple data form. This is accomplished by a process of large-scale pseudo-meaningful data collection: "The random number generators on today's computers are pseudo-random generators. There is always a cycle or a trend, however subtle. Pseudo-random generators cannot simulate natural random processes accurately for some applications, and can not reproduce certain random effects." Therefore, given that all data collection based on large-scale systems (i.e., the internet) is at best pseudo-random, we derive particular meaning not from the specific data collected, but from the data set from which the subset (of, for example, Abstracting the Internet) is drawn: hence pseudo-meaningful data, an abstraction reachable only by classical computer processing. The output of Abstracting the Internet is a unique trope for the architecture of the internet.
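The quoted point about pseudo-randomness is easy to demonstrate: a seeded generator reproduces the same "random" sequence every time. A minimal illustration in Python (the seed value and sequence length here are arbitrary):

    import random

    # A pseudo-random generator is fully determined by its seed:
    # the same seed always yields the same "random" sequence.
    random.seed(42)
    first_run = [random.randint(0, 255) for _ in range(5)]

    random.seed(42)
    second_run = [random.randint(0, 255) for _ in range(5)]

    # True: the sequence is deterministic, not naturally random.
    print(first_run == second_run)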

ATI is the first work in the series 'Formal Foundations of Art in the Network', an attempt by the artists to develop a conceptual framework for making art on the net.
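The crawler's source is not included here. A minimal sketch of the behavior described above, assuming a simple breadth-first link crawl; the seed URL, the first-byte selection rule, and all names are assumptions, since the description specifies none of them:

    import re
    import urllib.request
    from collections import deque

    OUTPUT_SIZE = 100_000  # 100 KB: one byte from each of 100,000 pages

    def crawl_bytes(seed_url, limit=OUTPUT_SIZE):
        """Breadth-first crawl from seed_url, keeping one byte per page."""
        queue = deque([seed_url])
        seen = {seed_url}
        collected = bytearray()
        while queue and len(collected) < limit:
            url = queue.popleft()
            try:
                with urllib.request.urlopen(url, timeout=5) as resp:
                    body = resp.read()
            except Exception:
                continue  # unreachable pages are simply skipped
            if not body:
                continue
            # Grab one byte from the page; the actual selection rule
            # (first byte, random byte, etc.) is not documented, so the
            # first byte stands in here.
            collected.append(body[0])
            # Recurse through the page's links.
            for href in re.findall(rb'href="(http[^"]+)"', body):
                link = href.decode("ascii", "ignore")
                if link not in seen:
                    seen.add(link)
                    queue.append(link)
        return bytes(collected)

    if __name__ == "__main__":
        data = crawl_bytes("http://example.com/")
        with open("abstraction.bin", "wb") as f:
            f.write(data)

The breadth-first queue and the first-byte rule are arbitrary choices for illustration; the work's description says only that the crawler recurses through links and grabs one byte per page.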

