Editorial Notes: Big Data in scale.ucsd.edu and YLEM

Abstract
Artists confront the problems of data density and range in the aesthetic
of the sublime. Together with an introduction by Brett Stalbaum, these
essays by Lisa Jevbratt, Andrea Polli and Christina McPhee were first
published in print for YLEM Journal, Volume 24
Number 6, May-June 2004 (McPhee) and Volume 24 Number 8, July-August
2004 (Jevbratt & Polli), at the suggestion of Loren Means. The YLEM
Journal is the bimonthly publication of YLEM, a twenty-three-year-old
organization dedicated to the nexus of art, science, & technology. For
more information on joining YLEM and to view the YLEM Journal online,
visit www.ylem.org. The articles are co-published online in Scale,
scale.ucsd.edu (Vol. 1, Issue 6+7).

Editorial notes

Moore's Law, Gordon Moore's famous prediction that the number of
transistors on a chip doubles approximately every two years (popularly
glossed as processing power doubling every eighteen months), has proven
so prescient that it long ago rose past the status of provocative
futurist claim to the level of pedestrian cultural assumption. But what
has not yet become an
accepted cultural assumption is that Moore's law is at least matched,
and possibly exceeded by the exponential growth of data to be processed.
The relationship between humankind's ability to collect data and to
process and understand data is co-exponential: both are exploding. Data
sets from genomics, astrophysics, geography, geology, particle physics,
climatology, meteorology, nanotechnology, materials science and even the
search for extraterrestrial intelligence are producing quantities of
data that challenge the technical limits of supercomputers, distributed
computing, grid computing, and superscalar simulation techniques. Even
given Moore's
law, optical networks, and cheap mass storage, the problem of big data
is nevertheless looming larger as our ability to collect data actively
competes with our ability to process and digest it.

Computation has already become a nominal, if not tacit assumption in
contemporary art practice due to the ubiquitous implementation of
computer and communications technologies in all aspects of our emerging
global culture. How does big data impinge on the present generation of
representational artists, who operate under the assumption of a rich
computational environment? And what aesthetic and conceptual parameters
are emerging for artists who consciously recognize data and code as the
primary expressions of their practice, a practice in which the notion of
"representation" is not confined to narrowly prescribed assumptions
about a graphical or interactive interface, with networked distribution
as the primary cultural channel between artist and audience? What other
questions arise in an environment where we live in a constant streaming
wash of data, and how might artists help interpret both cultural and
scientific phenomena?

Lev Manovich raises a particularly interesting issue in his 2002 essay
titled "The Anti-Sublime Ideal in Data Art". In it, Manovich identifies
an aesthetic approach to big data that seeks to interpret large data
sets on much the same terms as designers and scientists analyze data; a
pursuit which he describes as the exact opposite of the goal of romantic
art: "If Romantic artists thought of certain phenomena and effects as
un-representable, as something which goes beyond the limits of human
senses and reason, data visualization artists aim at precisely the
opposite: to map such phenomena into a representation whose scale is
comparable to the scales of human perception and cognition." He goes on
to critique such practice, raising the question of how new media can
represent "the ambiguity, the otherness, the multi-dimensionality of our
experience… In short, rather than trying hard to pursue the
anti-sublime ideal, data visualization artists should also not forget
that art has the unique license to portray human subjectivity."

Comments

Jean Hess
