Dérive (2010)

This interactive installation invites visitors to explore 3D reconstructions of urban and natural locations that are transformed according to live environmental data collected on the web. Through the motion and position of their bodies in front of the projection, visitors interact with a space whose appearance and degree of recognizability are determined by information about local meteorological and astronomical phenomena. In conjunction with its visualization, the data transmitted by remote environmental sensors is sonified.

Full Description

This interactive installation invites the public to explore 3D models of spaces that are transformed according to live environmental data collected on the Internet. A computer vision interface enables the public to interact with a representation whose appearance and degree of recognizability are determined by information on local meteorological and astronomical phenomena. In conjunction with its visualization, the data transmitted by remote environmental sensors is sonified and spatialized.

Dérive was initiated during the Géographies variables residency program in April 2010. Inspired by the growing number of augmented mapping applications and by LiDAR and photo-based 3D scanning, the project builds models of distant locations using photogrammetry, geomatic data and 3D modeling. These 3D point clouds provide fixed XYZ coordinates that are used in a changing virtual space resembling a dynamic particle system. The appearance and positions of the points, and of the mesh that connects them, are determined by the following environmental information, as a means of evoking or simulating these phenomena.

Local time: Point size and brightness (relative to sunrise and sunset).
Temperature: Point color.
Cloudiness: Point saturation and brightness.
Wind: Point displacement reflecting speed and direction.
Visibility: Intensity of a depth-of-field effect and transparency.
Humidity: Depth-of-field focus distance and point sharpness.
Precipitation: Lines are drawn from the sky to the ground and points are destabilized.
Lunar phase: Brightness of the lines.
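As a rough illustration of how such readings could drive per-point rendering parameters, here is a minimal Python sketch. The data fields, value ranges and formulas are assumptions for illustration only and do not describe the installation's actual implementation.

```python
# Hypothetical sketch: mapping live environmental readings to per-point
# rendering parameters, loosely following the correspondences listed above.
# Field names, ranges and formulas are illustrative assumptions.
from dataclasses import dataclass
import math


@dataclass
class WeatherReading:
    local_hour: float        # 0-24, local time at the remote site
    sunrise: float           # hour of sunrise
    sunset: float            # hour of sunset
    temperature_c: float     # air temperature in degrees Celsius
    cloud_cover: float       # 0 (clear sky) .. 1 (overcast)
    wind_speed: float        # metres per second
    wind_direction: float    # degrees
    visibility_km: float     # horizontal visibility
    humidity: float          # 0 .. 1
    precipitation: float     # mm per hour
    lunar_phase: float       # 0 (new moon) .. 1 (full moon)


def point_parameters(w: WeatherReading) -> dict:
    """Translate one reading into visual parameters for the point cloud."""
    # Local time -> point size and brightness, relative to sunrise/sunset.
    daylight = 1.0 if w.sunrise <= w.local_hour <= w.sunset else 0.2
    # Temperature -> point color (cold = blue end, warm = red end), clamped.
    hue = min(max((w.temperature_c + 20.0) / 60.0, 0.0), 1.0)
    # Cloudiness -> reduced saturation and brightness.
    saturation = 1.0 - w.cloud_cover
    # Wind -> displacement vector reflecting speed and direction.
    wind_dx = w.wind_speed * math.cos(math.radians(w.wind_direction))
    wind_dy = w.wind_speed * math.sin(math.radians(w.wind_direction))
    # Visibility -> intensity of a depth-of-field effect and transparency.
    dof_intensity = 1.0 / max(w.visibility_km, 0.1)
    # Humidity -> depth-of-field focus distance and point sharpness.
    focus_distance = 10.0 * (1.0 - w.humidity)
    # Precipitation -> vertical lines from sky to ground, destabilized points.
    rain_lines = w.precipitation > 0.0
    jitter = 0.05 * w.precipitation
    # Lunar phase -> brightness of the drawn lines.
    line_brightness = w.lunar_phase
    return {
        "size": 2.0 * daylight,
        "brightness": daylight * (1.0 - 0.5 * w.cloud_cover),
        "hue": hue,
        "saturation": saturation,
        "wind_offset": (wind_dx, wind_dy),
        "dof_intensity": dof_intensity,
        "transparency": dof_intensity,
        "focus_distance": focus_distance,
        "rain_lines": rain_lines,
        "jitter": jitter,
        "line_brightness": line_brightness,
    }
```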

The quadraphonic sound synthesis process uses this data, together with atmospheric pressure, to generate an interactive sonic environment. The virtual camera's displacement through the 3D space is influenced both by the public's movements and positions and by information about the location's environmental conditions.
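To give a sense of how visitor tracking and remote weather data might jointly steer the virtual camera, the following sketch, in the same hypothetical vein as above, averages the tracked positions and lets the wind offset drift the result. The names and weights are assumptions, not the work's code.

```python
# Hypothetical sketch: blending tracked visitor positions with an
# environmental drift to produce a camera target in the virtual space.
def camera_target(visitor_positions, wind_offset, drift_scale=0.1):
    """Average the visitors' tracked positions and let the wind nudge the result."""
    if visitor_positions:
        cx = sum(p[0] for p in visitor_positions) / len(visitor_positions)
        cy = sum(p[1] for p in visitor_positions) / len(visitor_positions)
    else:
        # No one in front of the projection: keep the camera centred.
        cx, cy = 0.0, 0.0
    # The remote wind gently displaces the camera, so the distant weather
    # remains legible even when visitors stand still.
    return (cx + drift_scale * wind_offset[0],
            cy + drift_scale * wind_offset[1])
```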
