Panoptics is an immersive, realtime, interactive environment exploring the intersections of realtime data visualization, surveillance, and our connection to the data continually collected in contemporary society.
Panoptics is part realtime animation, part data visualization/data audification, and part philosophical inquiry into the value and pitfalls of society's networked culture. The installation is ultimately an exploration of the ephemeral nature of data and the virtual world we are creating for ourselves. As new technology comes into existence, how do we as a culture understand its implications? How do we form a dialogue and understanding around the networks we have built? How are we, as participants in the internet, commodified, marketed, and sold?
Upon entering the space, the viewer sees a single sculpture lit by a single bulb from above. The sculpture is a beige three-dimensional print of five pentagons warped, twisted, and folded on top of one another. The space is empty except for the sculpture and three security cameras mounted in the top corners of the room. Moving into the second room of the installation, the viewer is immersed in a five-screen projection. The screens are placed in a circular formation, loosely forming a pentagon. Each screen is semitransparent, so light catches the screen but also passes through it, projecting images onto the walls behind the screens. The screens are illuminated by three projectors showing a three-dimensional model built from a particle system that rapidly changes shape, moving and reacting to the environment. The third and final room of the installation holds five monitors: four display live video feeds from the front room, while the fifth presents realtime data extracted from the exhibition space via the security cameras.
All elements of the installation are networked. From the security cameras in the front room, live video feeds are sent by wire to surveillance monitors. The signal passes from the four monitors into an external video card that digitizes it, and that digital video feed is brought into Max/MSP. In Max/MSP I use the computer vision library to track thirty-two points divided between the four monitors, take an average of those thirty-two points, and send the result wirelessly via the udpsend object to a computer in the middle room. Those extracted numbers drive the video feed of the three-dimensional model in the second room. In this way each room is networked, passing information throughout the installation.

Feedback is an integral component of the installation. The central figure of the work, the particle system in the second room, is an active feedback system. Using a field of particles mapped from a matrix of noise with the jit.gen~ object in Max/MSP, the particles are manipulated with some simple math and then fed back recursively into the same matrix. The entire system is also set up to take data in, translate it into another sensorial medium, then feed that translated data into another system that translates it and feeds it back into the initial system.
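The averaging-and-broadcast step described above can be sketched outside of Max/MSP. The following Python fragment is a minimal stand-in, not the artist's patch: it averages a set of tracked (x, y) points into a single pair and sends that pair over UDP. The host, port, and binary wire format are assumptions for illustration; the actual installation uses Max's udpsend object, which transmits OSC-formatted messages.

```python
import socket
import struct

def average_points(points):
    """Reduce a list of (x, y) tracking points to their centroid,
    as the patch reduces thirty-two tracked points to one average."""
    n = len(points)
    x = sum(p[0] for p in points) / n
    y = sum(p[1] for p in points) / n
    return (x, y)

def send_average(points, host="127.0.0.1", port=7400):
    """Send the averaged point to the middle-room computer over UDP.
    The two-float payload is a hypothetical format, not Max's OSC encoding."""
    payload = struct.pack("!2f", *average_points(points))
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, (host, port))
    finally:
        sock.close()
```

A receiver on the second-room machine would unpack the same two floats and map them onto parameters of the particle system.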
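The recursive feedback at the heart of the particle system can also be sketched in miniature. This Python fragment is an assumption-laden approximation of the jit.gen~ patch logic, not a reproduction of it: a noise matrix is transformed with simple math and an external drive value (standing in for the tracking data), then written back into the same matrix, so each frame's output becomes the next frame's input.

```python
import math
import random

def noise_matrix(width, height, seed=0):
    """Initialize a 2D field of random values, standing in for a Jitter noise matrix."""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(width)] for _ in range(height)]

def feedback_step(field, drive, gain=0.98):
    """One recursive pass: nudge each cell with simple math and an external
    drive value, then store the result back into the same field in place."""
    for y, row in enumerate(field):
        for x, value in enumerate(row):
            nudge = math.sin(drive + x * 0.1 + y * 0.1) * 0.02
            field[y][x] = (value * gain + nudge) % 1.0  # keep values in [0, 1)
    return field
```

Calling `feedback_step` repeatedly, with `drive` updated from the incoming tracking data each frame, mimics the way the installation's particle field evolves from its own previous state while reacting to the environment.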
- Year Created: 2013
- Submitted to ArtBase: Tuesday Jul 30th, 2013
- Original Url: https://vimeo.com/64281972
- ablanton, primary creator