<p>With the <a href="http://web.media.mit.edu/~dmerrill/mas963/">Facial Control for Electric Guitar</a> project, <a href="http://web.media.mit.edu/~dmerrill/">David Merrill</a> mapped the output of a real-time face-tracker onto the parameters of an audio effects processor. </p>
<p>A modified real-time head-tracker communicates over a TCP/IP socket connection with a custom server program. The server maps the sensed gestures onto control messages, which it sends to a guitar effects processor as MIDI.</p>
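<p>Merrill's server code isn't included in the post, but a minimal sketch of this pipeline might look like the following, assuming a hypothetical line-based wire format from the tracker ("gesture value" pairs) and the mido library for MIDI output. The port number, gesture names, and controller assignments below are illustrative, not taken from the project.</p>
<pre><code>import socket

import mido  # MIDI I/O; needs a backend such as python-rtmidi

# Hypothetical mapping of tracker gestures to MIDI control-change numbers.
GESTURE_TO_CC = {
    "nod": 11,    # e.g. wah/expression
    "shake": 1,   # e.g. modulation depth
    "blink": 64,  # e.g. effect on/off
}


def serve(host="127.0.0.1", port=9000):
    outport = mido.open_output()  # default MIDI output port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()  # accept the head-tracker's connection
        with conn, conn.makefile("r") as stream:
            for line in stream:  # one "gesture value" pair per line
                gesture, value = line.split()
                cc = GESTURE_TO_CC.get(gesture)
                if cc is None:
                    continue
                # Scale a normalized gesture value in [0, 1] to MIDI range [0, 127].
                outport.send(mido.Message("control_change", control=cc,
                                          value=int(float(value) * 127)))


if __name__ == "__main__":
    serve()
</code></pre>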
<p>Pupil positions and sizes are tracked using difference images, and eyes and eyebrows are tracked using template matching. Detection of head nods, head shakes, and eye blinks is also implemented.</p>
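<p>As a rough illustration of the two techniques named above (not the project's own code), the OpenCV sketch below finds moving blobs by frame differencing, a stand-in for difference-image pupil tracking with blob area as a proxy for pupil size, and locates an eye or eyebrow by template matching. The threshold and function names are assumptions; detecting a nod or shake would then amount to watching the tracked positions oscillate vertically or horizontally over time.</p>
<pre><code>import cv2


def find_pupils(prev_gray, curr_gray, threshold=25):
    """Find moving blobs by differencing two consecutive grayscale frames."""
    diff = cv2.absdiff(curr_gray, prev_gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate contours
            # Centroid (x, y) plus contour area as a crude size estimate.
            blobs.append((m["m10"] / m["m00"], m["m01"] / m["m00"],
                          cv2.contourArea(c)))
    return blobs


def find_feature(gray, template):
    """Locate an eye/eyebrow template by normalized cross-correlation."""
    scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, top_left = cv2.minMaxLoc(scores)
    return top_left, best_score
</code></pre>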
<p>Originally posted on Eyebeam reBlog by Rhizome</p>