In what ways can a human body control continuously generated synthetic sound?

That's the question that drove Alex Hornbake and me to create Sound Chamber, an exploratory audio-visual installation presented at Thoughtworks New York in 2013.

This project was one of the outcomes of Hardware Hack Lab, a co-working session for technologists and artists held every Wednesday here in New York.

In Sound Chamber we took a close look at full-body interaction using commercial depth cameras such as the Kinect or Asus Xtion. We tied these devices into a custom-designed sound synthesis environment built with SuperCollider, and also routed the output to video screens placed around the space.
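To give a flavor of how body data can drive sound in this kind of setup, here is a minimal SuperCollider sketch. It assumes a separate skeleton-tracking process is sending OSC messages to the language; the `/hand` address, parameter names, and mappings are illustrative assumptions, not the actual Sound Chamber code.

```supercollider
// Minimal sketch: one synth whose frequency and amplitude
// follow hand position sent over OSC by a tracking process.
// The /hand address and 0..1 coordinate range are assumptions.
(
SynthDef(\chamber, { |freq = 220, amp = 0.1|
    var sig = SinOsc.ar(freq) * amp;
    Out.ar(0, sig ! 2); // duplicate to both output channels
}).add;
)

(
~synth = Synth(\chamber);

// Map normalized hand x/y to pitch and loudness.
OSCdef(\hand, { |msg|
    var x = msg[1], y = msg[2];
    ~synth.set(
        \freq, x.linexp(0, 1, 110, 880), // left-right sweeps two octaves
        \amp,  y.linlin(0, 1, 0, 0.3)    // raising the hand gets louder
    );
}, '/hand');
)
```

In practice the tracker (reading the Kinect or Xtion) would send `/hand x y` messages continuously, so the sound changes smoothly as the body moves.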

Alex and I considered this project a creative exploration, and the installation an unfinished demo, one outcome of that exploration. As such, all the code is up on GitHub along with instructions for replicating it. We feel pretty good about adding this mode of interaction to our toolbox and plan to experiment more in the new year.