In collaboration with the English indie band The xx, Matt Mets, Aramique Krauthamer and Kyle McDonald created an exhibit that incorporates the band’s music into a room full of stepper-motor-controlled speakers that pivot to follow listeners as they move through the space.
Missing is part of Coexist, an exhibition cycle at the Sonos Studio in Los Angeles that explores the “relationship between man and machine.” The Missing portion of that exhibit, on display through Dec. 23, uses 50 speakers, two hacked Kinect 3-D cameras, and a whole lot of code and robotics to create an environment where people move through the music and interact with the speakers without even trying, writes Wired.
Development took about six weeks from accepted pitch to opening night. Matt Mets developed a system that uses stepper motors with potentiometers (in place of servos), driven by a custom PCB that is a modified version of an old open-source MakerBot stepper driver (Matt used to work for MakerBot). After plenty of laser cutting and machining to fabricate the mounts, the PCB on each mount is powered and controlled over a single Ethernet cable daisy-chained along the ceiling of the installation. The chain terminates in a Teensy plugged into a Mac Mini, which is also connected to the two Kinects that follow visitors through the space.
The system is driven by a single openFrameworks app. All the audio is sent from Ableton Live and controlled from OF over local MIDI (e.g., restarting the piece when people walk in and turning it down when no one is around). The GitHub repo the team has made available contains everything from PCB schematics to Arduino code to the OF code, and can be downloaded here.
Kyle also describes some clever tricks: orthographically reprojecting the point cloud, sampling the accelerometer to help align the data from the two Kinects, generating binned “heatmap”-style images from the reprojected points, and running blob detection on that image. The system also renders the space dynamically, using simple speaker models loaded from Rhino. The original layout for all the speakers was done with Rhino/Python scripting, but the final heights were actually based on a call to ofRandom() with a seed, which, Kyle tells us, made him especially happy.
Posted on: 29/11/2012