Kimchi and Chips have just finished exhibiting their latest project, Lit Tree, at FutureEverything in Manchester. Using video projection, a tree is augmented to present volumetric light patterns, with its own leaves acting as voxels (3D pixels). Kimchi and Chips developed their own structured light system to scan the 3D location of every projected pixel, allowing a cloud of scattered projector pixels to be used as 3D voxels. The software was written in C++ with openFrameworks, using Xcode and Visual Studio. As a person places their hand above a plinth, the hand is scanned in 3D using a Kinect; its realtime 3D shape is mirrored inside the tree, letting the visitor select a volume of the tree to highlight.
The tree invites viewers with a choreographed cloud of light that responds to visitors' motion. As visitors approach, they can explore the immediate and cryptic nature of this reaction. The tree can form gestures in this way, and can in turn detect the gestures of its visitors. By applying a superficial layer of immediate interaction to the tree, can people better appreciate the long-term, invisible interaction that they share with it?
Material / Hardware (at FutureEverything):
Bamboo tree, 2 x high-resolution webcams (reading structured light patterns), 2 x video projectors, Microsoft Kinect, Par 16 'Birdie' light w/ black wrap, wooden plinth, Mac Mini 2010
FutureEverything 2011, Manchester, UK, May 11th-22nd
More about the process on their blog.