Kinect RGB+Depth Filmmaking [openFrameworks]

Golan Levin was invited by the FITC conference to answer a series of “Ask Me Anything” questions posted by Reddit visitors. At the STUDIO for Creative Inquiry, Golan’s video was created by Fellows James George and Jonathan Minard, artists-in-residence researching new forms of experimental 3D cinema. Their work explores the notion of “re-photography”, in which otherwise frozen moments in time may be visualized from new points of view.

Despite the sometimes wildly moving camera, the video was in fact shot with a stationary Kinect-like depth sensor coupled to a digital SLR video camera. To compose their shots, the filmmakers developed custom openFrameworks software that aligns and combines color video and depth data into a dynamic sculptural relief. In a process of “virtual cinematography”, James and Jonathan rephotographed Golan’s 3D likeness — selecting new angles, dollying, and zooming — to compose new perspectives on the data as if playing a video game. Fixed camerawork is thus transformed into a malleable and negotiable post-process, in which shots can be carefully recomposed to highlight and inflect different latent meanings.
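For a concrete sense of how such a viewer works, here is a minimal, hypothetical openFrameworks sketch in the spirit of their tool (not the authors' actual code): it back-projects a saved depth frame into a 3D point cloud, textures it with the matching color frame, and lets an ofEasyCam rephotograph the result from any angle. The filenames and camera intrinsics below are placeholder assumptions, and the API shown is that of recent openFrameworks releases.

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofShortImage depth;   // 16-bit depth frame, in millimeters
    ofImage color;        // color frame registered to the depth image
    ofMesh mesh;
    ofEasyCam cam;        // the virtual camera used to recompose shots

    void setup() {
        depth.load("depth0001.png");   // hypothetical exported frames
        color.load("color0001.png");

        // assumed pinhole intrinsics for a Kinect-class sensor
        float fx = 570.0f, fy = 570.0f;
        float cx = depth.getWidth() / 2.0f, cy = depth.getHeight() / 2.0f;

        int w = depth.getWidth(), h = depth.getHeight();
        const unsigned short* pix = depth.getPixels().getData();
        mesh.setMode(OF_PRIMITIVE_POINTS);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                unsigned short z = pix[y * w + x];
                if (z == 0) continue;  // no depth reading at this pixel
                // back-project pixel + depth into a 3D point
                mesh.addVertex(ofVec3f((x - cx) * z / fx,
                                       (y - cy) * z / fy, -z));
                // naive texturing: assumes color is aligned to depth
                mesh.addTexCoord(ofVec2f(x * color.getWidth() / w,
                                         y * color.getHeight() / h));
            }
        }
    }

    void draw() {
        ofBackground(0);
        cam.begin();   // orbit, dolly and zoom as if playing a video game
        color.getTexture().bind();
        mesh.draw();
        color.getTexture().unbind();
        cam.end();
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```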

This experiment developed out of concepts and collaborations born at Art && Code, a conference on 3D sensing and visualization organized by Golan’s laboratory, the STUDIO for Creative Inquiry at Carnegie Mellon University. Artist-hackers assembled to explore the artistic, technical, tactical and cultural potentials of low-cost depth sensors, such as the Kinect. As an outcome of the conference, James George, a creative coder interested in cinema, and Jonathan Minard, a documentary filmmaker interested in new-media technology, are now collaborating on the development of open-source tools and techniques for augmenting high-resolution video with depth information.

Watch the interview below.

Next up for James George and Jonathan Minard, with support from the STUDIO for Creative Inquiry, is our own Resonate festival, taking place in Belgrade this March. The duo will record RGB+D interviews with event speakers including Nicholas Felton, Josh Nimoy, Jer Thorp, Greg J. Smith, Regine Debatty, Champagne Valentine, Niklas Roy, Benjamin Gaulon, Karsten Schmidt, Alva Noto & Blixa Bargeld and many more.

In addition, James George and Alexander Porter will host a workshop titled “RGB+Depth Filmmaking”. The team will demonstrate the entire filmmaking workflow: attaching the cameras together, capturing data, and visualizing and rendering sequences. Participants will get hands-on experience with these tools and leave the workshop with footage, software and the knowledge to use this exciting new cinematic format in their own films.
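As a rough illustration of the calibration step at the heart of that workflow (a sketch under assumptions, not the workshop's actual code): once the rigid transform between the depth sensor and the SLR has been solved offline, typically with a checkerboard and OpenCV-style stereo calibration, every depth-derived 3D point can be looked up in the high-resolution video frame. The matrix and intrinsics here are placeholders.

```cpp
#include <glm/glm.hpp>

// Project a point from depth-camera space into SLR pixel coordinates.
glm::vec2 projectToColor(const glm::vec3& pDepth,
                         const glm::mat4& depthToColor, // extrinsics (assumed solved offline)
                         float fx, float fy,            // SLR focal lengths
                         float cx, float cy) {          // SLR principal point
    // rotate and translate the point into the SLR's coordinate frame
    glm::vec3 p = glm::vec3(depthToColor * glm::vec4(pDepth, 1.0f));
    // pinhole projection onto the SLR image plane
    return glm::vec2(fx * p.x / p.z + cx,
                     fy * p.y / p.z + cy);
}
```

Sampling the SLR frame at the returned pixel gives each 3D point its color, which is what lets standard-resolution depth carry high-resolution texture.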

There is still time to register for Resonate, with only a few tickets remaining. More information here.

Other workshops include “Toxiclibs – a toolbox for computational design” with Karsten Schmidt, the “Electronic Instant Photo Safari” workshop with Niklas Roy, the “DMX/openFrameworks Lighting” workshop with Andreas Müller, the “Recycling Entertainment System” workshop with Benjamin Gaulon & Martial Geoffre-Rouland and more yet to be announced.

FITCdeptheditordebug | STUDIO for Creative Inquiry | jamesgeorge.org | RGB+Depth Workshop at Resonate

See also ofxTimeline, a way to visually interact with values over time.
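For the curious, a minimal sketch of what ofxTimeline looks like in use, based on its documented curve-track API (exact method names may vary between versions of the addon): a keyframed value, edited in the timeline GUI, drives the drawing each frame.

```cpp
#include "ofMain.h"
#include "ofxTimeline.h"

class ofApp : public ofBaseApp {
public:
    ofxTimeline timeline;

    void setup() {
        timeline.setup();
        timeline.setDurationInSeconds(30);
        // a curve track whose keyframes are edited interactively in the GUI
        timeline.addCurves("radius", ofRange(10, 200));
    }

    void draw() {
        ofBackground(0);
        // sample the keyframed value at the current playhead position
        float r = timeline.getValue("radius");
        ofDrawCircle(ofGetWidth() / 2, ofGetHeight() / 2, r);
        timeline.draw();  // draw the timeline GUI itself
    }
};
```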

  • Joel Pryde

    Nice interview.  Curious what sort of depth sensor was used.  Whatever it is, it seems to have much higher fidelity than the Kinect.

  • http://www.jamesgeorge.org/ James George

    The depth sensor was the ASUS Xtion Pro: http://us.estore.asus.com/index.php?l=product_detail&p=3397 which has exactly the same resolution as the Kinect. The nicer details come from texturing the depth data with a high-resolution video image.

  • Anonymous

    So, where will we be able to look at the code?

  • http://www.jamesgeorge.org/ James George

    The whole system is open source, available here: https://github.com/obviousjim/RGBDepthExperiments. At the moment we don’t have very good documentation, but for the Resonate workshop we’ll develop a clear explanation of how to use this technique in your own work!

  • Anonymous

    Holy macaroni! Even a very brief readme would go a long way. Like: run A to get a, feed a into B to get b, etc.

  • http://profile.yahoo.com/3KXZXEN46IMSLJCXX4RBZMA6DQ John

    I’ve tried the app, but I get a black screen. I think I may be missing something: ofx is not yet installed, or ofxGameCamera, or perhaps other dependencies. I’ve read and watched as much as I can, but some things are not obvious :) There also seems to be no mention of which lens is best suited to match the Kinect. Or does it not matter, since the calibration handles scaling and warping?

  • sha

    Hi,
    I run my own post-production house for filmmaking, and I’m very curious about and excited by this technology, but I fear that it’s still not ready for use. It’s true that you can switch angles after capturing, but unfortunately the Kinect displays only what it sees. You just see annoying flickering if you change the angle.

    OK, multiple cameras can help, but I don’t know whether RGBDToolkit supports multiple Kinects (please let me know). Even if it’s supported, I don’t know how easy it would be to map all the cameras together to get a perfect 3D reconstruction. If you can let me know, I can start my own film with this technology. Awaiting your response.

    Sha