
Touch Vision Interface [openFrameworks, Arduino, Android]

Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows realtime manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it simultaneously, an approach that resembles, but is not strictly, a form of AR.

I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I’ve always felt that the technology was somewhat limited in its application. It is typically implemented as output in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience.

The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces: brands could crowdsource more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music creation experience where each screen becomes an instrument.
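
How might that coordinate translation work? A minimal sketch, assuming the realtime marker tracking yields the four corners of the remote screen within the phone's camera image; the function and parameter names below are illustrative, not taken from the project's source:

    // Hedged sketch (not the Teehan+Lax source): warp a touch point from the
    // phone's camera image into the remote screen's own pixel space using a
    // homography built from the tracked screen corners.
    #include <opencv2/core.hpp>
    #include <opencv2/imgproc.hpp>
    #include <vector>

    cv::Point2f touchToRemote(const std::vector<cv::Point2f>& screenCornersInCamera, // TL, TR, BR, BL
                              const cv::Size& remoteResolution,
                              const cv::Point2f& touchInCameraImage)
    {
        // Destination rectangle: the remote display's own pixel space.
        std::vector<cv::Point2f> remoteCorners = {
            {0.f, 0.f},
            {(float)remoteResolution.width, 0.f},
            {(float)remoteResolution.width, (float)remoteResolution.height},
            {0.f, (float)remoteResolution.height}
        };

        // Perspective transform from the camera's view of the screen to the screen itself.
        cv::Mat H = cv::getPerspectiveTransform(screenCornersInCamera, remoteCorners);

        // Warp the single touch point through the homography.
        std::vector<cv::Point2f> in{touchInCameraImage}, out;
        cv::perspectiveTransform(in, out, H);
        return out[0]; // coordinates to send to the remote device
    }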

The possibilities become even more exciting when considering the most compelling aspect of the tool – the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target. You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture.

Created using opencv-android, openFrameworks, and Python/Arduino for the LED matrix.
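
Purely as an illustration of how the Python-to-Arduino bridge could work (the serial protocol, baud rate, matrix size, and helper name below are assumptions, not details from the project), the Arduino side might look something like this:

    // Hedged sketch, not the project's actual firmware: the Arduino receives
    // "x,y\n" pixel coordinates over serial (e.g. from a Python script) and
    // lights the matching pixel on the LED matrix.
    const int MATRIX_WIDTH  = 8;   // assumed matrix dimensions
    const int MATRIX_HEIGHT = 8;

    void setMatrixPixel(int x, int y, bool on) {
      // Placeholder: echo the command so it can be verified from the host.
      // A real build would drive the matrix hardware here instead.
      Serial.print(on ? "ON " : "OFF ");
      Serial.print(x); Serial.print(','); Serial.println(y);
    }

    void setup() {
      Serial.begin(115200);   // baud rate is an assumption
    }

    void loop() {
      if (Serial.available()) {
        int x = Serial.parseInt();   // read x, stop at the comma
        int y = Serial.parseInt();   // skip the comma, read y
        if (x >= 0 && x < MATRIX_WIDTH && y >= 0 && y < MATRIX_HEIGHT) {
          setMatrixPixel(x, y, true);
        }
        while (Serial.available() && Serial.read() != '\n') {}  // discard to end of line
      }
    }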

Touch Vision Interface

(Thanks Peter)

Posted on: 15/09/2011

Posted in: Android, Arduino, openFrameworks


    • Lars Herbst

      hmm.. seems ok but.. a little bit trivial.. now make the screens look like a window to a 3D world please,
      and take the phone as the viewpoint.. that should not be so difficult..

    • Jonathan A.

      Seriously, everyone who says this kind of stuff is trivial doesn’t realize that it’s just a step in an awesome direction. This plus hundreds of other “trivial” technologies will shape tomorrow.

    • Chris L

      You wanna look at the real world through a camera? Does this need a rotating camera?