Leap Motion has arrived, and even though we have all heard about the difficulties and challenges of using Leap for precise control, the device has nonetheless created a domain for interacting with the digital in a refreshing new way. If one tries to apply the same principles we use for mouse input, it will most definitely fail. Experiments like controlling the desktop or the browser have clear shortcomings compared to traditional mouse and keyboard input; don't forget that the content we use daily on the screen has been designed for mouse and keyboard. Once we take a step back and look at the new opportunities Leap Motion provides, we can seek out new ways to control, and design new kinds of content that may not be utilitarian at all; instead it may offer a brief glimpse into where human-computer interaction may be heading.
Here are a few ‘experiments’ with Leap Motion we have collected over the last few weeks, kicking off with those you can find on the Airspace store. For these new contexts, you be the judge of whether Leap’s accuracy matters at all. If you are working on something cool that you think belongs on this list, please drop us an email and we’ll add it. Enjoy.
Gravilux for Leap Motion adapts the best-selling iOS musical star field app to a “purely natural movement experience”: when you point fingers at the screen they all pull on the stars, when you punch with your fist, antigravity repels. Scott Snibbe, the company’s founder, says, “With Gravilux for Leap Motion, you literally have the universe at your fingertips.”
OscilloScoop for Leap Motion allows you to make musical grooves by sculpting spinning crowns with your hands. Created by DJ and designer Lukas Girling, the app mashes up turntables, video games, and electronic music into a real-time audiovisual performance. With fingers pointed directly at the screen, people sculpt spinning ‘crowns’: one for tone and one for a digital audio filter. People can also open up their hands and let their fingertips directly draw waveforms for the sounds to follow.
Created by Robert Hodgin, Flocking is a glimmering school of fish that flocks to your hand and follows your every movement. Move your hand slowly to the right, and they’ll swim to the right. Move left, and they’ll flutter to the left. Move forward and back, and they’ll follow. Wave your hand violently and they’ll scatter. The underwater world moves smoothly and magically along with your every motion.
Created by Eddie Lee, Lotus is a collection of interactive musical toys that harness the power of the Leap Motion. Each musical toy offers a fresh, unique way for the user to dynamically create music using gesture controls. Coded in C/C++, with OpenGL for graphics, FMOD Studio for sound and GameMonkey for scripting.
Ryo Fujimoto is a beatboxer and electronic musician. In his performance “Humanelectro”, he controls effects with his right hand and triggers synth sounds with his left; the synth sounds are generated on the spot from his voice and beatboxing.
Ryo Fujimoto | Download Not Available
Created by Je Seok Koo, Strummer Player for Leap Motion is a digital strumming application built with Processing. You can specify the number of strings, play four instruments, and adjust attack, decay, sustain and release; the app also includes a chord arranger. You can control a MIDI device as well, and if all of this gets too much you can easily switch back to mouse control.
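The attack/decay/sustain/release controls describe a classic ADSR amplitude envelope, the standard way software instruments shape each note's loudness over time. A minimal sketch of how such an envelope might be evaluated, in Java since Processing is Java-based; all names and parameters here are illustrative, not taken from Koo's code:

```java
// Minimal ADSR amplitude envelope, as used by many software instruments.
// Parameter names and the linear segment shapes are illustrative assumptions.
public class Adsr {
    final double attack, decay, sustain, release; // secs, secs, level 0..1, secs

    Adsr(double attack, double decay, double sustain, double release) {
        this.attack = attack; this.decay = decay;
        this.sustain = sustain; this.release = release;
    }

    // Envelope level while the note is held, t seconds after note-on.
    double gateLevel(double t) {
        if (t < attack) return t / attack;                      // rise to 1
        if (t < attack + decay)                                 // fall to sustain
            return 1 - (1 - sustain) * (t - attack) / decay;
        return sustain;                                         // hold
    }

    // Amplitude (0..1) at time t for a note held for `held` seconds.
    double level(double t, double held) {
        if (t < 0) return 0;
        if (t < held) return gateLevel(t);
        double tr = t - held;                                   // time into release
        if (tr >= release) return 0;
        return gateLevel(held) * (1 - tr / release);            // fade out
    }
}
```

Multiplying each string's waveform by this level at every sample is what makes a "pluck" sound plucked rather than like an organ tone.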
Pepper’s Ghost and Leap Motion
Created by Adrien Mondot (am-cb.net), this is a first attempt to create a small Pepper’s ghost effect with the Leap Motion and eMotion, a tool for creating interactive motions of objects for live visual performances that we wrote about a few days ago. The method uses a familiar stage technique to create the illusion of manipulating digital surfaces and particles in mid-air.
Leap Motion Ball Maze
Here’s the Ball Maze being controlled by the Leap Motion controller. David Thomasson is using the ‘Leap Motion for Processing’ library by voidplus in Processing on the laptop, which talks to an Arduino in the maze.
YouTube | Download Not Available
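Thomasson hasn’t published his sketch, but the pipeline is a typical one: Processing reads the palm position from the Leap, maps it to two tilt angles, and streams those to the Arduino over serial, where servos tilt the maze. A rough Java sketch of the mapping step; the ranges, names and two-byte protocol are assumptions for illustration, not values from the actual project:

```java
// Map a palm position from the Leap's tracking volume (roughly millimetres,
// x and z within about -120..120 of centre) to hobby-servo angles 0..180
// for the two maze axes. All ranges here are illustrative assumptions.
public class MazeTilt {
    static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    // Linearly map a palm offset (mm) to a servo angle, centred at 90 degrees
    // so a hand at rest over the controller leaves the maze level.
    static int toServoAngle(double palmOffsetMm) {
        double angle = 90 + palmOffsetMm * (90.0 / 120.0);
        return clamp((int) Math.round(angle), 0, 180);
    }

    // Encode both axes as two bytes for a simple serial protocol that an
    // Arduino sketch could decode with Serial.read() and feed to Servo.write().
    static byte[] frame(double palmX, double palmZ) {
        return new byte[] { (byte) toServoAngle(palmX), (byte) toServoAngle(palmZ) };
    }
}
```

Clamping matters here: the Leap happily reports positions well outside the comfortable range, and without it the servos would slam against their end stops.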
Leap Motion Robot Control
Precision is always going to be an issue with Leap, but Brian Harms is experimenting with real-time control of a Stäubli TX40 robot arm using the Leap Motion controller. There is some delay between the hand and the arm, but it is nonetheless wonderful to watch the human and the machine mimicking each other; at some points in the video you wonder who is actually following whom.
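Part of that visible delay is the price of taming the Leap's jitter: raw palm positions are too noisy to feed straight to a robot, so controllers typically smooth them first, and smoothing trades jitter for lag. A common minimal approach is an exponential moving average; this sketch is purely illustrative, as Harms has not described what filtering, if any, he uses:

```java
// Exponential smoothing of a tracked coordinate before it drives hardware.
// Lower alpha = smoother motion but more lag behind the hand; this trade-off
// is one reason tracked-to-robot control visibly trails the performer.
public class Smoother {
    private double value;
    private boolean primed = false;
    private final double alpha; // 0..1 blend factor

    Smoother(double alpha) { this.alpha = alpha; }

    // Feed one raw sample, get back the smoothed estimate.
    double update(double sample) {
        if (!primed) { value = sample; primed = true; } // start at first sample
        else value += alpha * (sample - value);          // move a fraction toward it
        return value;
    }
}
```

One such filter per axis (x, y, z) is enough to turn a trembling point cloud into motion a robot arm can plausibly follow.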
Realtime Interactive Laser ILDA Tests
Finally, how could we forget Memo’s recent experiments with an ILDA-controllable laser projector and openFrameworks. This time the experiment includes a 4W RGB laser at 90Kpps, finger tracking with Leap Motion, an Etherdream DAC, openFrameworks, ofxEtherdream, ofxIlda, ofxLeapMotion and a DJI Phantom thrown in for good measure.