Some time ago Zach Lieberman posted code on the oF forums that allows you to show the pixels directly below the openFrameworks app window in a texture. In this little app, Satoru Higa (4nchor5 la6/Rhizomatiks) uses the same method to create a music sequencer. Each time the app is run, the transparent window uses the desktop beneath it as the source for sound. The colour and position of the pixels below determine the pitch, tone and tempo. As you drag the window around the desktop, or add additional windows, you can build up compositions.
In the video below, Satoru also shows what happens when you load a webpage underneath the window and use that as a source.
To see more of Satoru's experiments, visit his GitHub.
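The exact colour-to-sound mapping in Satoru's app isn't documented; as a rough illustration of how desktop pixels could drive a sequencer, here is a minimal Python sketch. The app itself is written in openFrameworks/C++, and the scale, note and velocity choices below are invented for the example:

```python
# Hypothetical sketch of a colour-to-sound mapping like the one described
# above. Hue picks a pitch from a pentatonic scale, saturation picks the
# octave, lightness the velocity, and a cell's x-position its step in the
# sequence. None of this is Satoru's actual code.
import colorsys

PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees, in semitones

def cell_to_note(r, g, b, col, base_note=60):
    """Map one grid cell's average colour to a (step, note, velocity) event."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    degree = PENTATONIC[int(h * len(PENTATONIC)) % len(PENTATONIC)]
    octave = int(s * 3)          # more saturated -> higher octave
    note = base_note + degree + 12 * octave
    velocity = int(l * 127)      # brighter pixels play louder
    step = col                   # left-to-right position = time
    return step, note, velocity
```

Sampling one average colour per grid cell each frame and feeding events like these to a synth would give a sequencer whose tune changes as the window is dragged around the desktop.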
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows realtime manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it simultaneously, loosely a form of AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment at the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like realtime marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation on large installations could feel more natural. Other applications could include a music creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI; simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
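The "translation of touch input coordinates to the captured video feed" comes down to a perspective mapping between screen coordinates and the four tracked corners of the remote surface. One standard way to build such a mapping is a 3×3 homography; the sketch below uses Heckbert's closed-form unit-square-to-quadrilateral construction (this is generic textbook geometry, not code from the project, which would in practice invert the mapping, or use something like OpenCV's getPerspectiveTransform, to carry a touch inside the detected quad back to remote-screen coordinates):

```python
def square_to_quad(corners):
    """Homography (3x3 nested list) mapping the unit square to the
    quadrilateral `corners` = [(x0,y0), (x1,y1), (x2,y2), (x3,y3)],
    given as top-left, top-right, bottom-right, bottom-left."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    sx = x0 - x1 + x2 - x3
    sy = y0 - y1 + y2 - y3
    if sx == 0 and sy == 0:  # quad is a parallelogram: affine case
        return [[x1 - x0, x3 - x0, x0],
                [y1 - y0, y3 - y0, y0],
                [0.0, 0.0, 1.0]]
    dx1, dy1 = x1 - x2, y1 - y2
    dx2, dy2 = x3 - x2, y3 - y2
    det = dx1 * dy2 - dx2 * dy1
    g = (sx * dy2 - dx2 * sy) / det
    h = (dx1 * sy - sx * dy1) / det
    return [[x1 - x0 + g * x1, x3 - x0 + h * x3, x0],
            [y1 - y0 + g * y1, y3 - y0 + h * y3, y0],
            [g, h, 1.0]]

def project(H, x, y):
    """Apply homography H to the point (x, y)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

With the corners supplied by marker tracking each frame, a normalised point on one surface projects to the matching point on the other, which is what makes "simply touch your target" work across moving surfaces.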
- Sound Yeah [iPad, Sound, openFrameworks] Sound Yeah is the latest application currently in development by Henry Chu, the same name behind Squiggle, which we mentioned a few weeks back. I asked Henry about the new app and this is what he said: In the beginning I wanted to create a digital toy to record and play sound, with some creative controls like scratching. After playing with the prototype for a while, I added more functions. In edit mode I can adjust the volume, attack, release and playing speed. There are also some utility tools like copy, paste and trim to help you manage the clips. In play mode, I can tap to play a clip, or I can drag in circles to scratch the clip. If I wind the clip and drag away, it will loop the clip at the speed it was dragged. Henry used openFrameworks to build Sound Yeah. He started off from the default audio-in and audio-out examples available in the examples folder of oF. The sound is generated from the sound buffer, so it's a time-based approach, he writes. In addition he also wrote some classes for the UI (which we absolutely love), as well as a DIY cable that splits the microphone and headphone connections so he can connect to other sound sources using line-in, rather than the built-in microphone. See the video below of him performing with the app. Henry hasn't submitted Sound Yeah to Apple just yet but expects this to happen very soon. In the meantime, you can follow the progress by following Henry on Vimeo. We will, of course, post as soon as the app is available. UPDATE 05.08.2010 // Added 2 new videos. See below. Platform: iPad Version: 1.0 Cost: $4.99 Developer: […]
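The scratching behaviour Henry describes — reading the recorded buffer at a rate driven by the drag gesture — can be sketched as variable-speed playback with linear interpolation. This is a toy Python illustration of the general technique, not Sound Yeah's actual oF code; in a real app this logic would run per-sample inside the audio-out callback:

```python
def scratch(buffer, speeds):
    """Read samples from `buffer` at a varying rate. Each entry in
    `speeds` advances the playhead by that many source samples
    (1.0 = normal speed, negative = backwards), linearly
    interpolating between neighbouring source samples."""
    out = []
    pos = 0.0
    for speed in speeds:
        i = int(pos)
        frac = pos - i
        if 0 <= i < len(buffer) - 1:
            out.append(buffer[i] * (1 - frac) + buffer[i + 1] * frac)
        elif 0 <= i < len(buffer):
            out.append(buffer[i])
        else:
            out.append(0.0)  # playhead outside the clip -> silence
        pos += speed
    return out
```

Feeding the playhead speed from the circular drag gesture gives the scratch effect; holding the speed at the release velocity gives the looping-at-drag-speed behaviour he mentions.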
- Shadow, Glare [Mac] Devised by Erin Shirreff and programmed by Seth Erickson, Shadow, Glare is an exploration into how light might begin to change the nature of your "immaterial" computer desktop, creating a subtle but effective illusion of passing light as you go about your daily work. Shadow, Glare explores such experiential disruptions through a subtle visual intervention: Without altering the computer’s normal operations, the program renders a morphing series of translucent forms that seem to float between the screen’s real surface and the immaterial desktop. This simulation can blend unobtrusively with any actual shadows that happen to be cast on the screen; users may continue to work or browse while only peripherally aware that the program is running. But the slowly evolving forms can also occlude the desktop and interrupt the user’s focus. (canopycanopycanopy.com ISSUE 9) "Time evaporates, and while at points I’m engaged, for the most part I’m folded into the experience, while somehow still scanning its surface." Erin Shirreff Download (via […]
- Sound of Honda – Ayrton Senna’s Fastest F1 Lap (1989) in Light and Sound This project, a collaboration between Dentsu, Honda Motor and Rhizomatiks, brings back Senna’s engine sound from that lap 24 years ago in the form of an installation set on the original Suzuka circuit that uses light and […]
- Circadia by Kurt Bieg – A constellation of colours seeking a rhythm – iOS Created by Kurt Bieg, Circadia is a very simple, very tricky and exceptionally engaging little game created using openFrameworks. Each level in Circadia is a constellation of musical colors seeking a rhythm. To solve the puzzle you must figure out the sequence: just get the color bursts to converge on the white dot at the same time. It starts off very simple, but as you progress it becomes increasingly complex. Trying to sync both the sound and visual compositions is much trickier than you might imagine. The game features "100 unique ear bending puzzles meticulously designed to challenge your senses". Warning from the creator: "Enjoy with caution" Platform: iPad/iPhone Version: 1.0 Cost: $0.99 Developer: Kurt […]
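The core of each puzzle — getting every burst to reach the white dot at the same instant — is at heart a small timing problem: a burst's travel time is its distance divided by its speed, so slower or farther bursts must be triggered earlier. A tiny invented sketch of that idea (not Circadia's code):

```python
def launch_delays(distances, speeds):
    """For bursts that must all converge on the centre dot at the same
    moment, return how long to wait before triggering each one. The
    slowest/farthest burst launches immediately (delay 0)."""
    travel = [d / s for d, s in zip(distances, speeds)]
    longest = max(travel)
    return [longest - t for t in travel]
```

In the game the player works this schedule out by ear and eye rather than by arithmetic, which is what makes the later levels so tricky.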
- ofxTimeline [openFrameworks] ofxTimeline is a new addon for openFrameworks by James George aka obviousjim. The addon allows you to treat content in your oF apps as you would in After Effects or other similar applications driven by keyframe animation and linear timelines. You are able to manipulate object values, or any variable, using an animation timeline, inserting keyframes, all in realtime. See the video below. The code is available here: https://github.com/Flightphase/ofxTimeline. Standalone app (MacOSX): github.com/downloads/Flightphase/ofxTimeline/timelineSimple.zip. You'll need the following in your addons/ folder alongside ofxTimeline: ofxTween (https://github.com/arturoc/ofxTween), ofxRange (https://github.com/Flightphase/ofxRange) and ofxTextInputField (https://github.com/Flightphase/ofxTextInputField). jamesgeorge.org Previously: Fragments of time and space recorded with Kinect+SLR on NYC... Voyagers - permanent installation for @NMMGreenwhich […]
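Driving a variable from a timeline reduces to sampling a set of keyframes at the current playback time. A minimal linear-interpolation sketch of that idea in Python (ofxTimeline itself is C++ and also offers eased tweens via ofxTween; only the basic linear case is shown, and the function name is invented):

```python
def sample(keyframes, t):
    """Evaluate a timeline of (time, value) keyframes at time t,
    interpolating linearly between neighbouring keyframes and
    clamping to the first/last value outside the keyframed range."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)  # 0..1 within this segment
            return v0 + (v1 - v0) * u
```

Calling this every frame with the current playhead time, and assigning the result to (say) an object's position or a shader parameter, is essentially what "driving any variable from a timeline" means.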
- Reactor for Awareness in Motion (RAM) by YCAM – Download A C++ creative coding toolkit for creating realtime feedback environments for dancers is now available for download, both as open source and as applications for Mac and Windows, to choreograph or rehearse previously programmed […]
- Scramble Suit – Face Tracking [openFrameworks] It all kicked off about 2 months ago when Kyle McDonald posted 'FaceOSC', a tool for prototyping face-based interaction. The oF wrapper Kyle created is built on the non-commercial open source FaceTracker code by Jason Saragih. Kyle included an example oF app with the ofxFaceTracker addon for openFrameworks, which can be downloaded here. Then, a few days ago, Arturo Castro carried on the work by exploring different face substitution techniques. Using the same library, a mesh obtained from a photo is matched to his own face in the video. Applying some colour interpolation algorithms from Kevin Atkinson's image clone code (methodart.blogspot.com/) gives it the blending effect that can be seen in the final footage. Now, only a few minutes ago, Kyle posted this video, named "Scramble Suit", inspired by the fictional technology from Philip K. Dick's 1977 novel "A Scanner Darkly": effectively a cloak that hides the identity of the wearer by making it impossible to describe or remember them. Kyle points to an excerpt here. A very exciting bunch of face tracking experiments; looking forward to seeing what comes next... Videos below. Scramble Suit from Kyle McDonald on Vimeo. Faces from arturo castro on Vimeo. FaceOSC from Kyle McDonald on […]
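To give a rough idea of what a colour interpolation step does when blending one face onto another: the substituted patch's colours are adjusted so they sit in the same range as the underlying skin. The sketch below matches the mean and spread of each channel, a deliberately simplified stand-in for illustration only (Atkinson's image clone code uses a more sophisticated seamless-cloning approach, and the helper name here is invented):

```python
def match_colour(source, target):
    """Shift and scale the `source` channel values so their mean and
    standard deviation match `target`'s. Run once per colour channel
    over the pixels in the face region."""
    def stats(xs):
        m = sum(xs) / len(xs)
        var = sum((x - m) ** 2 for x in xs) / len(xs)
        return m, var ** 0.5
    ms, ss = stats(source)
    mt, st = stats(target)
    scale = st / ss if ss else 1.0
    return [mt + (x - ms) * scale for x in source]
```

After this correction the pasted face no longer looks like a flat cut-out, because its lighting and tone roughly agree with the video frame it sits on.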
Posted on: 01/11/2012