Firewall is an interactive installation created by Aaron Sherwood and Mike Allison. A stretched sheet of spandex acts as a depth-sensitive membrane interface: people push into it to manipulate expressive visuals and sound.
The piece was made using Processing, Max/MSP, Arduino and a Kinect. As in Cloud Pink by Everyware, the Kinect measures the average depth of the spandex relative to the frame it is mounted on. If the spandex is not being pressed, nothing happens; when someone presses into it, the visuals react around the point of contact and the music is triggered.
An algorithm built in Max speeds the music up or slows it down, and makes it louder or softer, based on the depth of the press. This makes for a very expressive musical experience, even for people who have never played music before. A switch built into the frame toggles between two modes; the second is a little more aggressive than the first.
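As a rough illustration of the depth-to-music mapping described above, here is a minimal Python sketch. The actual piece implements this in Max/MSP, and every threshold and constant below is invented for the example.

```python
# Hypothetical sketch of Firewall's depth-to-music mapping. The real
# logic lives in a Max/MSP patch; all numbers here are assumptions.

REST_DEPTH_MM = 1200   # assumed distance from the Kinect to the spandex at rest
TRIGGER_MM = 30        # assumed minimum press before anything happens
MAX_PRESS_MM = 300     # assumed deepest press we bother to map

def average_press_depth(depth_frame, rest=REST_DEPTH_MM):
    """Average displacement of the membrane toward the sensor, in mm."""
    presses = [rest - d for d in depth_frame if d < rest]
    return sum(presses) / len(presses) if presses else 0.0

def depth_to_music(press_mm):
    """Map press depth to (tempo_bpm, volume 0..1), or None when idle."""
    if press_mm < TRIGGER_MM:
        return None                        # membrane at rest: nothing happens
    t = min(press_mm, MAX_PRESS_MM) / MAX_PRESS_MM
    return 60 + t * 120, t                 # deeper press -> faster and louder
```

A deeper press drives both parameters at once, which is roughly how the piece lets depth shape both tempo and dynamics.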
See also Thicket.
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows real-time manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile screen purely as an input, the user views the remote content and touches it directly, something related to, though not strictly a form of, AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music-creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI; simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
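The "translation of touch input coordinates to the captured video feed" is the core trick. The real project solves a perspective homography from the tracked markers (e.g. via OpenCV); the sketch below, with invented coordinates, simplifies that to the case where the tracked surface appears as an axis-aligned rectangle in the camera preview.

```python
# Simplified illustration of mapping a touch on the phone screen to a
# point on a tracked remote surface. The actual system uses marker
# tracking and a full homography; this assumes the surface shows up as
# an axis-aligned rectangle in the preview (a hypothetical case).

def touch_to_surface(touch, rect):
    """touch: (x, y) in preview pixels. rect: (left, top, width, height)
    of the tracked surface in the preview. Returns normalized (u, v) in
    [0, 1]^2 on the surface, or None if the touch missed it."""
    x, y = touch
    left, top, w, h = rect
    u, v = (x - left) / w, (y - top) / h
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return u, v
    return None

def surface_to_pixels(uv, display_size):
    """Convert normalized surface coords to pixels on the remote display."""
    u, v = uv
    dw, dh = display_size
    return round(u * dw), round(v * dh)
```

The normalized (u, v) pair is what would be sent to the remote device, which scales it back up to its own resolution, so the phone never needs to know the remote display's pixel dimensions.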
- Kinect Cloth Simulations [Processing] For the last month Victor Martins has been exploring what is possible with the Microsoft Kinect by simulating cloth behaviour using Processing. It all started with a simple surface with a plastic-type shader applied to it. These experiments quickly progressed to trying shadows and gamma correction control, tweaking parameters and finally rendering fur on top of the cloth skin. The results are very impressive videos of Victor interacting in real time with quite realistic cloth-like surfaces. Whilst this is still work in progress and we expect to see much more from Victor, please enjoy the collection of clips, ordered from oldest to newest. Previously: Graffiti Viscosity [Processing] #GML by @victamin ... Kinect - OpenSource [News] - amazing work created within a few ... Kinect - One Week Later [Processing, oF, Cinder, MaxMSP] - Now ... ...and many more Kinect examples collected by us here. Videos: Kinect Cloth Shaper from Victor Martins on Vimeo (cloth simulation with Kinect depth map); Kinect Cloth Skin (cloth simulation and Kinect #2); Kinect Cloth Skin #2 (now with shadows and gamma correction control); Kinect Cloth Skin #3 (changing parameters, and another try for better encoding quality); Furry furry (rendering fur on top of cloth) […]
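Real-time cloth of this kind is commonly built from a Verlet-integrated mass-spring system. Victor's sketches run in Processing and are driven by the Kinect depth map; the Python sketch below shows only the core idea on a 1-D strip of particles, with all constants invented.

```python
import math

# Minimal Verlet mass-spring step of the kind underlying real-time cloth
# simulation. This is an illustrative sketch, not Victor Martins' code:
# gravity, damping and rest length are all made-up values.

def verlet_step(pos, prev, gravity=(0.0, -0.05), damping=0.98):
    """Advance particles by inertia plus gravity. Returns (new, old) so
    the caller keeps the previous positions for the next step."""
    new = [(x + (x - px) * damping + gravity[0],
            y + (y - py) * damping + gravity[1])
           for (x, y), (px, py) in zip(pos, prev)]
    return new, pos

def satisfy(pos, rest=1.0, pinned=(0,)):
    """One relaxation pass pushing neighbouring particles toward the
    rest distance; particles whose index is in `pinned` do not move."""
    pos = list(pos)
    for i in range(len(pos) - 1):
        (x1, y1), (x2, y2) = pos[i], pos[i + 1]
        dx, dy = x2 - x1, y2 - y1
        dist = math.hypot(dx, dy) or 1e-9
        corr = (dist - rest) / dist
        if i in pinned:
            pos[i + 1] = (x2 - dx * corr, y2 - dy * corr)
        else:
            pos[i] = (x1 + dx * corr / 2, y1 + dy * corr / 2)
            pos[i + 1] = (x2 - dx * corr / 2, y2 - dy * corr / 2)
    return pos
```

A full cloth extends the same two steps to a 2-D grid of particles, and the Kinect variant simply pushes particles along wherever the depth map reports a hand.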
- Cloud Pink by Everyware – Another world above… If you recall the "Soak, Dye in Light" project by everyware, Cloud Pink is the latest iteration, taking on the analogy of a cloud rather than paint-soaked fabric. The installation invites participants to "touch the pink clouds" drifting on a giant fabric screen suspended in the air. Lying on a hill with your pupils filled with the endless blue sky, the perspective of your eyesight suddenly gets distorted and clouds drift at the tip of your nose. You stretch your arms up to the sky to touch the clouds but can't reach. Another world right above your head: clouds. Created using Processing, GLSL, and 2 x Kinects and projectors. The images below are from the installation at the Savina Gallery, Seoul, Korea. Also included at the bottom of the post is the video recording from Gallery Seoul xii, where the installation takes a somewhat different shape. Cloud Pink | Previously Soak, Dye in Light Everyware is a creative computing group formed in 2007 by Hyunwoo Bang and Yunsil Heo. Hyunwoo Bang is an assistant professor of the New Media Lab in the School of Mechanical and Aerospace Engineering at Seoul National University. Yunsil Heo has an MFA from the Department of Design/Media Arts at the University of California, Los Angeles, and BA degrees from the College of Fine Arts and College of Humanities at Seoul National […]
- Soak, Dye in Light [Processing, Kinect] "Soak, Dye in Light" by everyware (2011) is an empty canvas, but when you touch it, its elastic surface stretches and gets suffused with projected vivid colors, mimicking fabric absorbing dye. Poking and rubbing with their hands, or resting their body on this spandex canvas, allows visitors to soak the canvas in virtual dye and create their own patterns. Dyeing fabric is a time-honored tradition of humankind. Local materials such as herbs, flowers, rocks, and the juices of animals or shells have been used in the dyeing process. In Korea especially, people have deep affection for the unique colors and textures of fabric dyed with traditional materials. Now, in the age of new media, we tried a whole new way of coloring fabrics with the essential materials of new media: 'light' and 'interactivity'. Also, as a meta-creative interactive installation, 'Soak' can be expanded for creating garments with personalized patterns, or textile production using today's digital fabric printing technologies. Created using Processing and Kinect. Simulated watercolor by GPU accelerated cellular […]
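The "simulated watercolor by GPU accelerated cellular" automaton mentioned above boils down to a diffusion rule over a grid of dye concentrations. Soak runs this on the GPU in GLSL; the Python sketch below shows one such diffusion pass on a tiny grid, with an invented diffusion rate.

```python
# Toy cellular-automaton diffusion pass in the spirit of a GPU
# watercolor simulation. Illustrative only: cell values are dye
# concentrations in 0..1 and the rate is an arbitrary assumption.

def diffuse(grid, rate=0.25):
    """One diffusion step on a 2-D grid: each cell moves toward the
    average of its 4-neighbours at the given rate (edges clamped)."""
    h, w = len(grid), len(grid[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            c = grid[y][x]
            nb = (grid[max(y - 1, 0)][x] + grid[min(y + 1, h - 1)][x] +
                  grid[y][max(x - 1, 0)] + grid[y][min(x + 1, w - 1)])
            out[y][x] = c + rate * (nb / 4 - c)
    return out
```

Run repeatedly, a single drop of "dye" bleeds outward the way projected color spreads from a touch point; on the GPU the same per-cell rule becomes a fragment shader.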
- IRIS by HYBE – New kind of monochrome LCD display Created by Korean collective HYBE, IRIS is a media canvas built from a matrix of conventional information display technology: the monochrome LCD. Through the phased opening and closing of circular black liquid crystal, IRIS can create various patterns and control the amount (size) of passing light. IRIS is an interactive medium for visual simplicity which uses the passage of ambient light, not the emission of light itself. The installation below consists of 400 LCDs (20x20), 20 custom-designed Arduino-compatible controllers, and Processing and Kinect used for both auto-active & interactive content play. HYBE IRIS was selected and supported by the Da Vinci Idea Program (2012) by Seoul Art Space_Geumcheon, […]
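Driving a 20x20 matrix of circular LCD apertures from an image amounts to downsampling a frame to the grid and mapping brightness to an opening fraction. The project's own pipeline (Processing plus Arduino controllers) is not published here, so the sketch below is a generic, assumed approach.

```python
# Hypothetical sketch: downsample a grayscale frame (rows of 0..255
# values) to a 20x20 LCD grid and map brightness to an aperture
# fraction, 0 = fully closed, 1 = fully open. Not HYBE's actual code.

def frame_to_apertures(frame, grid=(20, 20)):
    gh, gw = grid
    h, w = len(frame), len(frame[0])
    bh, bw = h // gh, w // gw         # assumes frame divides evenly
    out = []
    for gy in range(gh):
        row = []
        for gx in range(gw):
            block = [frame[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            row.append(sum(block) / len(block) / 255.0)
        out.append(row)
    return out
```

Each row of aperture fractions would then be sent to one of the 20 controllers, which translates the fraction into a drive level for its column of liquid-crystal discs.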
- Muze [Arduino, Sound] Muze is the latest creation of Teague Labs, a collective operating alongside Teague, a long-established industrial design agency and a name behind products for companies such as Samsung and Microsoft. The device itself is a musical instrument that 'plays with you' and aims to provoke a two-way dialog between musicians. The device reads a palette of notes that it can in turn interpret and compose into various rhythms and phrases that are strung together to form something musical. The user can then influence these strings of notes and rhythms to create entirely new compositions using a simple collection of 8 triggers/knobs which are manually inserted. No single knob controls a single function, but rather a blend of functions derived from its rotation. The team explains the thinking: A couple of us started talking about the state of musical instruments, digital music creation, and how so much of it buckles under the weight of heavy user interfaces and the desire for more knobs, buttons and faders. What if we were to create a device that sings to you and has its own musical inclinations, yet can also engage in a two-way dialog with another musician? Not something that can be controlled so much as guided and influenced, and that as a result guides and influences the user. But Muze also has its own desire to explore and will continually improvise on the melodies it creates with you. It is out of this ability to self-create that Muze becomes a partner and not just an instrument. For instance, we have played with it and then left it to play over lunch. When we return it has come up with something completely new, yet derivative. Sometimes what Muze creates is enjoyable, sometimes not. At which point you give Muze a little nudge and it creates something new. All of the code and circuits are open source. You can check out the Arduino code and Eagle circuit schematic on the site.
The team is planning to make it more musical, robust and simple, and would love to hear your thoughts and suggestions. Project […]
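The idea that "no single knob controls a single function, but rather a blend of functions" can be sketched as one rotation driving several musical parameters through overlapping response curves. Muze's actual mapping lives in its open-source Arduino code; every curve and constant below is invented for illustration.

```python
import math

# Hypothetical one-knob-to-many-parameters blend, in the spirit of
# Muze's interface. The curves, ranges and parameter names are all
# assumptions, not Teague Labs' mapping.

def knob_blend(rotation):
    """rotation in [0, 1] -> a dict of blended musical parameters."""
    tempo = 80 + 80 * math.sin(math.pi * rotation)     # peaks mid-turn
    density = 0.2 + 0.8 * rotation                     # rises steadily
    octave = 3 + round(1 - math.cos(2 * math.pi * rotation))
    return {"tempo": tempo, "density": density, "octave": octave}
```

Because the curves are out of phase, turning the knob never changes just one thing: a small nudge mid-range mostly shifts octave and density while tempo is near its plateau, which is what makes the instrument feel guided rather than controlled.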
- Hit The Beat – Physical drum machine by Lorenzo Bravi Hit The Beat is a physical drum machine that can play *anything*, making it possible for everyday objects to become musical […]
- illucia [Processing, MaxMSP] illucia is an OSC-based codebending instrument by Chris Novello aka paperkettle. It is a USB device with physical jacks that correspond to software patch points, which can be connected and disconnected using patch cables. It is also a console for routing information between computer programs, and strives to create relationships across systems that don't usually interact. Chris has already designed a number of applications that interact with the console, some using Processing, others using MaxMSP. Whilst the applications themselves are quite simple, they are nevertheless a means to raise questions about how controlling a particular application via a specific interface can change the experience of it. For now, there are four applications Chris intends to release as downloadables. Even though they still require the illucia console to experience fully, they are OSC-based so they can be controlled via any OSC interface, including a number of iPhone/iPad/Android applications already available. The four existing applications for illucia are (see video): ·PCO (Paddle Controlled Oscillator): a classic ball-and-paddle game. When pushed, it morphs into a function generator and spills abstract art. ·Soviet Life Sequencer: falling tetromino pieces generate step sequencer patterns, all remixable by Conway's Game of Life. ·War Machine: a crosshair blasts colorful explosions into a dense nest of shoots that approach from above. ·Pile of Secrets: a codebendable text editor. More videos and deeper documentation are on the way... In the meantime you can follow on Twitter or FB for more information. Project Page (Thanks […]
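The reason any OSC interface can drive these applications is that OSC messages have a simple, well-specified wire format: a null-padded address string, a type-tag string, then the arguments. The sketch below encodes a single-float message with the standard library (the address "/paddle/y" is a made-up example, not one of illucia's actual patch points).

```python
import struct

# Minimal OSC 1.0 message encoder for a single float32 argument,
# stdlib only. The address used in the test is hypothetical; real
# patch points depend on the application being controlled.

def osc_pad(b):
    """Null-pad bytes to the next 4-byte boundary (OSC strings always
    carry at least one trailing null)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Encode address + type tags ',f' + big-endian float32."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

Such a packet can be handed straight to `socket.sendto` toward whatever UDP port the receiving application listens on, which is all an OSC controller app on a phone is doing under the hood.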
Posted on: 18/12/2012