Created for the University of Sheffield’s Festival of the Mind by Noel Murphy and Jamie Salmon in collaboration with Professor Róbertus von Fáy-Siebenbürgen and PhD student Nabil Freij, the heliOscillator is an audio-visual installation that visualises the oscillations of sunspot size and density on a seven-part modular screen.
Each module is embedded with light-dependent resistors, allowing the light of the projector hitting the screen to be measured at 42 points across its surface; each sensor's measurement is then used to control the pitch of an oscillator. The software was created using openFrameworks; the light-sensor data is captured via an Arduino microcontroller and interpreted as sound using Max/MSP.
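A minimal sketch of the sensor-to-pitch mapping described above. The ranges and function names are assumptions for illustration, not the project's code: a 10-bit reading (0–1023, as an Arduino ADC would report) is mapped linearly onto a pitch range, one oscillator per sensor.

```python
# Hypothetical sketch (names and ranges assumed): map a 10-bit LDR
# reading to an oscillator pitch in Hz, the way each of the 42 sensors
# could drive one oscillator.

def ldr_to_pitch(reading, lo_hz=110.0, hi_hz=880.0, adc_max=1023):
    """Linearly map a raw ADC reading to a frequency range."""
    reading = max(0, min(adc_max, reading))   # clamp out-of-range values
    return lo_hz + (hi_hz - lo_hz) * reading / adc_max

# A dark patch of the projection yields a low pitch, a bright one a high pitch:
print(ldr_to_pitch(0), ldr_to_pitch(1023))   # -> 110.0 880.0
```

In the real installation the mapping could equally be logarithmic to match pitch perception; linear is shown here only for clarity.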
The projection itself is a data visualisation based on observations of individual sunspots. Each hexagonal section represents one sunspot, and the colour and shading of its subsections reflect the change in size and darkness of that spot over time. The data counts round the hexagon through these triangular sections like a clock, always displaying the last six data points for comparison. The six subsections match up with the light sensors, so every change in colour and intensity is measured and reflected in sound.
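The "clock" behaviour above can be sketched as a six-slot ring buffer per hexagon (the data layout is an assumption): each new observation pushes the oldest one off, so triangle *i* always shows the *i*-th most recent data point.

```python
# Minimal sketch, assuming each hexagon keeps its last six observations
# in a ring buffer; triangle i shows the i-th most recent data point.
from collections import deque

class HexagonClock:
    def __init__(self, sections=6):
        self.buffer = deque(maxlen=sections)  # oldest values fall off

    def push(self, value):
        self.buffer.append(value)

    def triangles(self):
        """Return values newest-first, one per triangle subsection."""
        return list(reversed(self.buffer))

h = HexagonClock()
for size in [3, 5, 8, 13, 21, 34, 55]:   # seven observations
    h.push(size)
print(h.triangles())   # -> [55, 34, 21, 13, 8, 5] (the oldest, 3, has rotated off)
```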
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows real-time manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it at the same time: arguably, though not strictly, a form of AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music-creation experience in which each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI; simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using OpenCV for Android, openFrameworks, and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
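The translation of touch coordinates to the remote surface can be illustrated like this. An axis-aligned rectangle is assumed for brevity; the real project, tracking a marker in a camera feed, would use a full perspective homography (e.g. OpenCV's `findHomography`).

```python
# Hedged sketch: map a touch point (in phone-screen pixels) that lands
# inside the tracked display's bounding quad to normalised (u, v)
# coordinates on the remote surface. Axis-aligned quad assumed.

def touch_to_surface(touch_xy, quad_min, quad_max):
    """Map a touch (px) inside the tracked quad to (u, v) in [0, 1]."""
    tx, ty = touch_xy
    x0, y0 = quad_min
    x1, y1 = quad_max
    u = (tx - x0) / (x1 - x0)
    v = (ty - y0) / (y1 - y0)
    return u, v

# A touch at the centre of the on-screen quad lands at the centre
# of the remote display:
print(touch_to_surface((160, 240), (60, 140), (260, 340)))  # -> (0.5, 0.5)
```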
- The Space Beyond Me [openFrameworks, Arduino, Processing] The Arriflex 16 ST body with UV-light source and motorised zoom lens. Julius von Bismarck and Andreas Schmelas have just open-sourced the code of their collaboration project "The Space Beyond Me". The project comprises an "apparatus for reviving spaces that are captured in celluloid" and was exhibited at Transmediale 2010 (Berlin) and several other festivals (right now it can be seen at the Ghent Film Festival in Belgium). The installation constructs a spatial representation from celluloid film by combining a modified 16mm camera with a UV-light projector. The device projects a film while moving in exactly the same way the camera operator moved the camera while shooting it. What happens if a projector moves, while projecting, in exactly the same way as the camera that recorded the film being projected? Something similar to the processes happening in the brain when we perceive our surroundings: virtual rooms or landscapes are composed from flat visual information, constructing a subjective representation of the world. The projector is placed centrally in a round room whose walls are painted with phosphorescent paint. The paint emits an afterglow of the image projected onto it, so the moving camera-projector keeps adding to the image. After the film has played, all scenes of the film are reproduced in their correct location. The film, which originally recorded a spatial setting, has been translated from a time-based medium back into a space. The software for the installation (available to download here) consists of several parts, including an openFrameworks scene-arrangement application, Arduino source code, and a Processing app (responsible for parsing the openFrameworks output into an Arduino-compatible PROGMEM-format array).
The openframeworks part includes an application for extracting "camera movement" out of a video and an application for arranging "scenes" onto a virtual stage. Project pages juliusvonbismarck.com | andreas-schmelas.de The Space Beyond Me from fenomenologie on Vimeo. »The Space Beyond Me« still, Transmediale 2010 Various drafts by […]
- L.S.D + Light Sequencer [iPhone, PD, Arduino, Sound] If you liked the 'Extending the Touchscreen' project by Michael Knuepfel, here are a few experiments by Benjamin Gaulon using displays to convert light to sound waves. He writes: Research and recent innovations have led to an amazing increase in the types and uses of visual displays and screens; indeed, in our predominantly visual culture, they are everywhere. A typical person carries at least one device with a screen, is presented with them in public places, and uses them at work and in many leisure activities. They are so ingrained in our everyday acts and habits that we don't even notice them anymore. L.S.D invites its users to engage in a new perception of their daily environment. In ecology this class of relationship is called commensalism. L.S.D feeds on light via two LDRs (light-dependent resistors) mounted on a suction cup, allowing the sensors to be attached to any screen surface. An analogue synthesizer converts the light input to sound waves. The device can be used in many different configurations and feeds from any light source. Although L.S.D can be controlled by any light source, its design is aimed at screen reading/listening. More videos on Ben's Vimeo. Project Page. Benjamin Gaulon is a researcher and artist with broad experience as an art consultant, public and conference speaker, graphic designer, and art-college lecturer. Issues like e-waste, obsolescence, and the disposable society have been the focus of his practice and theoretical research. His 'Recyclism' research seeks to establish an inter-disciplinary practice and collaborations by creating bridges between art, science, and activism, and, by doing so, shifting the boundaries between art, engineering, and sustainable […]
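The light-to-sound conversion can be sketched digitally (L.S.D itself uses an analogue synthesizer; the frequency range and sample rate below are assumptions): a brightness reading sets an oscillator frequency, which generates the audible waveform.

```python
# Minimal digital sketch of the light-to-sound idea: derive a frequency
# from an LDR brightness reading and synthesise a short sine buffer,
# standing in for the analogue oscillator's output.
import math

def light_to_wave(brightness, base_hz=220.0, span_hz=660.0,
                  sample_rate=8000, n_samples=80):
    """brightness in [0, 1] -> list of samples at the mapped frequency."""
    freq = base_hz + span_hz * brightness
    return [math.sin(2 * math.pi * freq * n / sample_rate)
            for n in range(n_samples)]

wave = light_to_wave(0.5)   # mid brightness -> a 550 Hz tone
```

Pointing the sensors at a flickering screen would modulate `brightness` frame by frame, which is what makes screen "reading" audible.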
- The Image Toaster – 6×6 pixel image of the day on your toast Created by Scott van Haastrecht for the Creative Technology course "Innovation Lab" at university, The Image Toaster searches the internet for images related to the day's date and toasts one onto your bread. Just in case you tend to forget those important dates, The Image Toaster will remind you with a 6×6-pixel image. This is only a prototype and needs to be plugged into a computer, but the hope is that eventually it will only need Wi-Fi and power. The current setup includes Arduino, Processing, Max/MSP, maxuino, and a set of stepper motors that shift the toast until the full image is created. Project […]
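The downsampling step implied above can be sketched like this (the threshold and block-averaging approach are assumptions): a grayscale image is reduced to the 6×6 binary pattern the stepper motors would burn into the bread.

```python
# Hedged sketch: reduce a grayscale image (2D list of 0-255 values,
# dimensions divisible by the grid size) to a 6x6 on/off toast pattern
# by block-averaging and thresholding.

def to_toast_pattern(image, grid=6, threshold=128):
    h, w = len(image), len(image[0])
    bh, bw = h // grid, w // grid
    pattern = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            block = [image[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            mean = sum(block) / len(block)
            row.append(1 if mean < threshold else 0)   # 1 = toast this cell
        pattern.append(row)
    return pattern

# A 12x12 image with a dark left half becomes three toasted cells per row:
img = [[0] * 6 + [255] * 6 for _ in range(12)]
print(to_toast_pattern(img)[0])   # -> [1, 1, 1, 0, 0, 0]
```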
- Tele-Present Water [MaxMSP, Arduino] Created by David Bowen, the Tele-Present Water installation draws information from the intensity and movement of the water in a remote location. Wave data is collected in real time from National Oceanic and Atmospheric Administration (NOAA) data buoy station 46075, Shumagin Islands, Alaska. The wave intensity and frequency are scaled and transferred to the mechanical grid structure, resulting in a simulation of the physical effects caused by the movement of water in this distant location. The installation uses Max/MSP to drive an Arduino Mega running Servo Firmata, with eleven 24-volt DC motors and drivers for the movement. In May this year Tele-Present Water received one of three ex aequo awards in Alternative Now: The 14th Media Art Biennale WRO 2011, Wrocław, Poland. //thanks for the tip Joost. Photo by Alicja Kołodziejczyk (source). Photo by Ewa Wójtowicz […]
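The scaling step described above might look something like this (all ranges and names are assumptions, not Bowen's values): a buoy's wave height and period are converted into a sweep amplitude and timing that the motors can reproduce.

```python
# Hypothetical sketch: convert a buoy's significant wave height (m) and
# period (s) into a motor sweep around a centre position, the kind of
# scaling Max/MSP might do before sending positions via Firmata.

def wave_to_servo(height_m, period_s,
                  max_height=10.0, centre_deg=90, sweep_deg=60):
    """Return (centre, amplitude in degrees, seconds per half-cycle)."""
    amplitude = sweep_deg * min(height_m, max_height) / max_height
    half_cycle = period_s / 2.0
    return centre_deg, amplitude, half_cycle

# 2.5 m waves every 8 s -> a 15-degree sweep each way, reversing every 4 s:
print(wave_to_servo(2.5, 8.0))   # -> (90, 15.0, 4.0)
```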
- Cubepix by Xavi’s Lab at Glassworks Barcelona Xavi's Lab, the special-projects division of Glassworks Barcelona, created Cubepix, a fully interactive, real-time projection-mapping installation in their studio that combines very simple materials (cardboard) with quite sophisticated tech (projection mapping, pixel rotation, sync, etc.). It was conceived and developed entirely by their resident technologist Xavi Tribo. The prototype was devised using a projector, a Microsoft Kinect, 8 Arduino boards, openFrameworks, 64 servo motors, and 64 cardboard boxes. Using all of the above, users are able to interact with and influence the way the boxes move and are illuminated. Not only that: the software knows how and when the boxes are going to turn and projects onto them accordingly. A wonderful and fun project indeed. Glassworks […]
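One small piece of the projection-mapping logic can be sketched as follows (the grid geometry and resolution are assumptions): the projector frame is divided into an 8×8 grid so that each box can be addressed, and content warped per box, as it rotates.

```python
# Minimal sketch: which of the 64 cardboard boxes does a projector
# pixel land on, and where within that box?

def pixel_to_box(x, y, screen_w=1024, screen_h=768, cols=8, rows=8):
    cell_w, cell_h = screen_w / cols, screen_h / rows
    col, row = int(x // cell_w), int(y // cell_h)
    local = ((x % cell_w) / cell_w, (y % cell_h) / cell_h)  # 0..1 in the box
    return row * cols + col, local

index, (u, v) = pixel_to_box(200, 100)
print(index)   # -> 9 (second row, second column of the 8x8 grid)
```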
- Sound of Honda – Ayrton Senna’s Fastest F1 Lap (1989) in Light and Sound This project, a collaboration between Dentsu, Honda Motor, and Rhizomatiks, brings back Senna’s engine sound from that lap 24 years ago in the form of an installation set on the original Suzuka circuit that uses light and […]
- Skube – Tangible interface to Last.fm & Spotify Radio Created by Andrew Nip, Ruben van de Vleuten, Malthe Borch, and Andrew Spitz, Skube is a music player that lets you discover and share music by physically interacting with custom-designed cubes that act as an interface to Last.fm and Spotify. Two modes are included, dependent on the object's orientation: Playlist and Discovery. Playlist plays the tracks on your Skube, while Discovery looks for tracks similar to the ones on your Skube so you can discover new music that still fits your taste. When Skubes are connected together, they act as one player that shuffles between all the playlists, and you can control the system as a whole using any Skube. SolidWorks was used to design the boxes and Max/MSP to coordinate the Skubes over a custom network. XBees allow the cubes to communicate wirelessly, and each Skube has an Arduino inside it. To play and manage the music, the Spotify and Last.fm APIs are accessed through Max/MSP. The Arduino manages all sensor inputs and outputs, and an FM module plays and syncs the music between all the Skubes. Reed switches and magnets detect which Skubes are physically connected, and piezos detect a single tap for play/pause and a double tap for skipping. Project Page | Andrew Nip | Ruben van de Vleuten | Malthe Borch | Andrew […]
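The piezo tap logic can be sketched as a simple timing classifier (the 300 ms window is an assumption, not Skube's actual value): two taps within the window count as a skip, a lone tap as play/pause.

```python
# Hedged sketch: classify a sequence of timestamped piezo taps into
# play/pause (single tap) or skip (double tap) events.

def classify_taps(tap_times_ms, window_ms=300):
    """Return a list of 'play_pause' / 'skip' events from tap timestamps."""
    events, i = [], 0
    while i < len(tap_times_ms):
        if (i + 1 < len(tap_times_ms)
                and tap_times_ms[i + 1] - tap_times_ms[i] <= window_ms):
            events.append("skip")        # double tap within the window
            i += 2
        else:
            events.append("play_pause")  # lone tap
            i += 1
    return events

print(classify_taps([0, 150, 1000]))   # -> ['skip', 'play_pause']
```

On the Arduino itself this would run as a small state machine with debouncing on the piezo input rather than over a recorded list.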