Created by Dennis P Paul, An Instrument for the Sonification of Everyday Things is a “serious musical instrument” that rotates everyday things, scans their surfaces, and transforms them into audible frequencies.
A variety of everyday objects can be mounted in the instrument. Their silhouettes define loops, melodies and rhythms; mundane things are thus reinterpreted as musical notation. Playing the instrument is a mixture of practice, anticipation, and serendipity.
The instrument was built from aluminum tubes, white POM, black acrylic glass, a high-precision distance-measuring laser (with the kind support of Micro-Epsilon), a stepper motor, and a few bits and bobs, Dennis writes. A custom translator and controller module, written in Processing using JSyn, transforms the measured distance values into audible frequencies, notes, and scales (minor, major, pentatonic, …). It also precisely controls the stepper motor’s speed to sync with other instruments and musicians.
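The distance-to-note mapping the translator module performs can be sketched roughly as follows. This is not Dennis's code — the class name, distance range, base frequency, and scale tables are all assumptions for illustration: a reading is clamped and mapped linearly onto scale degrees, and the chosen degree is converted to a frequency.

```java
// Hypothetical sketch of a distance-to-scale-note mapping, in the spirit
// of the instrument's translator module. Names and ranges are assumptions.
public class DistanceToNote {
    // Semitone offsets for the scales mentioned in the article.
    static final int[] MAJOR      = {0, 2, 4, 5, 7, 9, 11};
    static final int[] MINOR      = {0, 2, 3, 5, 7, 8, 10};
    static final int[] PENTATONIC = {0, 2, 4, 7, 9};

    /** Map a laser reading (minMm..maxMm) linearly onto scale degrees,
     *  then convert the degree to a frequency in Hz (assumed 440 Hz root). */
    static double toFrequency(double mm, double minMm, double maxMm,
                              int[] scale, int octaves) {
        double t = Math.max(0, Math.min(1, (mm - minMm) / (maxMm - minMm)));
        int steps = scale.length * octaves;
        int degree = (int) Math.round(t * (steps - 1));
        int semitones = 12 * (degree / scale.length) + scale[degree % scale.length];
        return 440.0 * Math.pow(2, semitones / 12.0);
    }

    public static void main(String[] args) {
        // Closest reading -> root note; farthest -> top of the range.
        System.out.println(toFrequency(50, 50, 250, PENTATONIC, 2));
        System.out.println(toFrequency(250, 50, 250, PENTATONIC, 2));
    }
}
```

Quantising to scale degrees rather than mapping distance directly to pitch is what makes an object's silhouette read as a melody instead of a continuous glissando.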
- faderTouch 3.0 with AV Instruments [Processing] VJ Fader shows off the latest faderTouch 3.0, a portable rear-projection touchscreen interface he developed. The display can be viewed from both front and rear. The audio-visual instruments are programmed in Processing, triggering sounds in Ableton Live via MIDI. For more, see his video channel on Vimeo. UPDATE 12.04.2010: Added new video guide below (Thanks […]
- Microsonic Landscapes by Realitat – Transforming sound into matter Created by Realitat, Microsonic Landscapes is an algorithmic exploration of music. Each favourite album is converted into a three-dimensional representation and "proposes a new spatial and unique journey by transforming sound into matter/space: the hidden into something visible." Albums include works by Portishead, Antony and the Johnsons, Nick Drake, Einstürzende Neubauten and Für Alina. Each object stacks the album's tracks as layers in a single cylinder. Created with Processing, printed with a Makerbot. Project Site | Realitat Realitat is a research and experimental studio founded by Juan Manuel de J. Escalante in Mexico City. Its work involves digital media, music, architecture, graphic design and art. See also reflection-ll by Andreas Nicholas Fisher. Below is the trailer for Juan's upcoming generative art workshop at the Gray Area Foundation for the Arts. Buy your ticket […]
- Radius Music [Processing] Created by Dave Young aka Henderson, Radius Music combines ideas from cartography and graphic scores as a means to produce sound. The device is an autonomous revolving machine that reads the distance between itself and other objects in real time. As it slowly rotates and scans the room, it outputs this radial distance as a relative sonic frequency and a corresponding visual score. The synthesis techniques used to generate the audio include frequency modulation and additive synthesis, with phasing effects providing a procedural rhythmic element. The circular rotation of the piece evokes early sample culture: each device plays a loop of sound approximately 3 seconds long, sampling the positions of people and objects in the space. Two devices are situated in the room, documenting sonically and visually the dynamic real-time readings taken from the rotating ultrasonic distance sensors. As people walk in and out of the room, they alter the distance readings and, therefore, the sounds. View source code. Built using Arduino, Pure Data, and Processing. Radius Music Henderson is a new media artist currently based between Dublin, IRL and Den Haag, NL. He works with electronics, sound and architectural ideas in order to create generative and interactive situations. He has finished his final year of study at the National College of Art and Design, Dublin, and has completed a thesis titled Generative Systems: Authorship, Obsolescence and Production in Brian Eno's 77 Million […]
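The distance-controlled FM synthesis described above can be illustrated with a minimal sketch. This is not the project's code (which uses Pure Data) — the frequency band, modulation ratio, and index here are assumptions: the sensor reading sets the carrier, and a fixed modulator adds sidebands over a ~3-second loop.

```java
// Hypothetical sketch: an ultrasonic distance reading sets an FM carrier
// frequency, and a 3-second loop of samples is rendered, echoing the
// loops described above. All parameter values are assumptions.
public class RadiusFm {
    /** Map a radial distance (cm) linearly onto an assumed carrier band. */
    static double distanceToFreq(double cm, double maxCm) {
        double t = Math.max(0, Math.min(1, cm / maxCm));
        return 100.0 + t * 900.0;  // 100 Hz (near) .. 1000 Hz (far)
    }

    /** One FM sample: sin(2*pi*fc*t + index * sin(2*pi*fm*t)). */
    static double fmSample(double t, double fc, double fm, double index) {
        return Math.sin(2 * Math.PI * fc * t + index * Math.sin(2 * Math.PI * fm * t));
    }

    public static void main(String[] args) {
        double fc = distanceToFreq(120, 300);  // a reading 120 cm into a 300 cm range
        // Render a 3-second loop at 44.1 kHz, modulator at half the carrier.
        double[] loop = new double[3 * 44100];
        for (int i = 0; i < loop.length; i++)
            loop[i] = fmSample(i / 44100.0, fc, fc / 2, 2.0);
        System.out.println(fc);
    }
}
```

Because each full rotation re-samples the room, anything that moves between scans changes the carrier frequencies on the next pass — which is exactly the interactive behaviour the piece exploits.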
- Power of One #Point – Refracting laser light installation by Shohei Fujimoto Created by Shohei Fujimoto, Power of One #Point is an installation exploring input and output using laser and reflecting […]
- 12 – Sound sequencer comprised of 12 digitally controlled music boxes 12 is a sound machine comprised of 12 custom built music boxes with specially tuned lamellae controlled by on/off switches and […]
- Kinect – One Week Later [Processing, oF, Cinder, MaxMSP] Last week we wrote about the wonderful work that happened over the weekend after the release of the open-source XBox Kinect drivers. Today we look at what has happened since then and how the Microsoft gadget is being utilised in the creative-code community. In case you missed our post from last week, you can see it here: Kinect – OpenSource [News] Chris from ProjectAllusion.com got to play with the Kinect and one late night made this little demo in Processing using the hacked Kinect drivers. The Processing app sends out OSC with depth information based on the level of detail and the defined plane. The iPad app uses TouchOSC to send different values to the Processing app. - Daniel Reetz and Matti Kariluoma have been hacking a PowerShot A540 camera for infrared sensitivity, enabling you to see the Kinect's projected infrared dots in space. Microsoft’s new Kinect sensor is garnering a lot of attention from the hacking community, but the technical specifics of how it works still aren’t clear. I am working to understand the technology at a fundamental level – my interest is in the optical side of Kinect. My ultimate goal is to make the sensor nearsighted, so that the depth resolution can be used to scan small objects. The first step in understanding a technology is to look at it — that’s why teardowns like this one at iFixit are so important. - Ben at KODE80, the creator of Holo Toy, also created this quite wonderful demo of the Kinect tracking your position in space and adjusting the on-screen image accordingly, creating the illusion of a 3D image. Several months ago I threw together an OSX HoloToy demo that used OpenCV and the iSight camera to replicate the facial recognition head tracking used in the iPhone 4/iPod touch version. This seemed like a perfect place to insert the Kinect! The above video shows various scenes with the perspective controlled via the Kinect.
At this point it is simply tracking a specified depth range; however, with motion tracking of the depth map and other techniques, this could be really special. - Philipp Robb has some early experiments with a Microsoft Kinect depth camera on a mobile robot base. Say hello to KinectBot. The robot uses the camera for 3D mapping and follows gestural directions. It's basically a pimped iRobot Create with a battery-powered Kinect which streams the depth and color images to a remote host for SLAM and 3D map processing. - Peter Kirn covered the work Ben X Tan was doing with the Kinect to perform MIDI control. Result: depth-sensing, gestural musical manipulations! From the description: Coded in C#.net using this: http://codelaboratories.com/nui Very hacky, ugly, yucky alpha prototype; source code available here: http://benxtan.com/temp/pmidickinect.zip Next project is making a version of pmidic that uses Kinect. Then you can control Ableton Live or any other MIDI software or hardware with your limbs. Isn’t that amazing! If you are interested, you should also check out: http://pmidic.sourceforge.net/ http://benxtan.com - Yesterday, Stephan Maximilian Huber posted this video of a Joy Division-esque realtime 3D scan using the Kinect, where points are connected only horizontally. Very effective and quite beautiful. - Simultaneously, Dominick D'Aniello is working on Kinect Object Manipulation, creating a system using openFrameworks that allows you to rotate and manipulate 3D objects using the Kinect. A threshold is used on the depth map to filter out everything but my hands, and then blob detection is used to locate their centers. This information is then used to scale and rotate an onscreen object. Note that because the Kinect provides depth information, the object can be rotated on both its Z and Y axes. With a bit of work, a gesture could theoretically also be made to rotate along the X axis.
- A few days ago we posted a quick installation prototype by Theo Watson and Emily Gobeille (design-io.com) using the libfreenect Kinect drivers and ofxKinect (an openFrameworks addon). The system does skeleton tracking on the arm, determining where the shoulder, elbow, and wrist are, and uses this to control the movement and posture of the giant bird! - In other great news, the Kinect now also works with MaxMSP, thanks to Jean-Marc Pelletier. It's still very alpha. I still have to implement "unique" mode, multiple camera support, proper opening/closing, and I can't seem to be able to release the camera properly, but the video streams work as they should. Read more on the forums. - Also, the Kinect now runs in VVVV. Late-evening live coding at node10 by Julien Vulliet (thanks @defetto). - Last week Rui Medeira also ported the drivers to the Cinder framework, and this morning Robert Hodgin aka Flight404 posted these videos to his vimeo account. Made with Cinder and the Kinect sensor. Runs in realtime. Another great week of Kinect projects. The work is finally beginning to take shape beyond tech demos, which is wonderful to see. I highly doubt we will be posting any more updates of this nature, as future work will develop as individual projects which will require their own posts. Big up once again to the communities, including openFrameworks, Processing, Cinder, MaxMSP and many […]
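The depth-threshold-plus-blob-centre approach mentioned in the object-manipulation demo above is a common pattern in these Kinect hacks, and is simple enough to sketch. This is not the demo's code (which uses openFrameworks) — the depth band and array shape are assumptions: keep only pixels within a near band (e.g. the hands), then take the centroid of what remains.

```java
// Hypothetical illustration of depth-band filtering followed by a
// centroid ("blob centre") computation on a Kinect-style depth map.
// Depth units and band limits are assumptions.
public class DepthCentroid {
    /** Return {x, y} centroid of pixels with near <= depth < far,
     *  or null if no pixel falls inside the band. */
    static double[] centroid(int[][] depth, int near, int far) {
        double sx = 0, sy = 0;
        int n = 0;
        for (int y = 0; y < depth.length; y++)
            for (int x = 0; x < depth[y].length; x++)
                if (depth[y][x] >= near && depth[y][x] < far) {
                    sx += x;
                    sy += y;
                    n++;
                }
        return n == 0 ? null : new double[]{sx / n, sy / n};
    }

    public static void main(String[] args) {
        // Tiny 2x2 depth map: one "hand" pixel at (0, 1) within the band.
        int[][] depth = {{900, 900}, {500, 900}};
        double[] c = centroid(depth, 400, 600);
        System.out.println(c[0] + ", " + c[1]);
    }
}
```

Tracking the centroids of two such bands frame-to-frame gives the hand positions (and, via their depths, the Z information) used to scale and rotate the on-screen object.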
- µtagger Alpha: A GML Field Recorder [c++, Objects] Last September, the GML group launched the GML Field Recorder Challenge: an invitation to artists and hackers to design a DIY hardware and software solution for unobtrusively recording graffiti motion data during a graffiti writer’s normal practice in the city. The winning project would receive 1,200 euros. Earlier this year, our guest writer and contributor Joshua Noble took on the challenge by creating a custom piece of hardware for recording graffiti tags that costs no more than $170.97 (126.81 euros). The setup is based around an optical mouse that knows how it is being tilted. An ultra-bright LED was installed to ensure that the mouse can read properly off the wall, combined with special lenses, a cut acrylic casing, batteries, an SD card slot, a Teensy++ board, and a bunch of wires. Once you have your tag data saved on the SD card, it's easy enough to use it in one of the freely available GML Processing/openFrameworks apps to visualise it or cut templates. Josh has posted a detailed walkthrough of the making, with code and templates available for download. Next steps include trying out different optical sensors, like the ADNS9550, for higher resolution; he will also be making an iPod/iPhone OF app that you can use with a simple chip plugged into the bottom of the iPhone to grab signals directly from the tagger and log live. More info on Josh's […]
- Workshops: Mind the Beep by Yuri Suzuki + Camera Drama by Roel Wouters In November last year, ECAL University of Art and Design Lausanne in Switzerland hosted two very interesting workshops aimed at its Media & Interaction Design and Industrial Design students. One workshop was led by Yuri Suzuki and the other by Roel Wouters. Having visited ECAL last year for end-of-year reviews, I can say it is one of the most fascinating schools I have been to. Beyond the beautiful building designed by Bernard Tschumi, both the diversity and quality of the work as well as the selection of invited lecturers continue to draw our attention. Mind the Beep by Yuri Suzuki This one-week workshop was led by the London-based Japanese designer Yuri Suzuki. Projects were realised by the Media & Interaction Design students in collaboration with the Industrial Design students from ECAL. The workshop was based on the concept of re-designing the soundscape: reconsidering and redesigning alert sounds such as alarm clocks, mobile phone ringtones, and bike bells, improving our surroundings with sound. www.yurisuzuki.com This was our first workshop with M&ID students collaborating with Industrial Design students, both at the Bachelor level. The students were not used to working with sound or generating sounds, so they went through a lot of experimentation and trials. At the end we had a collection of objects playing with the topic, all in the form of working prototypes. The video contains just a selection of them. One of these projects, the high heels, will be presented next week in Milan during the Salone del Mobile in the exhibition «Savoir-Faire» from ECAL at Spazio Orso 16. Camera Drama by Roel Wouters This workshop was led by the Dutch designer Roel Wouters. Projects were realised by the Media & Interaction Design students in collaboration with the Industrial Design students. The brief was to build an apparatus that produces videos the world has never seen before.
www.roelwouters.com This workshop also mixed M&ID and Industrial Design students, and ran in parallel with Yuri Suzuki's workshop. The students worked with embedded cameras and developed apparatuses for them (using GoPro cams, smartphones, etc.). The video contains just a selection of the projects. Some will be presented next week in Milan during the Salone del Mobile in the exhibition «Savoir-Faire» from ECAL at Spazio Orso 16. -- For more information on courses available at ECAL, […]
Posted on: 17/09/2012