Created by Jonas Breme, Listen Carefully is a critique of digital music consumption that aims to spark debate about how we buy and experience music today. To counter passive listening, a custom-designed pair of headphones pauses playback whenever the user moves, encouraging focused listening and time off from other activities.
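The core behaviour is easy to picture in code. Below is a minimal illustrative sketch (not Breme's actual firmware, and the threshold value is a hypothetical tuning choice): playback is allowed only while recent accelerometer readings stay close to the 1 g resting magnitude.

```python
MOVE_THRESHOLD = 0.2  # g of deviation from rest; hypothetical tuning value

def should_play(accel_samples):
    """Allow playback only while the wearer stays still.

    accel_samples: recent (x, y, z) accelerometer readings in g;
    at rest the magnitude is roughly 1 g (gravity only).
    """
    for x, y, z in accel_samples:
        magnitude = (x * x + y * y + z * z) ** 0.5
        if abs(magnitude - 1.0) > MOVE_THRESHOLD:
            return False  # movement detected: pause playback
    return True
```

Any motion that pushes the magnitude away from 1 g, such as walking, would cut the music until the listener settles down again.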
The times we are living in are fast-moving and hectic. Modern technology is altering the way we live, work and consume, supporting a fast and superficial lifestyle, and ways to cope with these new challenges are hard to find. Nowadays it seems more and more important to critically reflect on one's behavior in order to define a viewpoint and find a healthy way to live.
The consumption of music has changed in recent years, mainly influenced by the technological development of media and playback devices. In many situations, music listening has shifted from a main activity to a side activity (H. Weber has coined the term passive listening). Music is no longer at the center of our attention, but is heard while focusing on other things. As a result of this change in perception, music is in danger of degenerating into unnoticed background noise.
Listen Carefully is one of three design concepts developed as part of Jonas' BA thesis on music-listening behavior at the Interface Design program at the University of Applied Sciences, Potsdam. The objects and concepts are not to be seen as market-ready products, but rather as critical design objects aiming to spark debate about today's forms of consumption.
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows realtime manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile screen purely as an input, the user views the remote content and manipulates it simultaneously, in effect a form of AR.

I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application: it is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input?

We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface.

Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation on large installations could feel more natural. Other applications could include a music creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI, simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks, and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
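The "translation of touch input coordinates" step can be sketched very simply. The real project tracks markers with OpenCV and openFrameworks; the toy version below (all names hypothetical) simplifies the tracked remote screen to an axis-aligned rectangle in the camera frame and maps a touch inside it to the remote display's own pixel coordinates.

```python
# Illustrative sketch only: the real system uses marker detection and a
# full perspective transform, not an axis-aligned rectangle.

def touch_to_remote(touch, tracked_rect, remote_size):
    """Map a touch on the phone screen to remote-display pixels.

    touch: (x, y) in camera/phone-screen pixels.
    tracked_rect: (left, top, width, height) of the remote screen as
    seen in the camera frame, simplified to an upright rectangle.
    remote_size: (width, height) of the remote display in its pixels.
    """
    tx, ty = touch
    left, top, w, h = tracked_rect
    u = (tx - left) / w  # normalised position inside the tracked rect
    v = (ty - top) / h
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None  # touch landed outside the remote screen
    return (u * remote_size[0], v * remote_size[1])
```

Touching the center of the on-camera rectangle thus lands at the center of the distant display, which is exactly the "directly manipulate a distant surface" illusion described above.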
- Muze [Arduino, Sound] Muze is the latest creation of Teague Labs, a collective operating alongside Teague, a long-established industrial design agency and the name behind products for companies such as Samsung and Microsoft. The device itself is a musical instrument that 'plays with you' and aims to provoke a two-way dialog between musicians. The device reads a palette of notes that it can in turn interpret and compose into various rhythms and phrases, strung together to form something musical. The user can then influence these strings of notes and rhythms to create entirely new compositions using a simple collection of 8 triggers/knobs which are manually inserted. No single knob controls a single function, but rather a blend of functions derived from its rotation.

The team explains the thinking: A couple of us started talking about the state of musical instruments, digital music creation, and how so much of it buckles under the weight of heavy user interfaces and the desire for more knobs, buttons and faders. What if we were to create a device that sings to you and has its own musical inclinations, yet can also engage in a two-way dialog with another musician? Not something that can be controlled so much as guided and influenced, and that as a result guides and influences the user. But Muze also has its own desire to explore and will continually improvise on the melodies it creates with you. It is out of this ability to self-create that Muze becomes a partner and not just an instrument. For instance, we have played with it and then left it to play over lunch. When we return it has come up with something completely new, yet derivative. Sometimes what Muze creates is enjoyable, sometimes not. At which point you give Muze a little nudge and it creates something new.

All of the code and circuits are open source. You can check out the Arduino code and Eagle circuit schematic on the site.
The team is planning to make it more musical, robust and simple, and would love to hear your thoughts and suggestions. Project […]
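The "no single knob controls a single function" idea can be illustrated with a small sketch. This is not Muze's actual Arduino code (which is on the project site); it is a hypothetical example of spreading one knob's rotation across several musical parameters through overlapping response curves, so every turn shifts the whole blend.

```python
import math

def knob_to_params(rotation):
    """Map one knob's rotation (0.0 to 1.0) onto a blend of parameters.

    Each parameter follows its own curve, so no position of the knob
    isolates a single function; they all move together.
    """
    return {
        "tempo":   0.5 + 0.5 * math.sin(rotation * math.pi),  # peaks mid-turn
        "density": rotation ** 2,                             # grows late in the turn
        "drift":   1.0 - rotation,                            # fades as you turn up
    }
```

With eight such knobs, each using different curves, the player guides the overall character of the music rather than dialing in any one value, which matches the "guided and influenced" framing above.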
- GravSynth [iPhone] GravSynth is a musical instrument app for the iPhone that allows you to produce and alter sounds using the touch interface and tilt motion (accelerometer) of the iPhone. The app includes an analog synthesizer, an arpeggiator and touch panels. Volume and pitch are controlled via the touch panel on the right, letting you play the analog synthesizer. As for the arpeggiator, by using a combination of the eight touch pads on the left and different degrees of tilt, you can create a variety of musical patterns and phrases. You can use the app in a number of ways: add it to music you have already composed, play it together with your favorite tracks, or play along with your friends using other iPhone music apps.

The app's interface is absolutely wonderful. A simple tap on the edit button brings up additional controls, while using both fingers you can alter the pitch or volume of the track. Tilting the device changes the speed. Using all three controls you can produce some pretty awesome sounds. The musical phrases generated by the arpeggiator are based on the "Lydian Chromatic Concept of Tonal Organization," an influential jazz music theory. Tilting your iPhone (or iPod Touch) vertically reduces gravitational drag, creating a high-tension musical phrase, whereas holding it horizontally increases gravitational pull and creates a more stable musical phrase. In this way, manipulating the arpeggiator allows you to reproduce complex improvisational pieces of music in the same manner as a jazz performer.

Features:
• Musical performance via an easy-to-use touch pad (including pitch and volume control).
• Arpeggiator controlled by varying combinations of gravity (tilt) and an eight-pad touch panel.
• Phrase speed controlled by horizontal tilt.
• Keyboard with 12 different musical keys to choose from.
• Rhythm synchronization via TapTempo.
• Analog synthesizer (based on the open-source mobilesynth, http://code.google.com/p/mobilesynth/).

Platform: iPhone Version: 1.1 Cost: $1.99 Developer: KAYAC […]
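The tilt-to-tension idea can be sketched in a few lines. This is purely an illustration of the mapping described above, not KAYAC's implementation: as the phone tilts toward vertical, the arpeggio widens to take in more (and tenser) degrees of the Lydian mode, including the raised fourth.

```python
LYDIAN = [0, 2, 4, 6, 7, 9, 11]  # semitone offsets of the Lydian mode

def arpeggio_for_tilt(tilt_deg, root=60):
    """Pick arpeggio notes from the tilt angle.

    tilt_deg: 0 (flat/horizontal, stable) to 90 (vertical, high tension).
    root: MIDI note of the key (default 60 = middle C, an assumption).
    Returns a list of MIDI notes: a narrow, stable phrase when flat,
    widening to the full mode as the device tilts upward.
    """
    tension = min(max(tilt_deg / 90.0, 0.0), 1.0)
    span = 3 + round(tension * (len(LYDIAN) - 3))  # 3 to 7 scale degrees
    return [root + LYDIAN[i] for i in range(span)]
```

Held flat, the sketch yields a plain triad-like phrase; tipped upright, the phrase spans the whole mode, roughly mirroring the drag/tension metaphor in the app's description.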
- SoundGyro [iPhone, openFrameworks] Latest in the series of sound apps from Henry Chu is SoundGyro, a sound toy that uses the iPhone 4's gyroscope to alter pitch and octave, or change the key. The app is still in development and Henry is still working on some advanced features. We asked Henry about SoundGyro; here is some info + the inspiration behind the app.

Once the iPhone 4 was released, I wanted to try the gyroscope, as it gives more accurate position/orientation information than the accelerometer. I had the idea to create a theremin-like instrument that translates hand position into musical notes. I had done it on the iPhone and Wiimote before, but the result didn't impress me, mostly because I could not combine two axes of movement together; the reading would become very unstable. Another reason is that the sensor is not sensitive enough to capture small gestures, which is a big hurdle in creating expressive music if I can only play long notes slowly. Using the gyroscope I can combine three axes of movement together without messing them all up. Moreover, the device can now detect the rotation rate, which adds another dimension of control. The iPad and 3GS have a digital compass which behaves in a similar way, but the sensor just got better in the iPhone 4.

Using SoundGyro is simple: tap anywhere on the screen to start the sound. At the moment it makes no difference where you tap, but I might add some more control for advanced playing. Tilt the device upward and downward to change the pitch. To move to the upper or lower octave, rotate the device; you can go up to +/- 3 octaves. Rolling controls the volume. The default sound is A, and in the setup page you can change the key. You can also switch note snapping on/off, which lets you play discrete notes or slide between notes.

Wonderfully, SoundGyro as you see in the video below was only one day's(!) work. Henry is keen to do more, so keep an eye on his Vimeo account for new videos + demos.
In the meantime, Sound Yeah was bounced from the App Store approval process for using a private API which Henry forgot to remove. Nevertheless, it is back in the approval queue and (hopefully) should be available in the next few […]
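The note-snapping toggle Henry describes is a classic bit of instrument code: quantise a continuous pitch value to the nearest semitone so the player gets discrete notes instead of slides. A minimal sketch, with all names and the tilt-to-semitone scaling assumed for illustration:

```python
def pitch_from_tilt(tilt, snap=True, key_offset=0):
    """Turn a continuous tilt value into a frequency in Hz.

    tilt: continuous value mapped to semitones above/below A (MIDI 69).
    snap: when True, quantise to the nearest semitone (discrete notes);
    when False, glide smoothly between notes.
    key_offset: transposition in semitones (the setup-page key change).
    """
    midi = 69 + key_offset + tilt
    if snap:
        midi = round(midi)
    return 440.0 * 2 ** ((midi - 69) / 12.0)
```

With snapping on, a small wobble around a note stays on that note; with snapping off, the same wobble bends the pitch continuously, theremin-style.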
- Hit The Beat – Physical drum machine by Lorenzo Bravi Hit The Beat is a physical drum machine that can play *anything*, making it possible for everyday objects to become musical […]
- Lilt Line for Wii [Games] A retro rhythm-racing beat 'em up and one of our favourite iPhone games ever is making its way to the Nintendo Wii. If you haven't played it before, Lilt Line is a wonderful example of hardware and software playing together in harmony. The sync between sound, touch and tilt is unique to the platform and there are very few (if any) games like it. Its port to the Wii does make sense, given that the WiiMote offers a similar interface, albeit in a somewhat different context. Of course, it would be great to see Lilt Line on the iPad too, but Gordon, the name behind the game, is slightly sceptical of the iPad's size and tilt UI mechanics. The Nintendo Wii version of Lilt Line is being published by Gaijin Games. The release date is not known yet, although nintendolife.com had a chance to have a play with the game and you can read the preview here. Previously: lilt line [iPhone, Games] Get the iPhone version here […]
- RHIFID Speakers [Processing] Using a combination of RFID technology, Processing and Arduino, Jacek Barcikowski, Filippo Cuttica and Ulrik A. Hogrebe created an installation of location-aware speakers allowing the user to interact with music and the environment by moving the speakers around the room. The RHIFID speakers were used for the project "This is a Journey into Sound" - an educational trip through the history of electronica, rock and hip hop over the past 50 years. A grid is mapped out using RFID tags (the red things on the floor), allowing each user of the two speakers to listen to a song individually, within a specific genre and decade, by placing a speaker on an RFID tag. Each tag is mapped to a song iconic of that decade in the appropriate genre. Putting the two speakers together triggers them into playing one common song, creating a social listening experience. The RHIFID speakers can also be modified into musical creators rather than just controllers, allowing location and rotation to control such things as pitch, samples and effects. Project page here and here. (use headphones when watching the […]
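The tag-to-song grid boils down to a lookup plus one social rule. The installation itself runs on Processing and Arduino; the sketch below is a hypothetical illustration (the tag names and song strings are invented, not the installation's actual grid): each RFID tag resolves to a (genre, decade) song, and two speakers on the same tag join on one common track.

```python
# Illustrative entries only; the real grid covers 50 years of genres.
TAG_TO_SONG = {
    "tag_hiphop_1980s": "hip hop / 1980s track",
    "tag_rock_1960s":   "rock / 1960s track",
}
COMMON_SONG = "shared social-listening track"

def song_for(speaker_a_tag, speaker_b_tag=None):
    """Return what a speaker should play.

    When both speakers sit on the same tag (our stand-in for 'putting
    the two speakers together'), they share one common song.
    """
    if speaker_b_tag is not None and speaker_a_tag == speaker_b_tag:
        return COMMON_SONG
    return TAG_TO_SONG.get(speaker_a_tag)  # None for an unmapped tag
```

The "musical creator" mode mentioned above would replace the song lookup with continuous parameters (pitch, samples, effects) derived from the same location and rotation data.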
- flight404 at Decode / V&A [Events, News] Robert Hodgin aka flight404 has just posted this video of an application he is working on for the Decode event at London's V&A, opening next month. Robert was asked to rework his older Solar piece so that it could be audio-responsive in real time. While the details of the actual exhibit are as yet unknown, it is nevertheless exciting to see Robert's work at the V&A. The video at the bottom is the older piece, but do make sure you watch it in HD / full screen. He will be joined by names such as Golan Levin, Daniel Brown, Daniel Rozin, Troika and Simon Heijdens. More about the event here. 8 December 2009 - 11 April 2010 // Curated in collaboration with onedotzero (via Homage to Radiolab « all manner of […]
Posted on: 27/08/2012
- Freelance Interactive Producers at Psyop
- Senior Digital Designer at CLEVER°FRANKE
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- 3D Technologist at INDG
- Creative Director at INDG
- Lead Developer at INDG
- Web Developer at &Associates
- Creative Technologist at Rewind FX
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific