Created by Amnon Owed, this is an experiment that uses a KORG nanoKEY and Processing to translate sound into visual objects. While a tone is played, all objects originating from that tone grow in size. The strength with which a key is struck (velocity) determines an object's initial location on the screen, so different ways of playing the keyboard produce different visual compositions. In this case, Amnon is playing with black and white circles, resulting in some quite bold but beautiful graphic compositions.
Amnon explains: The MIDI signal is sent to Ableton Live simultaneously. I wasn't feeling the audio though, so I decided to use another track instead. In the end it works out pretty well. Since this started purely as a proof-of-concept experiment, I didn't go overboard with the complexity of the visuals. I'm a sucker for circles anyway, but in black and white the result is pretty stylish, in my opinion.
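The note-to-object mapping described above can be sketched in plain Java. This is a minimal sketch under stated assumptions, not Amnon's actual code: class and field names here are hypothetical, and it assumes a standard MIDI note-on/note-off model where velocity (0–127) picks the circle's position and a per-frame update grows every circle whose originating note is still held.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical reconstruction of the described behaviour:
// - a note-on spawns a circle placed according to velocity and pitch
// - while a note is held, every circle born from that note grows
public class VelocityCircles {
    static final int WIDTH = 1280, HEIGHT = 720;

    static class Circle {
        final int pitch;          // MIDI note that spawned this circle
        float x, y, radius;
        Circle(int pitch, float x, float y) {
            this.pitch = pitch;
            this.x = x;
            this.y = y;
            this.radius = 4f;     // arbitrary starting size
        }
    }

    final List<Circle> circles = new ArrayList<>();
    final boolean[] held = new boolean[128];  // which MIDI notes are sounding

    // Velocity maps to horizontal position, pitch to vertical (one
    // plausible mapping; the original may differ).
    void noteOn(int pitch, int velocity) {
        held[pitch] = true;
        float x = velocity / 127f * WIDTH;
        float y = pitch / 127f * HEIGHT;
        circles.add(new Circle(pitch, x, y));
    }

    void noteOff(int pitch) {
        held[pitch] = false;
    }

    // Called once per frame: circles of sounding notes grow in size.
    void update() {
        for (Circle c : circles)
            if (held[c.pitch]) c.radius += 0.5f;
    }
}
```

In a real Processing sketch the note-on/note-off callbacks would come from a MIDI library and `update()` would run inside `draw()`, followed by drawing each circle as a black or white ellipse.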
Amnon Owed is a graphic designer based in The Hague, the Netherlands. His interests include graphic design, visual programming, sound design, motion graphics and animation. See more of his work at amnonp5.wordpress.com, his Flickr and his Cargo pages.
- Quasar [Processing, Sound] Quasar is an audiovisual artwork created by envis precisely using Processing and Ableton Live for the fourth issue of WEAVE magazine (weave.de). The team writes: It is an audiovisual composition based on a single set of MIDI notes. Those notes influence both music and graphics at the same time. We did not take the usual route of creating a piece of music first and then finding a visualization for it; instead, both elements were developed at the same time. When we decided to change something in the graphics, the changes immediately influenced the music as well. envis precisely is a studio for interactive art and design based in Munich, Germany, founded in 2009 by Thomas Gläser, Markus Jaritz and Philipp E Sackl. See also their first iPhone App: […]
- A Drifting Up [Processing] Another piece by Reza Ali, whom we mentioned yesterday (2D SuperShapes). This time around, it's an audio-reactive Processing application simulating a live organism. Make sure you watch it in HD. This is an audio-reactive algorithmic visual art piece that uses the concepts of charged particles and flocking to simulate an organism that is alive and composed of micro-organisms. The movement is rather pleasing, so I decided to exhibit the algorithm in a rather catchy video-art fashion. Audio: Jon Hopkins – "A Drifting Up". If you like this, you may also like COP15 Identity […]
- flight404 at Decode / V&A [Events, News] Robert Hodgin aka flight404 has just posted this video of an application he is working on for the Decode event at London's V&A, opening next month. Robert was asked to rework his older Solar piece so that it could be audio-responsive in real time. Whilst the details of the actual exhibit are as yet unknown, it is nevertheless exciting to see Robert's work at the V&A. The video at the bottom is the older piece, but do make sure you watch it in HD / full screen. He will be joined by names such as Golan Levin, Daniel Brown, Daniel Rozin, Troika and Simon Heijdens. More about the event here. 8 December 2009 - 11 April 2010 // Curated in collaboration with onedotzero (via Homage to Radiolab « all manner of […]
- TypeStar [Processing, Sound] TypeStar is a lyric visualizer created by Scott Garner that renders the lyrics of a song in real time according to a number of preset visualization schemes. The Processing sketch can be controlled via keyboard and mouse, along with rough support for SMS control on laptops, iPhone control via OSCemote and oscP5, joystick control via proCONTROLL, and MIDI controller support via proMIDI. Songs can be added by tracking down an UltraStar text file for a given song along with an mp3 and album art in JPEG format. The application is available as a Mac, Windows and Linux download. (Thanks […]
- Dokfest Forest Identity [Processing] For the 26th edition of the Kassel Documentary Film and Video Festival, FIELD designed an identity based on the festival's film submission database. Set in a thick and obscure forest like the wooded surroundings of Kassel, the colourful spheres form a sculptural representation of the programme – each of them represents a film, video, or installation work shown at the festival. A unique structure emerges from the forest when hundreds of these individual objects come together – just as the festival brings together artists and visitors from all over the world, regional talent and established filmmakers, professionals and interested locals. Each film is represented by a sphere, with its size showing the length of the work. When two films coincide in all three parameters, meaning their spheres would sit in the same position, they cluster around this position like grapes on a vine. A generative colour palette assigns a unique shade to each represented work, which it keeps throughout all diagrams. The forest in the images was rendered using LuxRender and took about 8 hours on a large Amazon EC2 instance. Geometry was generated in a custom Processing application and then imported into Blender. See images below + make sure you visit field.io for more wonderful work by the London-based studio. For more great Processing projects on CAN, see […]
- Daiku [Processing, Sound] A few months ago, Marc Tiedemann was asked to create animations for the upcoming documentary "DAIKU | 10000 Japaner singen Beethovens Neunte", to be aired 31.12.2011 at 15:20 on ARTE. They required a main title and graphics within the movie. Friedemann Hottenbacher, the director of the piece, showed him the work of Malinowski, which turned out to be the key inspiration for the graphics. Thinking of Japan and Osaka, big cities as well as tradition came to mind, so the result should in a way resemble both while still keeping the aesthetics of a notational system. So I started to program my very own Music Machine based on a simple FFT analysis via Processing and played around with shapes and depth in 3D space. The two final pieces below, created using Processing, are the result. The one below also contains the Japanese translation of the title within the graphic, added using After Effects. Project Page /via […]
- KAIST Mobile Phone Orchestra [iPhone, Processing, Sound] The KAIST Mobile Phone Orchestra (KAMPO) aims to explore the potential of mobile media for music and media art. In addition to suggesting new and innovative mobile performance paradigms through concerts, KAMPO conducts active research/education in music and mobile media as well as software development. The performances include five participants equipped with iPhones operating different components of the iPhone app, playing different instruments. Besides just triggering instrument sets in Ableton Live, the main display application also creates a loop, as in a real orchestra, sending conduction messages back to performers and their devices. The iPhone app used in the performance is made using the Apple iOS framework together with Stanford's MoMu (Mobile Music) Toolkit for some functionality, especially OSC. The app is available for purchase on the App Store and includes five separate interfaces (button, drawing, mic, accelerometer, compass) and one settings interface. The main display app uses Processing and receives performers' OSC messages to visualize the data. This main application sends conduction messages to performers' iPhones via OSC, such as a 'Start' message, buttons to press, lines to draw, a direction to tilt, a level to blow, and a compass direction. Sounds are generated from the main computer running the main application, using Ableton Live via MIDI messages. KAMPO was the thesis (PDF) project of Sihwa Park as well as an AIM (Audio & Interactive Multimedia) Lab project. The team is currently preparing several performances using not only this app but also other applications. KAMPO Homepage. Director: Woon Seung Yeo, Co-directors: Sihwa Park, SongHee Jung Performers: AIM Lab […]
- Orbitone [vvvv, Sound] Orbitone is an ambient interface for musical interaction by means of tangibles and user motion. It was developed with vvvv, reacTIVision, OpenCV and Ableton Live. Media System Design, Media Arts & Science, University of Applied Sciences Darmstadt, Germany, 2009-2010 (via 7nts on […]
Posted on: 18/07/2010