Bla Bla Bla is a sound-reactive application for iPhone and iPad created by students at the design department of IUAV in Venice and later at ISIA in Urbino. The app is a small selection from an exercise called “Parametric Mask”, whose aim is to introduce students to the “Procedures of Basic Design” and the basics of programming.
The new exercises introduce a programming language as a problem-solving tool. This is not a programming workshop, but a way of using numbers, math and logic to show objectively how a design problem gets solved. The students are in their first year of a design school and have never programmed before. The programming language used in the workshop is Processing; the iOS port was written in openFrameworks.
You can also download the Mac desktop app, including the Processing source code, here.
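The core mechanic of the exercise, a drawn mask whose parameters respond to live sound, can be sketched in a few lines of Processing using the bundled Minim library. This is a minimal illustration under invented assumptions (the shape, vertex count and amplitude mapping are mine), not the app's actual code:

import ddf.minim.*;

Minim minim;
AudioInput in;

void setup() {
  size(400, 400);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 512);  // live microphone input
}

void draw() {
  background(255);
  float level = in.mix.level();  // RMS amplitude of the current buffer, roughly 0..1
  translate(width / 2, height / 2);
  fill(0);
  noStroke();
  // a 12-sided "mask" whose alternate vertices push outwards as the sound gets louder
  beginShape();
  int sides = 12;
  for (int i = 0; i < sides; i++) {
    float a = TWO_PI * i / sides;
    float r = (i % 2 == 0) ? 80 : 80 + 200 * level;
    vertex(cos(a) * r, sin(a) * r);
  }
  endShape(CLOSE);
}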
Developer: Lorenzo Bravi
- Visual Music Collaborative [Events] Led by Aaron Meyers, in collaboration with Re:Group artist Aaron Koblin and in creative partnership with Ghostly International, Visual Music Collaborative is a summer school masterclass hosted by Eyebeam this July in NYC. Invited participants will explore the relationship between music, sound, and dynamically generated imagery and motion. Topics will include sound-analysis techniques, advanced OpenGL programming, and interfacing with mobile control devices. Guest speakers and musicians will provide additional insight. The masterclass culminates in an event where participants perform using work created during the week. Participation in this program is via a competitive application process only. Applicants should be at least at the graduate level of study, or have an emerging creative practice, and have established experience with openFrameworks, Processing, or an equivalent programming tool. Content created during this workshop will be released under a Creative Commons licence and may be promoted by Ghostly International. Qualified applicants can apply here. Applications are due May 21, and participants will be notified by May 28. More info on the event […]
- Visual Music Collaborative [Events] – Results This is a collection of work produced at the recent Visual Music Collaborative workshop hosted by Eyebeam this July in NYC. The event was led by Aaron Meyers, in collaboration with Re:Group artist Aaron Koblin and in creative partnership with Ghostly International. Invited participants were asked to explore the relationship between music, sound, and dynamically generated imagery and motion. Topics included sound-analysis techniques, advanced OpenGL programming, and interfacing with mobile control devices. The selection below is only a small sample of the work done at the workshop. As more pieces appear online we'll add to the list, although I am sure, as I write this, the organisers are working on collecting them all. For the full list of participants see here. For more information on this and future workshops see visualmusic.tumblr.com + Eyebeam + Wiki (photo above: Eyebeam's Flickr stream).

the Illusionist from Lars Berg on Vimeo. Music: Shigeto, "the illusionist". Made with openFrameworks.

BETA: Visuals (x) CWCIII - Telefon Tel Aviv / Richard Devine from /// *** this.riley *** \\\ on Vimeo. openFrameworks live audio analysis and visual generation, OSC in/out (to be controlled, or to control other devices), 2D geometry + GL shaders, semi-automatic control via analysis or manual control from the performer's input. Special track CWCIII - Telefon Tel Aviv / Richard Devine, for Charlie Cooper, on Ghostly International.

Look at all the Smiling Faces — Shigeto from jonobr1 on Vimeo. "This video is a recording of my visual performance to Shigeto's new track, Look at all the Smiling Faces, on July 23, 2010. This is the culmination of the 5-day Master Class Workshop at Eyebeam (http://eyebeam.org/events/summer-school-masterclass-visual-music-collaborative) in New York. While listening to the track — in total 299 times this week — I imagined some kind of microscopic underwater scene. I wanted to merge this idea with my desire and passion to perform. With the guidance of the instructors, Aaron Meyers and Aaron Koblin, I decided to interface the application with my Xbox 360 controller." Built with Processing.

Visual Music Collaborative - Sieve test w/ Dabrye + Dog Eating Ice Cream from Will Calcutt on Vimeo.

School of Seven Bells - Windstorm (Improv) from blair neal on Vimeo. "This is a visual improv I recorded to the SVIIB song 'Windstorm' off of their new album Disconnect from Desire. Unfortunately, my 4-year-old computer is not able to chew through video and do screen grabs at a very high framerate, so it's incredibly choppy; please don't take it as anything polished." Built in Max/MSP and Jitter.

Eyebeam & Ghostly International - Visual Music Workshop from George Michael Brower on Vimeo. Christopher Willits: Colours Shifting (Ghostly International). Performed live for this recording, using data gathered from the Echo Nest API. Made in Processing.

The Old Man and the Sea from Evan Boehm on Vimeo. "My week's output for the Visualist Master Class at Eyebeam NY, July 19th-23rd. This piece was written in C++ and is fully interactive. The basis of the course was to create a visual accompaniment to a song off the Ghostly International label. Choosing The Sight Below's 'Simmer', I decided to recreate Hemingway's The Old Man and the Sea as a series of 3D dioramas. In the novella, the protagonist has lost his pride and respect within the community because he has not caught a fish for 86 days. On the 87th day, a giant Marlin appears, which he battles to catch.
After a long struggle, during which he refers to the Marlin as 'brother', he finally subdues the fish and brings him aboard. On the way back to land, the fisherman has to fight off numerous sharks attracted by the trail of blood in the water. Eventually, the fisherman returns to land with just the massive bones left. The ordeal can be read as the man's fight for his sense of self and purpose, with the Marlin as its physical representation. My interest lay in the use of polygon reduction as a storytelling tool. 3D models are made up of a series of points (vertices) that are 'skinned' to create surfaces; the more points, the smoother and more detailed the model. Could reducing or increasing the number of points on a model be used to drive a story forward or create an emotional response? Throughout the piece the character's emotional state is represented by both the polygon count and the distortion of the water. Hopefully a first experiment of […]
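Nearly all of the pieces above share the same starting point: real-time analysis of an audio signal driving generated imagery. A minimal Processing version of that building block, using the bundled Minim library (the window size and the x8 gain are arbitrary choices for this example), looks something like this:

import ddf.minim.*;
import ddf.minim.analysis.*;

Minim minim;
AudioInput in;
FFT fft;

void setup() {
  size(512, 300);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 1024);
  // FFT sized to the input buffer; yields bufferSize/2 + 1 frequency bands
  fft = new FFT(in.bufferSize(), in.sampleRate());
}

void draw() {
  background(0);
  fft.forward(in.mix);  // analyse the current audio buffer
  stroke(255);
  for (int i = 0; i < fft.specSize() && i < width; i++) {
    // one vertical line per frequency band, amplified for visibility
    line(i, height, i, height - fft.getBand(i) * 8);
  }
}

Any of the per-band magnitudes can then be mapped to geometry, colour or camera movement instead of simple lines.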
- OFFF + CAN Workshop Collaborative 2011 [Cinder, oF, Js, Events] Earlier this year we had been thinking about the concept of "curated workshops": an opportunity to bring people together to work for a very short period of time and share their creations. This would involve setting up a team, inviting a few high-profile individuals and opening up submissions for participation. When Héctor Ayuso approached me earlier this year to give a talk at OFFF, instead of talking about CAN I thought this would be a great opportunity to do something more: a workshop, using the workshop material as the content to drive the talk. Héctor and I agreed, and 'Workshop Collaborative' was born. What was the aim of "Workshop Collaborative"? 1. Initiate collaborations between those who share common interests. 2. Create a playing field, both physical and virtual. 3. Allow ideas to evolve by asking questions. When we announced the workshop back in January, we also opened applications for participation. In total, 80 applications were submitted and 11 participants were chosen by the team, which included Aaron Koblin, Ricardo Cabello (mr.doob), myself and Eduard Prats Molner. The participants were: Marek Bereza, Alba G. Corral, Andreas Nicolas Fischer, Martin Fuchs, Roger Pujol Gomez, Marcin Ignac, Rainer Kohlberger, Thomas Mann, Joshua Noble, Roger Pala and Philip Whitfield.
Programme - Single Day
09:00 - 10:00 Introductions / Teams
10:00 - 13:30 Stage 1
13:30 - 14:00 Lunch
14:00 - 19:00 Stage 2 (Completion)
Total creation time: 6.5 hours
A few weeks before the workshop, Aaron and I decided on four themes to influence the work we would be making. By allowing the other participants to comment and give feedback on these themes, we would discover the areas we all wanted to explore. The themes were:
1. Digital Ecosystem - Build an application, an organism of information, sound and visuals: a digital ecosystem that flows through different mediums and evolves. A "living system" travelling through technology and mutating through tools.
2. Analogue Digital - Explore the notions of physicality in code, using made objects as assets for code: cut paper and cut-outs, traditional 2D scans, 3D objects scanned using flatbed scanners, etc.
3. Projection Mapping - Address projection mapping conceptually. Moving away from technical demos, it is time to question what it all means: surface, source, angle, point projection, scale, form, interaction, animation.
4. Data Re-embodied - Tell stories through the juxtaposition of data sources and their methods of representation. How can we create new meaning, understanding and value from the reinterpretation of data?
By no means did this mean we would have to choose one theme over another. The purpose was to get a feel for where the participants' interests lay and to set up, so to speak, a 'playing field' in which first ideas could develop. We knew that, working together for a single day, we would not be able to produce anything of "finished" quality; instead we would focus on the subjects themselves and see what came out. Following the feedback, a number of keywords were derived to summarise our interests: ecosystem, data, scan, evolution, input, mutation, osc, node, rhythm, pattern, touch, physical, language, viewport and mobility. Five projects developed during the 6.5 hours of work: Kinect > WebGL bridge, Kinect Image Evolved, Input Device, Data Flow and Receipt Racer.
-- Kinect > WebGL This project was primarily the work of mr.doob, Marcin and Edu, although others were involved too. The task was to create a bridge between the Kinect and the browser, allowing a real-time feed over the web. Aspirations were higher than the time allowed: instead of using a node.js server (which, as I understand it, was 99% complete anyhow), the team settled for feeding downscaled image data from a Cinder application, via standard HTTP requests, to a three.js script that read the images at about 10 fps. Several rendering styles are presented below. The first is just a simple point cloud done by Marcin for debugging, while the rest was done by mr.doob using his amazing three.js engine. Download the .js code here.
-- Kinect Image Evolved While Ricardo was working on the .js part, Marcin was simultaneously exploring different ways of representing the Kinect image. In an attempt to get away from the standard Kinect point cloud, we developed the idea of applying a slit-scan effect to the point cloud: the cloud is dispersed along a time lapse, with different bands representing different moments in time. Marcin was also exploring what happens when a point's location is reversed once a particular depth is reached. The videos below show both effects. Code available soon. Thomas and Andreas were also testing different tools for manipulating the Kinect image: Meshlab and Blender were used to pull in Kinect point clouds and convert them into meshes, which could then be rendered, distorted, split, etc.
-- Input Device Marcin was also working on ways to control the input, i.e. how one could interact with the Kinect point cloud. We toyed with the idea of assigning different devices, over OSC, to different Kinect body parts. This would allow each individual to be assigned a unique element of the point cloud and to interact with it. The first step was to use simple gyroscope data sent from an iPhone over OSC; the video below shows what is happening. Likewise, Rainer and Roger were working on the iPhone application that would send the OSC data. Rather than just using the gyro or accelerometer, Rainer was exploring different forms of interaction with the device, to see whether a language could evolve, one that would somehow enhance emotional attachment to the Kinect body parts. The videos below show an instrument-like application that also has audio feedback. Code available soon.
-- Data Flow With all this data moving around, Marek wondered what would happen if the input and output were in the same medium, so that you could compare them, apples to apples. Marek looked at the process of the loop by examining the image obtained by subtracting the initial input from the output, leaving just the parts that change. JPEG compression was chosen for the loop because it was easily available in oF and ubiquitous enough to warrant investigation. The boxy images are the result of feeding the JPEG "high" quality compression back into itself and subtracting it from the original; the finer images use the "best" compression setting. Marek then tried the same thing with sound (using Logic), taking the original sound, then the encoded version, and seeing what was left. You can hear all the sounds below. Original / OFFFCAN Workshop Collaborative by filipvisnjic Encoded / OFFFCAN Workshop Collaborative by filipvisnjic Difference / OFFFCAN Workshop Collaborative by filipvisnjic Code available soon.
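Marek's compression-difference experiment is easy to approximate. The sketch below is my reconstruction in Processing rather than his openFrameworks code: it round-trips an image through an in-memory JPEG encode/decode at a chosen quality, then blends the result with the original in DIFFERENCE mode so only the codec's artefacts remain. The filename and the 0.5 quality value are placeholders:

import javax.imageio.*;
import javax.imageio.stream.MemoryCacheImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.*;

PImage original, recompressed, difference;

void setup() {
  size(900, 300);
  original = loadImage("input.jpg");  // any test image in the sketch's data folder
  original.resize(300, 300);
  recompressed = jpegRoundTrip(original, 0.5);  // lower quality = coarser, boxier residue
  difference = original.get();  // copy, then keep only what the codec changed
  difference.blend(recompressed, 0, 0, 300, 300, 0, 0, 300, 300, DIFFERENCE);
}

void draw() {
  image(original, 0, 0);
  image(recompressed, 300, 0);
  image(difference, 600, 0);
}

// Encode a PImage to an in-memory JPEG at the given quality, then decode it back.
PImage jpegRoundTrip(PImage src, float quality) {
  try {
    BufferedImage bimg = new BufferedImage(src.width, src.height, BufferedImage.TYPE_INT_RGB);
    src.loadPixels();
    bimg.setRGB(0, 0, src.width, src.height, src.pixels, 0, src.width);
    ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
    ImageWriteParam param = writer.getDefaultWriteParam();
    param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
    param.setCompressionQuality(quality);
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    writer.setOutput(new MemoryCacheImageOutputStream(bytes));
    writer.write(null, new IIOImage(bimg, null, null), param);
    writer.dispose();
    BufferedImage back = ImageIO.read(new ByteArrayInputStream(bytes.toByteArray()));
    PImage out = createImage(src.width, src.height, RGB);
    out.loadPixels();
    back.getRGB(0, 0, src.width, src.height, out.pixels, 0, src.width);
    out.updatePixels();
    return out;
  } catch (IOException e) {
    println(e);
    return src;
  }
}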
-- Receipt Racer The Receipt Racer combines different input and output devices into a complete game. It was made by Martin, Philip and Joshua using a receipt printer (a common device you can see in every convenience store), a small projector, a Sony PS controller and a Mac running a custom openFrameworks application. Print is a static medium; that's why, Philip, Martin and Josh explain, it was an intriguing challenge to create an interactive game with it. First the team tried to do it with the printer alone as the visual representation, but that seemed rather impossible. Then Joshua Noble came up with a small projector, perfect for projecting a car onto a pre-printed road. There is no game without an input device, so they were lucky that at least one of them always carries a gamepad around. The cables connect back to the laptop running an openFrameworks application written entirely during the workshop. Internally it runs something like a basic JS game: just a car driving on a randomly generated race track (a toy reconstruction of this loop appears at the end of this post). It then broadcasts its components to the external devices, prints the street, and guesses where the car's projection is supposed to be in order to perform the hit test. That's the trickiest part: everything has to be in sync and needs some calibration at the beginning. The paper also has a bit of a mind of its own and tends to slide around or curl, but that's nothing some duct tape and cardboard can't fix. It was a lucky day: somehow everything was just lying around waiting to be used, even the stand and the plastic holder you would normally use for a name card at a conference. Even the timing was perfect: right at the end of the workshop we finished adding details like a little score and the "YOU CRASHED" text. Project Page (code available)
-- On Saturday we presented the creations. Despite the fact that Erik Spiekermann was presenting in the other OFFF room, we had a full theatre (an estimated 500 people), plus another room where our talk could be watched on a large screen. Photo above by Arseny Vesnin. CAN would like to thank all the participants in the workshop, as well as Aaron and Ricardo for taking time out of their busy schedules to take part. For more information on the workshop and all future information/code/links see creativeapplications.net/offf2011. Photos by Jason Vancleave. We leave you with the OFFF Barcelona 2011 Main Titles, made for OFFF by PostPanic (full screen […]
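And the promised toy reconstruction of Receipt Racer's internal game (mine, not the team's code): a randomly generated road scrolls past a steerable car, and a hit test checks whether the car is still on the tarmac. The dimensions are arbitrary, and the mouse stands in for the gamepad:

// one road-centre x position per row of the (looping) track
float[] roadCenter;
float roadHalfWidth = 60;
float carX;
int row = 0;        // how far the track has scrolled
boolean crashed = false;

void setup() {
  size(300, 500);
  roadCenter = new float[height];
  roadCenter[0] = width / 2;
  for (int y = 1; y < height; y++) {
    // random walk, clamped so the road stays on screen
    roadCenter[y] = constrain(roadCenter[y - 1] + random(-3, 3),
                              roadHalfWidth, width - roadHalfWidth);
  }
}

void draw() {
  background(255);
  stroke(0);
  // draw the visible slice of road; rows scroll past like paper feeding from a printer
  for (int y = 0; y < height; y++) {
    float c = roadCenter[(row + y) % roadCenter.length];
    point(c - roadHalfWidth, y);
    point(c + roadHalfWidth, y);
  }
  carX = mouseX;  // steer with the mouse (the original used a gamepad)
  // the hit test: is the car within the road's bounds at its own row?
  float centerAtCar = roadCenter[(row + height - 20) % roadCenter.length];
  crashed = abs(carX - centerAtCar) > roadHalfWidth;
  fill(crashed ? color(255, 0, 0) : color(0));
  rect(carX - 5, height - 30, 10, 20);
  if (!crashed) row++;  // advance the track until the player crashes
}

The real installation has the extra problem the team describes: the printed road and the projected car live in different coordinate systems, so the hit test only works once the two are calibrated and kept in sync.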
- Visual Music Collaborative 2010 [Events] Video of the recent Visual Music Collaborative workshop has just been added by Aaron Meyers to our group on Vimeo. We wrote about it a few days ago with examples of the work produced, but the video below discusses the ideas behind some of the projects and includes guest speakers such as Zach Lieberman (oF) and Peter Kirn (CDM). The event was hosted by Eyebeam this July in NYC and led by Aaron Meyers, in collaboration with Re:Group artist Aaron Koblin and in creative partnership with Ghostly International. For more information on this and future workshops see visualmusic.tumblr.com + Eyebeam […]
- Grid [iPhone, iPad, oF, Processing] GRID is an interactive multi-touch sound visualization for the band Mathon and the ZKM AppArtAward 2011. Created for live events, the application consists of a desktop version for real-time graphic visualization of music, created using Processing, and an iOS version for interacting with the Processing app, created using openFrameworks. The basic appearance is based on a shape that deforms in sync with an audio signal. A never-ending journey through portal-like visuals, organic and technical scenes takes the viewer into a surreal atmosphere. Forming rapidly changing pictures out of those shapes, the viewer seems to be part of electrical impulses, catching short impressions of the human and his role in the universe. Using an iPad or another iOS device, people can directly interact with the music visualization: multi-touch makes it possible to manipulate the camera of the desktop visualization as well as to switch between different scenes. Setup is quite simple as long as your ports are open: the two apps talk to each other over OSC, provided they are on the same network and port 12000 is open. First check the IP of your desktop and fire up the desktop app; then, on your iPhone or iPad, go into the app preferences where all app settings are listed, find GRID Remote and enter the IP address of the desktop. Now launch GRID Remote on your iOS device and you are good to go. The artwork is the result of a cooperation between the interactive arts collective Futura Epsis 1, based in Hamburg, Germany and represented by Andreas Rothaug, and the band Mathon from Switzerland, who are responsible for the sound. Downloads are available for iPhone/iPad (out next week), Mac OS 10.6, Windows and Linux. You can also get the new Mathon album "Terrestre" on iTunes or at www.mathonmusic.ch. Platform: iPhone/iPad Version: 1.0 Cost: $0.99 / Free Developer: Futura Epsis […]
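For anyone curious how such wiring works, the receiving end of this kind of setup is only a few lines in Processing with the oscP5 library. The address patterns and argument layout below are invented for the example; the post only confirms the port, not GRID's actual protocol:

import oscP5.*;
import netP5.*;

OscP5 osc;
float camAngle = 0;
int scene = 0;

void setup() {
  size(640, 480, P3D);
  osc = new OscP5(this, 12000);  // listen on the port the remote targets
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/grid/camera")) {
    camAngle = m.get(0).floatValue();  // e.g. a drag position from a touch
  } else if (m.checkAddrPattern("/grid/scene")) {
    scene = m.get(0).intValue();  // e.g. a tap to switch scenes
  }
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  rotateY(camAngle);
  stroke(255);
  noFill();
  box(100 + scene * 40);  // stand-in for the audio-driven shape
}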
- Yellow Tail [iPhone, iPad, openFrameworks] In the multifaceted world of iOS apps for iPhone, iPad and the iPod Touch, artists are beginning to carve a niche in the crowded landscape of Apple's App Store. One trend amongst software artists is to repurpose work originally written for desktop computers into small, compact versions that run smoothly on these mobile devices. Although some of these artist-designed apps utilize the location-aware and networked capabilities these devices offer, several artists are revisiting the interactive and non-connected aspects of their art. One such piece is "Yellowtail" by Pittsburgh-based media artist Golan Levin. Re-examining "Yellowtail", originally released in 1998 as part of his "Audio-Visual Suite" of applications for the desktop computer, Levin has recreated the piece specifically for the iPhone. From his description: "Yellowtail repeats a user's strokes end-over-end, enabling simultaneous specification of a line's shape and quality of movement. Each line repeats according to its own period, producing an ever-changing and responsive display of lively, worm-like textures." The experience of using Yellowtail is a fluid reminder of what made software-based art such an immersive experience back in the day. Also revisiting his old screen-based work from the 1990s is Scott Snibbe, who has released several apps for the store that have even made it onto Apple's top download list. Maybe, as Golan has said in the past, it's about time for Apple to introduce an "Art" category to the store. Current artist apps are automatically put into the "Entertainment" category, which is true to some degree but hardly does most of these apps justice as a designation. At its core, Yellowtail is an interactive software system for the gestural creation and performance of real-time abstract animation. Using a playful animated transformation of the user's gesture, the software produces an ever-changing and responsive display of lively, worm-like textures. More general information about Yellowtail is available here, and a whole heap of really technical and historical context is here. Yellowtail was originally developed as a skunkworks project at Interval Research Corporation (1998) with support from Michael Naimark, and later at the MIT Media Laboratory (1999-2000) with support from John Maeda's Aesthetics and Computation Group (ACG). The iOS ports were developed in openFrameworks and created with the assistance of Max Hawkins, Lee Byron and Jonathan Brodsky, with enabling support from the OF iPhone crew (Memo Akten, Zach Gage, Theo Watson, Zachary Lieberman, and many others). Yellowtail can be experienced (with source code) in this interactive Java applet, or you can download Yellowtail (full-screen PC .exe, 2000), the sonified version of Yellowtail and the first instrument in the Audiovisual Environment Suite (AVES). [196k zip file, for Windows 2000/XP. Requires a 700MHz+ CPU, an nVidia GeForce or other OpenGL graphics card, and a SoundBlaster-compatible sound card.] Platform: iPhone/iPad (Universal) Version: 1.0 Cost: $0.99 Developer: Golan […]
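The stroke-cycling behaviour Levin describes can be roughly reconstructed in a few lines of Processing. This is my sketch of the idea, not Levin's algorithm: each recorded gesture is replayed by cycling its own segment vectors, so the line crawls along itself end-over-end with a period set by its own length:

ArrayList<ArrayList<PVector>> strokes = new ArrayList<ArrayList<PVector>>();
ArrayList<PVector> current;

void setup() {
  size(600, 600);
  stroke(255);
  noFill();
}

void mousePressed() {
  current = new ArrayList<PVector>();
  strokes.add(current);
}

void mouseDragged() {
  current.add(new PVector(mouseX, mouseY));
}

void draw() {
  background(0);
  for (ArrayList<PVector> s : strokes) {
    int n = s.size();
    if (n < 2) continue;
    // each stroke cycles at one segment per frame, so its period is its own length
    int offset = frameCount % (n - 1);
    float x = s.get(0).x, y = s.get(0).y;
    beginShape();
    vertex(x, y);
    for (int i = 0; i < n - 1; i++) {
      // rebuild the line from its own displacement vectors, rotated by 'offset'
      PVector a = s.get((i + offset) % (n - 1));
      PVector b = s.get((i + offset) % (n - 1) + 1);
      x += b.x - a.x;
      y += b.y - a.y;
      vertex(x, y);
    }
    endShape();
  }
}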
- Tunetrace – iOS app by Ed Burton converts drawings to music Created by Ed Burton, formerly of SodaPlay, and now at Queen Mary, University of London, Tunetrace transforms photographs of drawings into […]
- rgb petri [Processing, iPhone, iPad] Created by Jeremy Awon and based on his Processing sketch from about three years ago, rgb petri allows you to grow colors like bacteria in a petri dish and explore the spectrum of your iPhone or iPad display. Each perimeter pixel copies itself outwards with a slightly mutated color (a sketch of this rule appears after the entry). Tap anywhere to start growth; tap once in a screen corner to bring up the menu; pan, and zoom by pinching; tap twice and hold over a colour to sample a new starting colour; tap the sampling circle to restart with this colour. You can also create a circle that blocks the growth by tapping twice over empty space. Whilst details of the port are limited, I took the liberty of assuming it was made using openFrameworks; it was in fact made with cocos2d. You can also try the Java applet version here: jeremyawon.info/rgbp/ Platform: iPhone/iPad - Universal Version: 1.0 Cost: $0.99 Developer: Jeremy […]
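The growth rule described above is simple enough to guess at. The Processing sketch below is my approximation, not Awon's code: each live pixel copies itself into a random empty neighbour with a slightly mutated colour, so hues drift as the colony spreads outwards:

color[] cells;
boolean[] alive;
int[] off;

void setup() {
  size(300, 300);
  cells = new color[width * height];
  alive = new boolean[width * height];
  off = new int[] { -1, 1, -width, width };  // 4-connected neighbours
}

void mousePressed() {
  // tap anywhere to start a colony with a random colour
  int i = constrain(mouseY, 1, height - 2) * width + constrain(mouseX, 1, width - 2);
  alive[i] = true;
  cells[i] = color(random(255), random(255), random(255));
}

void draw() {
  // growth pass: every live (interior) pixel tries to copy into one neighbour
  for (int y = 1; y < height - 1; y++) {
    for (int x = 1; x < width - 1; x++) {
      int i = y * width + x;
      if (!alive[i]) continue;
      int j = i + off[int(random(4))];
      if (!alive[j]) {
        alive[j] = true;
        cells[j] = mutate(cells[i]);
      }
    }
  }
  loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    pixels[i] = alive[i] ? cells[i] : color(255);
  }
  updatePixels();
}

color mutate(color c) {
  // nudge each channel slightly, clamped to the displayable range
  return color(constrain(red(c) + random(-6, 6), 0, 255),
               constrain(green(c) + random(-6, 6), 0, 255),
               constrain(blue(c) + random(-6, 6), 0, 255));
}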
Posted on: 26/04/2011
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- Climate Service Data Visualiser at FutureEverything
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific
- Data Engineer at Seed Scientific
- Design Technologist at Seed Scientific
- Creative Technologist, The ZOO at Google
- Web Designer and Developer at the School of Visual Arts
- Creative Front-end Developer at DelighteX GmbH