
NodeBeat [iPhone, iPad, oF, Sound]


NodeBeat is a generative music app for the iPhone and iPad in the style of apps like Bloom. In it, gently drifting nodes interact with pulsing triggers, producing minimal soundscapes. It offers several options to tailor the parameters of the music generation. The overall experience is very soothing and meditative; background gradients swirl through prismatic color changes.

Below is a mini-interview with the authors of the app, Seth Sandler and Justin Windle:

What is the inspiration for NodeBeat?

I was reading a lot about sync at the time and was interested in experimenting with systems that would be free to form their own patterns, with the possibility of spontaneous and unexpected ‘syncs’ emerging. Setting up rules to describe quite loosely coupled processes and then observing whether some kind of order emerges from them is something I find fascinating. Aesthetically, I wanted the system I programmed to feel organic, so I chose to work around the analogy of neurons and synapses firing – something talked about a lot in sync theory. These concepts concern themselves a lot with rhythm, repetition and cycles, so using audio seemed like a natural step, which is what gave rise to the idea of a sequencer of sorts.

I had recently been introduced to the Tonfall AS3 library, written by André Michelle, and was keen to experiment with audio synthesis in Flash, so I built the initial experiment on top of it, as the library had a pretty small learning curve. The user interactions, and the ability to tweak parameters at runtime to produce different types of patterns, were things I added to give a layer of feedback to the experiment and push it more in the direction of an instrument that can, to an extent, be played rather than just observed.

I’ve always been inspired by the simplicity of iOS applications like Bloom and how something simple can appeal to many and create a variety of musical interactions. When I saw Justin’s sequencer, I thought it was a great example of something that was simple, yet created an interesting, visual and fun experience. Since I typically focus on musical applications that aren’t targeted at musicians, I thought NodeBeat was a great fit.

What were your goals?

The goal was really no more ambitious than simply producing something interesting to observe and interact with. It began as a study and not a product. I personally find that goals are very hard to define during the experimental process, as by definition you want to remove any finite end-point and instead explore the space you’re in at that time; feeling free to branch, destroy or enhance whatever you’re working with. It wasn’t until I posted the first experiment and realised that other people seemed interested by it that it became clear it could be refined into something more usable. This is why I was so excited when Seth got in touch about developing it for devices because it was clearly something which suited that format perfectly.

For NodeBeat, the iOS implementation, the main goal is really just to provide a fun musical experience. While we had a variety of additional ideas to implement, we really wanted to keep the feature set simple and easy to understand for the first release. The hope is that anyone, musical or not, is able to experiment and create melodies and rhythms without having to learn anything or understand how notes, octaves, or rhythmic subdivisions work.

NodeBeat is developed using openFrameworks and utilizes PureData for audio synthesis. You can download the free desktop version + source code at

Platform: iPhone/iPad
Version: 1.0
Cost: $0.99
Developer: AffinityBlue