This is the first in a series of posts documenting the results and how-tos from the most recent Resonate festival, which took place in March this year. We begin the series with “Audio Reactive Mapping with TouchDesigner (Derivative)”, led by Dmitry Napolnov (Sila Sveta) with Barry Threw (Obscura Digital) and Markus Heckmann (Derivative), with support from Greg Hermanovic and Isabelle Rousset (Derivative). Big thanks to Derivative for collecting all the information and sharing it with CAN. For the full post about the event from Derivative’s perspective, see here.
Abstract: Three things seem to surround us all the time and in great abundance: data visualization, audio visualization and mapping. We will show you how to do all three things at once using TouchDesigner. In the course of this workshop we will demonstrate techniques used in TouchDesigner to pull data off the web, visualize the data and enhance your creation with some real-time audio analysis. We’ll also map the whole thing onto a simple physical object. Using partially prepared components, you will learn how the different operator types in TouchDesigner can be used to process and convert data, and to generate 3D elements and 2D textures. You will also see how the software can be used to build tools that are reusable for everyday projects.
The workshop took place in the main hall at Dom Omladine alongside 15 other workshops. Each workshop had about 14 dedicated participants working with the instructors for the duration of the day, while other festival attendees were encouraged to walk around and listen in at the various stations for as long as they liked.
The videos here document the part of the workshop dealing with the audio-reactive video effects and mapping demonstration, which is based around a pre-made TouchDesigner 088 file (Resonate.toe) described below and downloadable here. It is recommended that you download the file, open it and work along with the videos.
The Resonate.toe File and Description
For the workshop, the TouchDesigner team prepared a file in TouchDesigner 088 (Resonate.toe) which contains 3 parts:
1. the Echo Nest component in /resonate/echonest, which sends a music track to echonest.com for analysis and converts the detailed results to animation channels
2. the section which creates the visuals for the mapping
3. CamSchnappr to map the visuals onto the physical object
The Echo Nest Component
Echo Nest describes itself on the website as: “…offer[ing] an incredible array of music data and services for developers to build amazing apps and experiences.” (Source: http://developer.echonest.com/)
Its developer API accepts music uploads for analysis and returns a detailed description of the track including time signature, key, tempo, timbre, pitch and sequenced information on beats, bars, sections and much more. The documentation of the analyzer can be found here: http://developer.echonest.com/docs/v4/_static/AnalyzeDocumentation.pdf
To start using Echo Nest, developers have to acquire an API key: https://developer.echonest.com/account/register
The Echo Nest component comes with a simple user interface where the API key has to be entered and a URL to a music track has to be specified.
TouchDesigner will download the music file into temporary storage and start playback (this might take a while depending on the size).
Clicking the Fetch button on the Echo Nest component user interface will start the process of analyzing the track.
After a short while the analysis should be done and all tables in the Echo Nest component should be filled.
This is a multi-stage process:
- First Echo Nest is told where the track is located via its Track API Method (http://developer.echonest.com/docs/v4/track.html#upload). Inside TouchDesigner this is done via a Web DAT (/resonate/echonest/webUpload), using the POST Method to Submit and Fetch the information to and from Echo Nest. The Web DAT is fed with the API key and the music file’s URL.
- TouchDesigner eventually receives a Track ID back from Echo Nest as part of a JSON formatted response also containing meta-information like Artist, Title and more. The Track ID is used in the next stage to receive a detailed audio analysis.
- Using the Track Profile API Method (http://developer.echonest.com/docs/v4/track.html#profile) with the Web DAT’s Fetch Method (/resonate/echonest/profile), Echo Nest returns info on a track given its ID. Besides links to previews, images and more artist information, the response contains a URL to a complete analysis file. TouchDesigner parses the returned JSON for this URL and uses it in yet another Web DAT (/resonate/echonest/audio_summary) to fetch the complete analysis.
- This audio_summary is a JSON-formatted package containing sequenced information on beats, bars, sections and more. It is parsed in TouchDesigner with a Python script (/resonate/echonest/decode) and its content is passed on to a collection of Table DATs.
- The Table DATs are fed into an Animation Component and, via a Script DAT (/resonate/echonest/animation1/script1), converted into animation channels and keyframes.
- The output from the Animation Component can now be used in the synth as an animation channel.
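The JSON handling in the stages above can be sketched in plain Python. The response below is a hypothetical, heavily abbreviated stand-in for what Echo Nest returns (the track ID, artist and URLs are invented), following the structure described in its documentation:

```python
import json

# Hypothetical, heavily abbreviated Echo Nest-style "profile" response;
# the real one carries much more meta-information.
profile_response = json.loads("""
{
  "response": {
    "track": {
      "id": "TRXXXXXXXXXXXXXXXX",
      "artist": "Example Artist",
      "audio_summary": {"analysis_url": "http://example.com/analysis.json"}
    }
  }
}
""")

# Pull the Track ID and the URL of the full analysis out of the profile.
track = profile_response["response"]["track"]
track_id = track["id"]
analysis_url = track["audio_summary"]["analysis_url"]

# The fetched analysis contains sequenced events (beats, bars, sections...);
# flatten one list into rows, as the decode script does for its Table DATs.
analysis = {"beats": [{"start": 0.50, "duration": 0.47, "confidence": 0.9},
                      {"start": 0.97, "duration": 0.48, "confidence": 0.8}]}
beat_rows = [[b["start"], b["duration"], b["confidence"]]
             for b in analysis["beats"]]
print(track_id, analysis_url, beat_rows)
```

In the .toe file this parsing happens in the decode script; the rows then land in Table DATs for the Animation Component to turn into keyframes.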
Echo Nest has a lot more to offer with regard to song analysis and track recognition; the documentation at http://developer.echonest.com/docs/v4/index.html gives a good overview of what else is possible.
The Geometry and Animation
The sample installation works with a fairly simple setup of 7 cardboard boxes, whose geometry is generated with two techniques:
- Instancing, with the channels for the instances created in the instanceData Base Component (/resonate/instanceData) and
- Using the Copy SOP to create the object.
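As a rough illustration of the first technique (the layout and values here are invented, not taken from the instanceData component), per-instance transform data is simply a set of channels with one sample per copy:

```python
NUM_BOXES = 7  # the installation uses 7 cardboard boxes

# One sample per instance; in TouchDesigner these would be CHOP channels
# (tx, ty, rz ...) referenced by the Geometry COMP's instancing parameters.
tx = [i * 1.5 for i in range(NUM_BOXES)]           # spread the boxes along x
ty = [0.0] * NUM_BOXES                             # all on the floor
rz = [(i * 30.0) % 360 for i in range(NUM_BOXES)]  # arbitrary per-box spin

instances = list(zip(tx, ty, rz))
print(instances[0])  # first box sits at the origin, unrotated
```

The GPU then draws the same box geometry once per sample, applying each instance's transform, which is why instancing stays cheap even at high copy counts.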
A series of Render Pass TOPs is used to render the different visualizations. Besides showing various techniques for applying textures to geometry, the network also demonstrates how shadows are created with the Geometry components shadows and shadows1 and the shadowLight and littleHelper Light components.
The soundAnalysis component (/resonate/soundAnalysis) explores one way to convert the audio waveform, via the Spectrum CHOP, into meaningful animation data.
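The idea behind that conversion can be sketched outside TouchDesigner: take a magnitude spectrum of the waveform (the Spectrum CHOP computes this far more efficiently with an FFT) and reduce a band of bins to a single animation value. The naive DFT below is only for illustration:

```python
import math

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum - only to illustrate what the
    Spectrum CHOP computes far more efficiently with an FFT."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im) / n)
    return mags

# A 64-sample test waveform: a strong low "bass" tone plus a weaker high tone.
n = 64
wave = [math.sin(2 * math.pi * 2 * i / n) + 0.3 * math.sin(2 * math.pi * 20 * i / n)
        for i in range(n)]

mags = dft_magnitudes(wave)
bass = sum(mags[1:5])      # energy of the lowest bins -> one animation value
treble = sum(mags[16:24])  # a higher band for comparison
print(bass > treble)       # the bass band dominates this signal
```

A value like `bass` here, smoothed over time, is the kind of channel used to drive geometry and texture parameters in the synth.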
Eventually CamSchnappr is used to map the output onto the physical object.
Get the latest version and instructions on how to use CamSchnappr on the Forum.
Resonate 2013 Workshop Videos
Note that these videos have been only minimally edited and that videos 2, 3 and 4, which were taken from a second camera at lower resolution, are posted here as-is. Editing would have required that they be output again, and they did not want to incur any further loss of resolution. While these three videos are lower-res than the rest of the program, the content is entirely discernible – and highly instructive!
Watch the videos full screen and in HD wherever possible and turn up the volume!
TouchDesigner Workshop Resonate 2013 | Video 1 of 9 | Introductions
This short video introduces the workshop crew and their fine participants, who tell us who they are, where they come from, what they do and hope to do, and what their experience with TouchDesigner has been to date – which, for all of us, was very interesting to hear!
TouchDesigner Workshop Resonate 2013 | Video 2 of 9 with Markus Heckmann
Markus’ overview of the project: Markus introduces the main networks. There’s a sound input file and the Echo Nest component, fed a URL. Upload a sound file and they analyze it in detail. Coming out of Echo Nest is a large file – bars, beats, sections etc. Gives pitch information. JSON format – a dictionary or an array. Parsing gives us tables of numbers – our data. The data is converted into keyframe animation on a long timeline. Track info as you play back the track. Pitch, loudness info etc. used to drive certain things. Channels are used to animate textures and geometry. Basic geometry – a box. Various techniques to render the box out. Shows instancing – pass it positional data: where to go, how to rotate. Low CPU usage – lots of instancing minimally affects render speed. Will show techniques to render textures differently: wireframe, normals, shadows. It all comes together, controlled by the output from Echo Nest via the sections, then is mapped with CamSchnappr, which calculates the projector position. You need an accurate model of what you are projecting onto. (Video courtesy of Daniel Georges)
TouchDesigner Workshop Resonate 2013 | Video 3 of 9 with Dmitry Napolnov
Dmitry’s overview of the project: a closer look at the network, which is about sound and data visualization and mapping the results. Nodes (Operators) in networks can import, generate or process and output data. They handle external communication, data generation and data processing, and can be classified by these types of operations. An audio file brought into TouchDesigner generates data that can be processed and output as a sound visualization. Echo Nest takes a sound file and returns a data file of sound analysis. We end up with a set of channels providing info about the song: bars, beats, segments, frequencies, pitches etc. Audio Stream In – plays the track itself. Each CHOP has a number of samples. Use a Sample CHOP to analyze and change the number of samples. Analyze CHOP. Wave CHOP. Analyze the bass. We have 2 data sources: Markus’ plugin and data processed by TouchDesigner from the audio file. Make the geometry first: a simple box generated with TouchDesigner, processed with a Transform node. Make more boxes using the Copy SOP. CPU vs GPU. Translate > change rotation. A Merge node to see the final geometry. The value ladder in parameters – to increase or decrease a value. Instancing: to make a lot of geometry, e.g. 1000 copies, via GPU-driven processing. 088 displays instancing right in the node viewers vs. needing a Render TOP. (Video courtesy of Daniel Georges)
TouchDesigner Workshop Resonate 2013 | Video 4 of 9 with Dmitry Napolnov
Instancing geometry: generating patterns/data/noise/waves as instancing data. Instancing: set the Instance Count parameter to the number of copies we want. It can be set manually or use a CHOP's length (number of samples) to generate the value – 1 copy per sample, each copy using 1 sample. This is how instances are generated. With instancing, every copy is the exact same geometry as all other copies. When you make copies instead, each copy can be unique – change primitives or points etc. Copies are generated on the CPU (slow); instancing is on the GPU – fast, economical. Changing primitives. Grouping primitives. Click “?” on any node to find out more about it in the Derivative Wiki. The Primitive node operates on primitives, the Point node on points. Generating visuals from downloaded data: visual changes on each beat. Generating a texture from data and applying the texture to geometry. GLSL shader. The Texture operator applies a texture to coordinates. Generating data from sound, e.g. in a live VJ setup with a DJ. ASIO for real-time accuracy. Samples become channels to generate data. (Video courtesy of Daniel Georges)
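The one-copy-per-sample idea from these notes fits in a few lines (the channel values here are invented):

```python
# A CHOP channel's sample count can drive the Instance Count:
channel = [0.1, 0.4, 0.9, 0.3, 0.7]  # 5 samples (invented values)
instance_count = len(channel)        # -> 5 copies, one per sample
scales = [1.0 + s for s in channel]  # each copy reads "its own" sample
print(instance_count, scales)
```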
TouchDesigner Workshop Resonate 2013 | Video 5 of 9 with Dmitry Napolnov
Processing samples with CHOPs: deleting unnecessary channels. Controlling the behavior of each channel. Processing channels to get a smooth effect. A Trigger node controls the decay process for smooth (non-strobing) movement. Lag. Channels become samples. A 290-sample CHOP but only 5 objects to animate – how to make fewer samples: compress to 42 samples. It looks different, as the data is not easily compressed. Interpolation: cubic, linear, none – working with in-between values. Interpolated data varies between 0 and 1 – fractions – so round the values off to 0 or 1.
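A sketch of that compression step, assuming a simple linear-interpolation resampler rather than the CHOP's actual implementation:

```python
def resample_linear(samples, new_len):
    """Shrink or stretch a channel with linear interpolation -
    a rough stand-in for what a resampling CHOP does."""
    n = len(samples)
    out = []
    for j in range(new_len):
        pos = j * (n - 1) / (new_len - 1)  # position in the source channel
        i = int(pos)
        frac = pos - i
        nxt = samples[min(i + 1, n - 1)]
        out.append(samples[i] * (1 - frac) + nxt * frac)
    return out

src = [float(i % 2) for i in range(290)]  # alternating 0/1 "beat" channel
small = resample_linear(src, 42)          # 290 samples compressed to 42

# Interpolation leaves in-between fractions; round them back to 0 or 1
# when the channel is meant to act as a binary trigger.
binary = [round(v) for v in small]
print(len(small), set(binary) <= {0, 1})
```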
TouchDesigner Workshop Resonate 2013 | Video 6 of 9 with Dmitry Napolnov
Connecting audio to effects on the cubes: we analyzed raw data to get a curve, while Markus worked with already-prepared data (Echo Nest). Comparing the curves, they are slightly different – raw sound is less predictable, of course. Processed data cannot be processed every frame = less accurate or ‘full’. The Cross node blends between CHOP inputs; blend multiple inputs. UI elements in the TouchDesigner Palette. How to animate 42 primitives (7 boxes with 6 sides each). Expressions and functions. Animation based on primitives or points. Assigning samples to primitives.
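The Cross-node blend described here is a plain linear mix; a minimal sketch, with made-up channel values:

```python
def cross(a, b, blend):
    """Linear blend of two equal-length channels, like a Cross CHOP:
    blend=0 gives all of a, blend=1 all of b."""
    return [x * (1 - blend) + y * blend for x, y in zip(a, b)]

raw = [0.0, 0.8, 0.2, 0.9]       # live spectrum analysis (jittery) - invented
prepared = [0.0, 1.0, 0.0, 1.0]  # Echo Nest beat channel (clean) - invented

mixed = cross(raw, prepared, 0.5)  # halfway between the two sources
print(mixed)
```

Animating the blend parameter lets you fade between the live analysis and the pre-computed Echo Nest channels during a performance.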
TouchDesigner Workshop Resonate 2013 | Video 7 of 9 with Dmitry Napolnov
Image mixing and compositing effects: render geometry as wireframe. Edge Blend. Mixing and controlling the effects we’ve created. A pipeline for rendering multiple TOPs using the Render Pass TOP, saving GPU memory. Composite TOP – multiple effects in one node (a multi-input TOP that performs a composite operation for each input). Glow effect with Blur. Trails using the Feedback TOP (it has history “inside”). Projecting how-to. 3 modes of working in TouchDesigner: design mode, mixed mode, perform mode. The Performance Monitor window analyzes CPU activity within one frame. A Container is used for projection. Sizing (e.g. 1920 × 1080). Output the Container to the projector via the Window Component.
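The trail effect from the Feedback TOP can be sketched per pixel as combining each new frame with a decayed copy of the previous output (one common recipe, not the TOP's only mode):

```python
def feedback_step(current, previous, decay=0.9):
    """One frame of a Feedback-style trail: keep whichever is brighter,
    the new input or a faded copy of the previous output."""
    return [max(c, p * decay) for c, p in zip(current, previous)]

# A single bright pixel that goes dark again; its trail fades out.
frames = [[1.0], [0.0], [0.0], [0.0]]
out = [0.0]
history = []
for frame in frames:
    out = feedback_step(frame, out)
    history.append(out[0])
print(history)  # brightness decays: 1.0, then ~0.9, ~0.81, ~0.73
```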
TouchDesigner Workshop Resonate 2013 | Video 8 of 9 with Dmitry Napolnov
Setting up for performance: why Containers. Multiple projectors with different resolutions. Containers can also have UIs – be active. Using the Window Placement component. Customizing for performance. Preparing 4 Containers. Edge blending for multiple projectors. Stoner. Edge Blend.
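Edge blending between overlapping projectors comes down to a gamma-corrected alpha ramp across the overlap region; a minimal sketch of the idea (not the actual math used by the Edge Blend or Stoner components):

```python
def edge_blend_ramp(width, gamma=2.2):
    """Alpha ramp across a projector overlap of `width` pixels. The gamma
    correction keeps the summed brightness of the two projectors
    perceptually even through the overlap region."""
    return [(i / (width - 1)) ** (1 / gamma) for i in range(width)]

ramp = edge_blend_ramp(8)
print(ramp[0], ramp[-1])  # 0.0 at the outer edge, 1.0 fully inside
```

The second projector applies the same ramp mirrored, so the two attenuated images sum to full brightness across the seam.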
TouchDesigner Workshop Resonate 2013 | Video 9 of 9 with Dmitry Napolnov
Setting up for mapping: perspective from the camera. UV. Approaches to mapping. Deforming geometry (the old-fashioned, time-consuming way). Calculating a camera to exactly match the projector. The CamSchnappr alternative (faster, new). 3D Render Picking. Play a LOT! Learn from the Wiki, Forum and videos, but personal exploration is most important. Let’s see the results!
Big thanks to the TouchDesigner team for taking time away from their busy schedules to host this workshop at Resonate. For more information on TouchDesigner, see derivative.ca