
Secret Rhythms – Artificiel’s Three Pieces with Titles

As is always the case, this past summer’s edition of MUTEK Montreal offered a broad survey of audiovisual performance approaches and aesthetics. Quite notably, this included the return of artificiel to the A/Visions program across two projects: Jimmy Lakatos teamed up with the Mexican electronic musician Murcof to present work modelling the music of the cosmos, while Alexandre Burton and Julien Roy (the other two-thirds of the group) presented Three Pieces with Titles, a new piece commissioned to premiere in Montreal – and now its second iteration will be presented at MUTEK.MX this Saturday. In the past artificiel have used a zapping Tesla coil, flashing light bulbs, and the manipulation of a Rubik’s Cube to generate sound and image; here they return to the computer vision-driven setup of that latter project and discover a world of secret rhythms within an eclectic collection of objects.

True to form, Three Pieces With Titles contains three musical movements, each controlled by the manipulation of distinct objects under the watchful gaze of a custom camera rig. In the opening movement, a score of tense violin and alto bow strikes is accented and processed based on the placement of archival photographs. Documenting the birth of the Manhattan Project and related nuclear experiments, the photos depict grinning scientists and foreboding test sites; each one has been unceremoniously ‘tagged’ with a marker for optimal recognition by the computer vision system. Arranged as a triptych, the photos are pulled in and out of the camera’s field of view on an underlying light table. Each subtle movement of an image triggers new sounds and effects and alters the mix. Images are added, removed, swapped out, tilted, tossed aside – the duo’s fumbling hands are part of the show. “The project returns to the idea of using an apparatus live on stage in a manner that is analogous to acoustic instruments,” says Burton of the notable presence of their hands within the work. In an era of immaterial software and figures onstage hunched over laptops, it makes sense. “One of the analogies we want to maintain is the visible readability of the relation between gestures and results.”
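
As a rough illustration of the kind of gesture-to-result mapping at play – not artificiel’s actual pipeline – here is a minimal openFrameworks sketch that forwards a tracked photograph’s position, tilt, and visibility to a sound engine over OSC. The PhotoTrack struct, the /photo address scheme, and port 9000 are assumptions made for the example, not details confirmed by the duo.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// One tracked photograph: which marker it carries, where it sits, how it is tilted.
struct PhotoTrack {
    int id = 0;
    float cx = 0.5f, cy = 0.5f;  // normalised position in the camera frame
    float angle = 0.0f;          // tilt in degrees
    bool visible = false;        // currently in the camera's field of view?
};

// Forward one photograph's state to a sound engine assumed to listen on port 9000.
void sendPhotoState(ofxOscSender& osc, const PhotoTrack& p) {
    ofxOscMessage m;
    m.setAddress("/photo/" + ofToString(p.id));
    m.addFloatArg(p.cx);                 // e.g. which layer of the mix to favour
    m.addFloatArg(p.cy);                 // e.g. an effect-send amount
    m.addFloatArg(p.angle);              // e.g. filter or pitch modulation
    m.addIntArg(p.visible ? 1 : 0);      // pulled out of view = fade that layer
    osc.sendMessage(m);
}

int main() {
    ofxOscSender osc;
    osc.setup("127.0.0.1", 9000);

    PhotoTrack photo;
    photo.id = 3;
    photo.cx = 0.2f;
    photo.cy = 0.7f;
    photo.angle = 12.5f;
    photo.visible = true;
    sendPhotoState(osc, photo);          // every subtle movement re-sends the state
    return 0;
}
```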


The second movement is a musique concrète megamix constructed on an ad hoc step sequencer. Starting simply with left-to-right loops, the duo quickly constructs a palette of disjointed longer and shorter spans, a playhead zooming across the screen and plinking out beats and notes each time it strikes a disc that has been manually placed within the scene. “The current version of the step sequencer slices the view in seven tracks, each can be divided up to sixteen times, and a speed multiplier can be applied to each track’s individual playhead – creating asynchronous polyrhythmic patterns. It’s a binarisation of the image, updated every camera frame so if a slot is mostly black it’s ON and otherwise it’s OFF. The time is controlled by a sound file playing in Ableton Live – essentially a phasor going from zero to one over the period we want to be the ‘base sixteen beats’ … the playhead signal is ‘querying’ the image analysis which triggers back MIDI note messages.”
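
To make those mechanics concrete, here is a minimal C++ sketch of the querying step Burton describes, assuming a binarised grid has already been extracted from the camera frame; the Track struct, the query function, and the printf stand-in for MIDI output are illustrative names rather than artificiel’s own.

```cpp
#include <array>
#include <cstdio>

constexpr int kTracks = 7;     // the camera view is sliced into seven tracks
constexpr int kMaxSlots = 16;  // each track can be divided up to sixteen times

struct Track {
    int divisions = 16;                // how many slots this track actually uses
    float speed = 1.0f;                // per-track playhead speed multiplier
    std::array<bool, kMaxSlots> on{};  // binarised image: slot mostly black = ON
    int lastSlot = -1;                 // last slot this track's playhead visited
};

// Called on every control tick with the master phasor (0..1 over the base sixteen beats).
// When a track's playhead enters a new slot and that slot is ON, a note is emitted.
void query(std::array<Track, kTracks>& tracks, float phasor,
           void (*noteOn)(int track, int slot)) {
    for (int t = 0; t < kTracks; ++t) {
        Track& tr = tracks[t];
        float local = phasor * tr.speed;      // apply the speed multiplier
        local -= static_cast<int>(local);     // wrap back into 0..1
        int slot = static_cast<int>(local * tr.divisions) % tr.divisions;
        if (slot != tr.lastSlot) {
            tr.lastSlot = slot;
            if (tr.on[slot]) noteOn(t, slot); // stands in for a MIDI note message
        }
    }
}

int main() {
    std::array<Track, kTracks> tracks;
    tracks[0].on[0] = tracks[0].on[8] = true;   // a simple two-hit pattern
    tracks[1].divisions = 12;                   // shorter span...
    tracks[1].speed = 1.5f;                     // ...running faster: polyrhythm
    tracks[1].on[3] = true;
    for (int tick = 0; tick < 128; ++tick) {
        float phasor = (tick % 64) / 64.0f;     // fake master phasor
        query(tracks, phasor,
              [](int t, int s) { std::printf("note: track %d slot %d\n", t, s); });
    }
    return 0;
}
```

Because each track’s playhead runs at its own multiple of the master phasor and can use a different number of divisions, patterns of different lengths drift against one another – the asynchronous polyrhythms the quote above describes.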

↑ Cause and effect: a few notes are played on a melodica under the camera and the audiovisual riff is sampled for deconstruction

Again, artificiel’s hands are hard at work, rearranging the spartan rhythms that evoke both vintage Matmos and clonky proto-techno. At other moments traditional instruments are translated into visual music. “Thus far we’ve used the ukulele and the melodica as they are children’s instruments and small enough to fit under the camera.” In both cases the resulting visual music playfully deconstructs associations of what the source instruments look like and sound like. “We also tried bigger instruments but we’re not comfortable ‘performing’ on them – we don’t want to be tourists on the guitar! But ‘tourists’ on the ukulele, under the microscope, is okay. Our real work is what we end up doing with the samples.”

Three Pieces With Titles uses openFrameworks to process camera input and drive video, and video elements are synchronized to trigger phrases and events within a Csound orchestra. The computer vision system measures where the centre and edges of the objects under the camera are located with a high degree of precision; it also notes colour – certain colours can be ignored, and variants of objects with different colours yield distinct musical outcomes (e.g. the discs in the ad hoc step sequencer). It draws on the ofxDecklink, ofxOpenCv, and ofxPostProcessing addons. “Decklink is necessary to import the images and the heavy lifting is done by our own ofxSampler, which is a slave to the actual sampler which is built in Csound – it handles the timing and handling of events, phasors, etc. Csound is controlled via OSC from a Max GUI running in Ableton Live, which maintains the sequencing of formal parameters.”
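
For a sense of what that per-frame analysis might look like, here is a hedged openFrameworks sketch built on ofxOpenCv’s contour finder. It substitutes an ordinary ofVideoGrabber for the ofxDecklink input, omits the colour classification and the Csound/OSC bridge, and uses placeholder resolution, threshold, and blob-size values.

```cpp
#include "ofMain.h"
#include "ofxOpenCv.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;          // stand-in for the Decklink feed
    ofxCvColorImage colorImg;
    ofxCvGrayscaleImage grayImg;
    ofxCvContourFinder contours;

    void setup() override {
        grabber.setup(1280, 720);
        colorImg.allocate(1280, 720);
        grayImg.allocate(1280, 720);
    }

    void update() override {
        grabber.update();
        if (!grabber.isFrameNew()) return;

        colorImg.setFromPixels(grabber.getPixels());
        grayImg = colorImg;           // convert to grayscale
        grayImg.threshold(80, true);  // dark objects on a light table become white blobs
        // keep up to 12 blobs between 500 and 100000 px^2, ignoring holes
        contours.findContours(grayImg, 500, 100000, 12, false);

        for (const ofxCvBlob& blob : contours.blobs) {
            const ofPoint& c = blob.centroid;          // object centre
            const ofRectangle& r = blob.boundingRect;  // object edges
            ofLogNotice("cv") << "centre " << c.x << "," << c.y
                              << " size " << r.width << "x" << r.height;
            // the real system would also classify colour, ignore certain hues,
            // and forward these measurements to the Csound-side sampler
        }
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}
```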

Given the legs artificiel’s performance POWEr had, Three Pieces With Titles will undoubtedly tour for a few years. With its meta jokes about composition and instrumentation it is definitely subtler than some of the hyper-performative gestural works and outright spectacle that take centre stage at festivals these days, and the lengths to which the duo have gone to build a system that is coherent and lends itself to improvisation are clear. Burton shares that the logic driving Three Pieces With Titles’ three pieces thus far has been ‘the image controls the sound,’ ‘the sound is the image becomes data,’ and ‘the gesture makes the sound the image’ – we look forward to seeing/hearing how those mandates evolve in 2018.

artificiel | MUTEK

Photos: Bruno Destombes