Even when judged against its usual high standards, MUTEK 2012 was a stellar year for AV performance. In addition to the A/Visions program, there were a number of other noteworthy shows, screenings and installations that reinforced the prominence of real-time graphics and ‘cinematic ambience’ across the festival. Whether it was Jeff Mills’ figure poised over his 909 against the backdrop of a massive projection of the moon, Robert Henke and Tarik Barri’s audiovisual interpretation of the recent ethereal-but-groovy Monolake LP Ghosts or the immersion and impeccable curation of Recombinant Media Labs’ CineChamber – multimedia collaboration was everywhere. One of the highlights of the festival was undoubtedly the I Dream of Wires modular synthesizer showcase that took place in the Satosphere, a huge dome hardwired for 3D projections that is permanently installed atop the Société des arts technologiques (SAT). While the whole evening was great (Clark’s set felt like partying in a near-future rap video), the set by veteran American producer Keith Fullerton Whitman and visualist duo Zef & Santo was delightfully weird. Playing out as some kind of demented 8-bit hall of mirrors, Zef & Santo’s glitched-out geometric machinations perfectly complemented Whitman’s analog improvisation. I recently caught up with Zef & Santo to learn more about their intricate 3D projection workflow.
What are the challenges in working in 3D versus traditional projection contexts?
Zef: Performing visuals in the Satosphère has its own particular challenges. It’s a rather unique place for visuals, a full 360º x 210º dome surface which completely envelops the audience. A special approach is needed for the visuals because feeding standard video resolutions directly into the dome severely distorts the content. Another challenge is the high resolution the dome requires: a 2240x2240px spherically distorted video feed.
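The spherical distortion Zef describes can be pictured with a little geometry. Below is a rough Python sketch (my own illustration, not the artists' pipeline) of an equidistant fisheye mapping: a direction on the dome, given as azimuth and zenith angles, lands on a pixel of the square 2240x2240 frame, with the angle from the zenith growing linearly into radius.

```python
import math

# Sketch of an equidistant fisheye mapping for a 210-degree dome.
# Assumption: radius in the frame grows linearly with the angle
# measured from the dome's zenith (the "equidistant" projection).

SIZE = 2240                  # square fisheye frame, per the interview
HALF_FOV = math.radians(210) / 2   # dome spans 210 degrees vertically

def dome_to_pixel(azimuth_deg, zenith_deg):
    """azimuth: 0-360 around the dome; zenith: 0 (top) to 105 (rim)."""
    theta = math.radians(zenith_deg)     # angle down from the zenith
    phi = math.radians(azimuth_deg)      # angle around the dome
    r = (theta / HALF_FOV) * (SIZE / 2)  # linear angle-to-radius mapping
    x = SIZE / 2 + r * math.cos(phi)
    y = SIZE / 2 + r * math.sin(phi)
    return x, y

# The zenith lands dead centre; the rim lands on the edge of the frame.
print(dome_to_pixel(0, 0))
print(dome_to_pixel(0, 105))
```

This is why a flat 16:9 feed looks warped on the dome: straight screen rows map to arcs of this polar grid, so content has to be authored (or re-rendered) in this distorted space to appear undistorted to the audience.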
The potential of this permanent installation really shines when the content is created as a 3D environment since the dome can accurately represent this environment to the audience. Using this technique, it is possible for the audience to actually lose perception of the dome’s surface, having it replaced by the perceived effect of being within an alternate 3D virtual space.
One challenge with working in 3D at such high resolutions is that render times can be extremely long; this is where a real-time rendering engine becomes invaluable. Thanks to the fisheye for Unity 3D project it is possible to output the spherical map needed for the dome directly from within the Unity game engine in real time, bypassing the need for any offline rendering.
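For context, the usual trick behind real-time fisheye output (the approach Paul Bourke's project popularized) is to render the scene into several cube-face views and then warp those into the spherical map. The Python sketch below is my own simplification, not the project's actual API: it shows the core lookup of turning a fisheye pixel into a view direction and picking the cube face that direction samples.

```python
import math

# Hypothetical sketch of the cubemap-to-fisheye lookup. The z axis is
# "up" toward the dome's zenith; face names are invented for clarity.

def fisheye_dir(u, v, fov=math.radians(210)):
    """u, v in [-1, 1] across the fisheye frame -> unit view direction."""
    r = math.hypot(u, v)
    if r > 1:
        return None                      # outside the fisheye circle
    theta = r * fov / 2                  # angle away from the zenith
    phi = math.atan2(v, u)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))

def cube_face(d):
    """Pick the cube face a direction samples: the dominant axis wins."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:
        return "up" if z > 0 else "down"
    if ax >= ay:
        return "right" if x > 0 else "left"
    return "front" if y > 0 else "back"

print(cube_face(fisheye_dir(0.0, 0.0)))   # dome zenith samples the up face
print(cube_face(fisheye_dir(0.9, 0.0)))   # near the rim, a side face
```

Since each face is just an ordinary perspective render, the GPU does all of this per frame, which is exactly what lets the duo skip offline rendering entirely.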
Santo: We love the mix between hi-fi and lo-fi; it gives our output a more human feel and the sense that it is being created in the moment rather than in a studio, rendered for three weeks by some huge renderfarm.
In one of the first emails we exchanged you mentioned a “complex chain” of devices and software for digital and analog signal processing. Could you describe your kit and general workflow?
Z: We are two visual artists working on two separate machines in tandem to create the final output. Santo creates live visuals using actual objects and lights captured by a real-world camera and fed into Resolume where he processes the visuals even further. His output is then captured into VDMX on my machine and piped into Unity 3D using Syphon to be used as textures on the objects within the scene. VDMX is also used to parse incoming OSC messages received from a Lemur touch interface enabling full control of key parameters of the 3D environment.
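The OSC side of that chain can be pictured as a simple address-to-handler table. The Python sketch below is hypothetical: the addresses and parameter names are invented for illustration, since the interview doesn't describe the actual Lemur layout. It shows how incoming messages might be routed to parameters of the 3D environment.

```python
# Minimal OSC-style dispatcher sketch (invented addresses/parameters).
# A real setup would receive (address, value) pairs over UDP; here we
# just route them to setters on a stand-in scene-state dictionary.

scene = {"camera_height": 0.0, "object_scale": 1.0}

def set_param(name):
    """Build a handler that writes a value into the scene state."""
    def handler(value):
        scene[name] = value
    return handler

routes = {
    "/lemur/camera/height": set_param("camera_height"),
    "/lemur/object/scale":  set_param("object_scale"),
}

def dispatch(address, value):
    handler = routes.get(address)
    if handler is None:
        return False        # unknown address: silently ignored
    handler(value)
    return True

dispatch("/lemur/camera/height", 0.75)
dispatch("/lemur/object/scale", 2.0)
print(scene)
```

The appeal of this pattern for live performance is that the touch interface stays decoupled from the engine: new controls only require a new route, not a change to the render code.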
S: We like to change the workflow between shows because we find doing the same thing twice extremely boring. As mentioned above, I had a laptop with some sound-reactive content feeding into a small analog TV being filmed by an HD camera. This was captured on a PC and processed in Resolume. The live content was placed into a multichannel content matrix (within Resolume) so that different textures could be mixed with different elements. This matrix of content is sent in 1080p back to Zef to texturize various components (background, main object, secondary objects, etc.).
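That "multichannel content matrix" can be understood as a grid of crossfade weights: each destination texture gets a weighted blend of the live sources. The Python sketch below is my own stand-in (plain numbers replace video frames, and the source/destination names are invented), not Resolume's actual routing model.

```python
# Content-matrix sketch: weights[destination][source] decides how much
# of each live source feeds each destination texture. Frames are scalar
# stand-ins here; in practice they are full video textures.

sources = {"tv_cam": 0.8, "sound_reactive": 0.2}   # stand-in frame values

weights = {
    "background":  {"tv_cam": 1.0, "sound_reactive": 0.0},
    "main_object": {"tv_cam": 0.5, "sound_reactive": 0.5},
}

def mix(dest):
    """Weighted sum of all sources for one destination texture."""
    w = weights[dest]
    return sum(sources[s] * w[s] for s in sources)

print(mix("background"))    # pure TV-camera feed
print(mix("main_object"))   # 50/50 blend of both sources
```

Changing a performance's look then amounts to re-weighting the matrix rather than rebuilding the signal chain, which fits the duo's habit of reworking the setup between shows.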
Z: This is a good moment to express my gratitude for the invaluable work done by Paul Bourke on his Unity 3D fisheye project, Anton Marini (Vade) and Tom Butterworth for developing Syphon, and Brian Chasalow for developing and maintaining the Unity 3D plugin for Syphon.
I’d love to hear you describe your accompaniment of Keith Fullerton Whitman’s music. What was going on conceptually in that collaboration and how do you feel it worked out?
S: Keith Fullerton Whitman’s music was very interesting to perform our kind of visuals to because it is totally improvised and completely analog (no bullshit!) and I think that each show is really different depending on what happens within the space. That’s also how we see our work: emerging from a particular moment. When we started the show we were supposed to be receiving a live video feed from Keith but that never happened. Keith later said “I saw that you guys didn’t need it [the feed].” I think that Keith’s music really inspired us and that we worked together to create an experimental journey into an analog world, both musically and visually.
Z: We wanted to have a prominent analog feel for this particular show given Whitman’s signal flow. Using our camera to capture audio-responsive glitches generated by a cathode ray tube TV, as well as not clearing the depth buffer in Unity, went quite far in giving us some nice feedback effects.
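The trick of skipping the buffer clear can be illustrated conceptually: each new frame is drawn over whatever the previous frame left behind, so instead of vanishing, bright content lingers and fades, producing trails. The Python sketch below is my own toy model (plain numbers stand in for pixels, and the decay factor is an assumed value that would really come from blending a translucent overlay each frame), not Unity's actual buffer behaviour.

```python
# Toy model of frame feedback from an uncleared buffer: the previous
# frame's contents survive, slightly dimmed, under each new frame.

DECAY = 0.9   # assumed fraction of the old frame that survives per frame

def draw_without_clear(buffer, new_frame):
    """Blend the new frame over the uncleared previous contents."""
    return [DECAY * old + new for old, new in zip(buffer, new_frame)]

buffer = [0.0, 0.0, 0.0]
flash  = [1.0, 0.0, 0.0]             # a one-frame flash in the first pixel
buffer = draw_without_clear(buffer, flash)
for _ in range(3):                    # three empty frames afterwards
    buffer = draw_without_clear(buffer, [0.0, 0.0, 0.0])
print(buffer)                         # the flash lingers as a fading trail
```

With a cleared buffer the flash would disappear after one frame; without clearing, it decays geometrically, which is exactly the "hall of mirrors" persistence the duo leaned on.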
See also: Zef & Santo’s visuals for Pole, from an April SAT show.
- Robert Henke ‘Lumière’ – Cutting the room with vectors and lasers As Robert Henke sets off on his tour with the new project Lumière, kicking off in NYC on the 10th of May, we offer a little preview of what is to […]
- Avouching A/Visions at MUTEK 2011 [Events, Sound] [AntiVJ's Simon Geilfus and Murcof at A/Visions 2 / photo: basic_sounds] Having just completed its twelfth run, Montreal's MUTEK festival continues to cultivate the substantial niche it has carved out for itself on the global media arts circuit. In addition to a storied history of showcasing emerging and established electronic musicians of all stripes, MUTEK has also acted as an R&D lab for exploring the possibilities of integrated audiovisual performance. In 2005, a programming stream dedicated to presenting bleeding-edge collaborations between musicians and visual designers entitled A/Visions was brought into the fold to showcase innovative projects like artificiel's cubing and Marc Leclair & Gabriel Coutu-Dumont's 5mm. A/Visions has matured so rapidly that by 2008 this supplementary programming was consistently eclipsing 'big room' headliners and—at least as far as many MUTEK regulars were concerned—functioning as the locus of innovation within the yearly gathering. The 2009 and 2010 AV performance programming upped the ante even more and the expectations for both experimentation and production design were very high going into this year's edition of the proceedings. This post presents an overview of and reflection on material featured at A/Visions two weeks ago in Montreal. Electroacoustic composer Alain Thibault and visual designer Yan Breuleux have been working together as Purform for almost 15 years. For MUTEK the duo presented White Box, a project dedicated to exploring "new forms of generating A/V compositions in real time." As evidenced by the teaser video above, the performance leveraged a massive three-screen projection surface as a canvas for exploring dense monochromatic meshes and emergent moiré patterns. Characterized by coarse granular synthesis and dynamic, clinical pattern studies, the set was undeniably polished – perhaps pristine to a fault.
Compared to the subsequent rumbling bass-scapes of Emptyset and the cataclysmic improvised mayhem that Mika Vainio cooked up in the darkness, White Box offered a glimpse into a stark formalist universe that could only emerge from such a longstanding collaboration. Within about 45 seconds of beginning their performance at the SAT the British duo Sculpture had already confirmed their status as the wildcard artists at MUTEK 2011. Sound artist Dan Hayhurst and animator Reuben Sutherland specialize in crafting dense, plunderphonic soundscapes complemented by live video of custom-made zoetropic picture discs. Their performance married reel-to-reel tomfoolery with turntable-centric digital video that was projected onto a horseshoe-shaped configuration of screens lining the perimeter of the space. This arrangement was intentionally overwhelming and many audience members were visibly dazed by the combination of Hayhurst scrubbing through his tape loop inventory and Sutherland's reconfiguration of the wheels of steel as a psychedelic movie machine. The set was a gloriously orchestrated cacophony – media archeology for the MIDI controller set and a refreshing reminder that a virtuosic back-to-basics approach to animation is capable of trumping any graphics library. Fernando Corona (aka Murcof) and Simon Geilfus of AntiVJ have been collaborating for approximately two years and the duo presented the fruits of their (iterative) labour at the second A/Visions event. The embed above really does not do this work justice and the creative partnership essentially 'builds a universe' around Murcof's brooding, orchestral LP Cosmos and some more recent material. AntiVJ's Joanie Lemercier and Nicolas Boritch describe the work as "being rooted in a 2009 residency in Bristol" where the artists had the time to build an "emergent" performance workflow "from the ground up".
Riffing on the geometries and organizational logic of cosmology, biological systems and the scattershot luminosity of a dense weave of light rays, the set was captivating and deservedly received a thunderous response. It should also be noted that AntiVJ developed a thoughtful solution to the perennial "where do we put the performers?" problem by projecting the video on a semi-transparent mesh scrim that hung in front of Corona and Geilfus, downplaying their visual presence and also creating the illusion that the animation is floating in space rather than dancing across a "standard white screen". One particularly riveting sequence played out as if the audience were drifting through a 3D field of detritus that pulsated in sync with Corona's drones; the shading of this 'space junk' was incredible and justifies comparisons to some of Lebbeus Woods' wilder moments. A veteran of the inaugural MUTEK lineup, Seth Horvitz is amongst a handful of artists including Atom Heart and Carsten Nicolai whose experimental practices have remained important reference points in the evolution of the festival over the last decade. Horvitz recently completed his MFA at Mills College and essentially presented his thesis research, Eight Studies for Automatic Piano – intensely programmed scores for the Yamaha DC7 Mark III Disklavier. In perhaps the best moment of theatre of the entire festival, a besuited Horvitz began his performance by strolling across stage to turn on his piano and then disappeared into the shadows, not to be seen again. The work presents a kind of piano endgame that tested the perception of the audience while—given the scope of MUTEK—offering a timely examination of precision and 'programming'. An excerpt from the 'listener's guide' (PDF) for the LP release of the project on LINE, in which Horvitz discusses the notion of hand-made algorithms: "I have been told that my music is algorithmic, although I don’t really think of it that way.
I don’t use any math other than simple addition, subtraction, multiplication and division. I copy something (often a repeating figure), paste it next to itself, and then change it a little. Then I do it again and again, changing it by the same amount each time, and listening all the while. If it doesn’t sound good, I might start over. Or I might copy half of all the copies and put them somewhere else, change that a little, then repeat the process again and again… I avoid using equations, because I never want the music to get too far away from my ear." The above video demonstrates how the dense melodies literally wash over the keyboard while the projections offer a rudimentary visualization of the complexity of these pattern studies. Eight Studies for Automatic Piano was a treat to experience live, and there was something quite amazing about watching a precisely calibrated automaton work its magic in a concert hall setting. The above smattering of teaser videos clearly doesn't do A/Visions 2011 justice but these taste tests certainly verify the innovation and diversity of the work programmed this year. For the sake of brevity this review did not touch on Tristan Perich's surprisingly moving rendition of 1-Bit Symphony, a severe prop-driven performance piece by Women With Kitchen Appliances and an atmospheric meditation on macro photography by Comaduster – these projects are all worth looking into. Stepping back from A/Visions and considering the larger events at MUTEK, it is clear that the interplay of sound and image is becoming increasingly important to the direction of the festival; this year the spotlight shone on Richie Hawtin's LED cage (produced by Ali Demirel and the wizards at Derivative) and Amon Tobin showed up for his gig at Metropolis at the helm of a cubist megalith (it was hardly the Mothership, but I suppose it would do in a pinch). 
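Although Horvitz works by ear rather than with code, the copy-paste-vary process he describes maps cleanly onto a few lines of pseudocode. The sketch below is purely illustrative (it is not Horvitz's actual method or software, and the function name and note values are invented for the example): start with a short figure, then repeatedly append a copy transposed by the same fixed amount.

```python
# A minimal, hypothetical sketch of the copy-paste-vary process
# Horvitz describes: take a repeating figure, "paste it next to
# itself, and then change it a little", the same change each time.

def copy_and_vary(figure, repetitions, shift):
    """Build a pattern by appending transposed copies of a figure.

    figure:      list of MIDI note numbers (the repeating figure)
    repetitions: how many copies to paste after the original
    shift:       semitones added to each successive copy
    """
    pattern = list(figure)
    for i in range(1, repetitions + 1):
        # each pasted copy drifts a fixed amount further from the seed
        pattern.extend(note + i * shift for note in figure)
    return pattern

seed = [60, 64, 67]  # C major arpeggio as an illustrative seed figure
study = copy_and_vary(seed, repetitions=3, shift=2)
print(study)  # each copy sits two semitones above the last
```

The point of the sketch is the absence of equations: the only operations are copying and adding a constant, which matches Horvitz's insistence on never letting the music "get too far away from my ear".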
I greatly prefer the focus and discipline of the work I've described above, but one can't help but note that audience expectations and visual literacy are evolving rapidly. While my mind is still buzzing from this abundance of stimuli, I'm already starting to catch myself wondering what next year will yield. -- About the Author: Greg J. Smith is a Toronto-based designer and researcher with interests in media theory and digital culture. Extending from a background in architecture, his research considers how contemporary information paradigms affect representational and spatial systems. Greg is a designer at Mission Specialist, blogs at Serial Consign and is a managing editor of the digital arts publication Vague Terrain. He currently teaches courses on information visualization, technology and urbanism in the CCIT program (University of Toronto – Mississauga/Sheridan […]
- Super Pikix [Caanoo, Sound] Super Pikix is free VJ software for the portable console Caanoo, created by Piklipita, a duo of visual artists based in London, UK and Wroclaw, Poland. The Caanoo is an open-source, Linux-based handheld video game console and media player (wikipedia) priced at about US $150. The Caanoo fits in your pocket; it includes a touch screen, is powered by a rechargeable battery, and, in addition to SD card storage, offers composite video out (PAL and NTSC supported). The Super Pikix software lets you blend two video layers together, save and load playlists, and apply special effects and filters, with a video output resolution of 320x240 pixels. Piklipita is also the author of VJ tools for PlayStation 2 and Nintendo GBA, with PIKI900, VJ software for the Nokia N900, soon to come. On the iOS platform, Piklipita has collaborated with Lingouf to create kovoclak and runkovo. Some of Piklipita's VJ performances are attached below. [found […]
- Euphorie [Events, openFrameworks] We have already covered the work of the 1024 collective, but most recent is the incredible Euphorie, a performance installation held at La Lodge in Paris in February 2010. Euphorie was born from the desire to develop a project based on the interaction between video, sound, movement and the accidents of life – a 40-minute theatrical performance powered by low-tech tools. See the video below. The team included François Wunschel + Fernando Favier, produced by 1024 with the support of ARCADI. 1024 architecture is a company created by Pier Schneider and François Wunschel, both co-founders of the EXYZT collective. 1024 architecture focuses on the interaction between body, space, sound, visual, low-tech and hi-tech, art and […]
Posted on: 20/07/2012