In February 2010, the Red Bull Music Academy pitted Warp Records against Ninja Tune in a Soundclash on a 3D sound system, staged in the Loading Bay of the Royal Albert Hall. FIELD developed a generative real-time application especially for this event, which motion designers Quayola and Thomas Traum used to design and perform sound-reactive visuals for the sets of Plaid, Clark, Mira Calix and many more.
The sound-reactive visuals spanned five screens, in line with the immersive sound setup. 3D shapes rendered in real time, animated textures and shaders, and mouse-controlled camera motion allowed for a huge range of styles and endless flights through an abstract universe. With the same tool, Thomas Traum designed the title sequences that announced each artist in the Soundclash.
The Soundclash app is a standalone multi-window renderer that spans several screens with one huge canvas. The numerous scenes were prepared in a separate Editor application based on an EMF model, which allowed the team to control a long list of parameters in detail and save the settings to an XML file. This left our hands free during the show to control the Soundclash application with keys and mouse, using the PeasyCam and Minim libraries for Processing for camera control and sound interaction. Thomas Traum and Quayola designed over 40 different scenes with textures and animated GLSL shaders using the Editor tool. The application was built in collaboration with Minivegas, commissioned by Nexus Productions and onedotzero. See full credits here.
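The sound-interaction pattern described above – analysed audio driving scene parameters, smoothed so the visuals don't flicker frame to frame – can be sketched roughly as follows. This is an illustrative Python sketch, not FIELD's Processing/Minim code; all class and parameter names are hypothetical.

```python
# Illustrative sketch (not FIELD's code): map audio FFT band energies to
# visual parameters, with exponential smoothing to tame frame-to-frame jitter.

class BandMapper:
    def __init__(self, n_bands, smoothing=0.8):
        self.smoothed = [0.0] * n_bands
        self.alpha = smoothing  # 0 = no smoothing, 1 = frozen

    def update(self, band_energies):
        # Blend each new band reading into a running average.
        for i, e in enumerate(band_energies):
            self.smoothed[i] = self.alpha * self.smoothed[i] + (1 - self.alpha) * e
        return self.smoothed

    def param(self, band, lo, hi):
        # Map a smoothed band energy (assumed 0..1) onto a parameter range.
        v = max(0.0, min(1.0, self.smoothed[band]))
        return lo + v * (hi - lo)

mapper = BandMapper(n_bands=3)
mapper.update([1.0, 0.5, 0.0])     # one frame of analysis data
scale = mapper.param(0, 1.0, 4.0)  # e.g. drive a shape's scale from the bass band
```

In a live setting the `update` call would run once per frame from the audio analyser, and each visual parameter would pull its value through `param`.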
You can see more images from the event on FIELD’s flickr.
- onedotzero – Call for Submissions! [Events] onedotzero are seeking innovative short films, installations, interactive work and live audiovisual performances to showcase at the BFI Southbank, London, UK, 10-14 November 2010. The five-day festival is the first stop on onedotzero's extensive worldwide network of events. It is a fantastic opportunity to get your work seen by a like-minded, connected and creative international community. This year, as part of the annual international tour of festivals and events, the team will be running a brand new category entitled 'code warriors'. They are looking for individuals and organizations that use code creatively in their production of short films, installation pieces or live audiovisual performances to submit their work, and the most imaginative examples will be chosen to be part of the events. For more information on how to submit, visit http://www.onedotzero.com/submissions The extended deadline for receiving entries is 16th July 2010, 5pm. Submit here This year's festival premieres at the BFI Southbank, London, UK, 10-14 November 2010 before touring internationally. To keep track of all the events related to CAN, see our Event Calendar or subscribe with iCal […]
- Dokfest Forest Identity [Processing] For the 26th edition of the Kassel Documentary Film and Video Festival, FIELD designed an identity based on the festival’s film submission database. Set in a thick and obscure forest like the wooded surroundings of Kassel, the colourful spheres form a sculptural representation of the programme – each of them represents a film, video, or installation work shown at the festival. A unique structure emerges from the forest when hundreds of these individual objects come together – just as the festival brings together artists and visitors from all over the world, regional talent and established filmmakers, professionals and interested locals. Each film is represented by a sphere, with the size showing the length of the work. When two films coincide in all 3 parameters, meaning their spheres would sit in the same position, they cluster around this position like grapes on a vine. A generative colour palette assigns a unique shade to each represented work, which it keeps throughout all diagrams. The forest in the images was rendered using LuxRender and took about 8 hours on a large Amazon EC2 instance. Geometry was generated in a custom Processing application and then imported into Blender. See images below + make sure you visit field.io for more wonderful work by the London-based studio. For more great Processing projects on CAN, see […]
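The clustering rule described above can be sketched as follows. This is a hypothetical Python illustration (the actual identity was generated in Processing, and the data format here is invented): films whose parameters coincide share a position, so they are fanned out around that shared point like grapes.

```python
# Hypothetical sketch of the grape-clustering rule: films with identical
# positions are grouped, then offset on a small circle around the shared point.
import math
from collections import defaultdict

def place_spheres(films):
    """films: list of (position_tuple, length). Returns placed spheres."""
    groups = defaultdict(list)
    for pos, length in films:
        groups[pos].append(length)

    spheres = []
    for pos, lengths in groups.items():
        n = len(lengths)
        for i, length in enumerate(lengths):
            # Cluster members fan out around the shared position; a lone
            # film stays exactly at its own point.
            angle = 2 * math.pi * i / n
            r = 0.0 if n == 1 else 1.0
            x, y, z = pos
            spheres.append({
                "center": (x + r * math.cos(angle), y + r * math.sin(angle), z),
                "radius": length ** 0.5,  # sphere size grows with film length
            })
    return spheres

spheres = place_spheres([((0, 0, 0), 9), ((0, 0, 0), 4), ((5, 5, 5), 16)])
```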
- onedotzero // BFI // London [Processing, Events] Wieden+Kennedy were commissioned by onedotzero to create a visual identity and interactive installation for their upcoming "adventures in motion" event this September at BFI Southbank. Together with Karsten Schmidt (aka toxi), a UK-based computational designer, they created a Processing application that collects conversations around onedotzero from the web (Twitter, Flickr, Vimeo, Facebook and blogs) and generates the onedotzero identity. The software has also been used to generate print shots and the team is considering making it available for download and even open-sourcing it. The team is currently busy working on an interactive installation for the festival, to start very soon. The installation will be projected over 50m on the facade of BFI Southbank and will also allow you to control the projections using your phone. More information about onedotzero // 9th-13th September 2009 // London // BFI More about Karsten Schmidt See also Karsten's set on Flickr onedotzero is a contemporary digital arts organisation with a remit to promote innovation across all forms of moving image and motion arts. Activities encompass public events, artist + content development, publishing projects, education, production, creative direction, and related consultancy services. […]
- Geometry, Textures & Shaders with Processing – Tutorial From custom geometry to adding textures to 2D and 3D shapes, Amnon Owed shows you practical examples of a number of crucial building blocks for 2D/3D Processing […]
- onedotzero: 23–27 November, BFI London [Events] Photographed by James Medcraft This November onedotzero will once again take over the British Film Institute (BFI) and showcase the diverse array of the very latest in visual creativity, via five days of expertly curated compilation screenings, feature films, exhibitions and installations, live audio-visual performances, bar events, education projects, presentations and panel discussions. Like last year, Creative Applications Network has once again joined forces with onedotzero, and this year we are celebrating 10 years of Processing and trying to discover what the new type of 'filmmaker' may be. Open source, the Processing project encompasses a development environment and an online community promoting software literacy within the visual arts. A specially curated highlights package of past and present works in motion, in association with CreativeApplications.net, will be shown. In addition, we'll be hosting a Q&A session. Karsten Schmidt will also be running a 2-day workshop titled "Joys of Processing". We would love you to join us. See here for information and tickets. BFI Southbank, throughout 23–27 November 2011. London. What else is going on at onedotzero: Projection Mapping: Trick or Treat? panel In association with AV:IN: 3D projection mapping is more than a recent trend across arts, entertainment and advertising, emerging as a must-have stunt for brands and events worldwide. Is this a trendy trick, or are there deeper creative values to this medium? We pull together a panel of top practitioners to debate its future. Sound and Vision talk In association with AV:IN: A panel and showcase exploring the idea of visual sound – traversing the experimental terrain between sound, space, image and form in new media, performance and art installation. Invited guests will be chaired by electronic musician/artist Robin Rimbaud aka Scanner, one of the boldest audio innovators of our time.
new british talent 11 A captivating showcase of fresh work by the UK's brightest sparks and styles in animation and indie filmmaking today. From students and recent graduates to independent studio upstarts, this selection spans genres, from comedy to documentary, live action and animation, from Britain's finest crop of new talent! + more.. (PDF Download) Festival identity by UVA Previously on CAN: onedotzero // BFI // London [Processing, Events]: Visual identity... onedotzero.app [Processing]: App used for onedotzero festival ... 'Decode' identity by @V_and_A + @onedotzero + @toxi gets […]
- Zef & Santo – 3D Real-Time Performance, Workflow and Collaboration Even when judged against its usual high standards, MUTEK 2012 was a stellar year for AV performance. In addition to the A/Visions program, there were a number of other noteworthy shows, screenings and installations that reinforced the prominence of real-time graphics and 'cinematic ambience' across the festival. Whether it was Jeff Mills' figure poised over his 909 against the backdrop of a massive projection of the moon, Robert Henke and Tarik Barri's audiovisual interpretation of the recent ethereal-but-groovy Monolake LP Ghosts, or the immersion and impeccable curation of Recombinant Media Labs' CineChamber – multimedia collaboration was everywhere. One of the highlights of the festival was undoubtedly the I Dream of Wires modular synthesizer showcase that took place in the Satosphere, a huge dome hardwired for 3D projections that is permanently installed atop the Société des arts technologiques (SAT). While the whole evening was great (Clark's set felt like partying in a near-future rap video), the set by veteran American producer Keith Fullerton Whitman and visualist duo Zef & Santo was delightfully weird. Playing out as some kind of demented 8-bit hall of mirrors, Zef & Santo's glitched-out geometric machinations perfectly complemented Whitman's analog improvisation. I recently caught up with Zef & Santo to learn more about their intricate 3D projection workflow. What are the challenges in working in 3D versus traditional projection contexts? Zef: Performing visuals in the Satosphère has its own particular challenges. It's a rather unique place for visuals – a full 360º x 210º dome surface which completely envelops the audience. A special approach is needed for the visuals because feeding standard video resolutions directly into the dome severely distorts the content.
Another special challenge is the high resolution required for the dome, which needs a 2240x2240px spherically distorted video feed. The potential of this permanent installation really shines when the content is created as a 3D environment, since the dome can accurately represent this environment to the audience. Using this technique, it is possible for the audience to actually lose perception of the dome's surface, having it replaced by the perceived effect of being within an alternate 3D virtual space. One challenge with working in 3D at such high resolutions is that render times can be extremely long; this is where using a real-time rendering engine is extremely useful. Thanks to the fisheye for Unity 3D project, it is possible to output the spherical map needed for the dome directly from within the Unity game engine in real time, bypassing the need for any offline rendering. Santo: We love the mix between hi-fi and lo-fi; it gives our output a more human feel and the sense that it is being created in the moment, versus in a studio, rendered for three weeks by some huge render farm. In one of the first emails we exchanged you mentioned a "complex chain" of devices and software for digital and analog signal processing. Could you describe your kit and general workflow? Z: We are two visual artists working on two separate machines in tandem to create the final output. Santo creates live visuals using actual objects and lights captured by a real-world camera and fed into Resolume, where he processes the visuals even further. His output is then captured into VDMX on my machine and piped into Unity 3D using Syphon, to be used as textures on the objects within the scene. VDMX is also used to parse incoming OSC messages received from a Lemur touch interface, enabling full control of key parameters of the 3D environment. S: We like to change the workflow between shows because we find doing the same thing twice extremely boring.
As mentioned above, I had a laptop with some sound-reactive content feeding into a small analog TV being filmed by an HD camera. This was captured on a PC and processed in Resolume. This live content was placed into a multichannel content matrix (within Resolume) in order to be able to mix different textures with different elements. This matrix of content is sent in 1080p back to Zef to texturize various components (background, main object, secondary objects etc.). Z: This is a good moment to note my gratitude for the invaluable work done by Paul Bourke on his Unity 3D fisheye project, Anton Marini (vade) and Tom Butterworth for developing Syphon, and Brian Chasalow for developing and maintaining the Unity 3D plugin for Syphon. I'd love to hear you describe your accompaniment of Keith Fullerton Whitman's music. What was going on conceptually in that collaboration and how do you feel it worked out? S: Keith Fullerton Whitman's music was very interesting to perform our kind of visuals to because it is totally improvised and completely analog (no bullshit!) and I think that each show is really different depending on what happens within the space. That's also how we see our work: emerging from a particular moment. When we started the show we were supposed to be receiving a live video feed from Keith, but that never happened. Keith later said "I saw that you guys didn't need it [the feed]." I think that Keith's music really inspired us and that we worked together to create an experimental journey into an analog world, both musically and visually. Z: We wanted to have a prominent analog feel for this particular show given Whitman's signal flow. Using our camera to capture audio-responsive glitches generated by a cathode-ray-tube TV, as well as not clearing the depth buffer in Unity, went quite far in giving us some nice feedback effects.
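The angular-fisheye ("domemaster") mapping the dome pipeline relies on can be illustrated roughly like this. It is a minimal Python sketch under stated assumptions, not the actual Paul Bourke/Unity plugin code: a 3D view direction is projected to normalised square image coordinates, with radius proportional to the angle from the dome's zenith.

```python
# Minimal angular-fisheye sketch: project a unit view direction (z = up,
# toward the dome zenith) to (u, v) in [0,1]^2, radius proportional to the
# angle from the zenith. The 210-degree field of view matches the dome above.
import math

def direction_to_fisheye(x, y, z, fov_deg=210.0):
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle from zenith
    phi = math.atan2(y, x)                      # azimuth around the dome
    r = theta / math.radians(fov_deg / 2.0)     # 0 at zenith, 1 at the rim
    u = 0.5 + 0.5 * r * math.cos(phi)
    v = 0.5 + 0.5 * r * math.sin(phi)
    return u, v

# The zenith lands in the centre of the (e.g. 2240x2240) square frame:
u, v = direction_to_fisheye(0.0, 0.0, 1.0)
```

Rendering the full frame means evaluating this mapping (in reverse) per pixel, which is why a real-time engine outputting the spherical map directly is such a win over offline rendering.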
Zef & Santo | Zef | Santo MUTEK | See also: Just another day at the lab: MUTEK A/Visions 2012 SAT See also: Zef & Santo's visuals for Pole, from an April SAT […]
- Strata #3 – Bordeaux [Processing] The Strata project by Quayola consists of a series of films, prints and installations investigating improbable relationships between contemporary digital aesthetics and icons of classical art and architecture. Combining 3D rendering with a Processing application, the film applies Delaunay triangulation to photographic stills; the resulting 3D mesh was then fed into a 3D software package and rendered separately (video below). The term Strata defines a geological formation made of multiple layers of rock. Each one of these layers has its own individual characteristics and history, which combined produce beautiful and unique formations… Commissioned by Evento and Lumin for the Bordeaux Art & Architecture Biennale. Credits Concept: Quayola Sound: Plaid Photography: James Medcraft 3D Animation: Robin Lawrie Custom Software: Mauritius Seeger Assistants: Kieran Gee-Finch, Colin Johnson See also: Strata #2 - Paris Strata #1 - […]
- Face Pong [Processing] It's just like PONG, but with your face. Matt writes: I put this little game together to play with the idea of face tracking as an input device. I'm using the OpenCV library for Processing, with all sounds made in CFXR. I'm mapping the centre of the largest face to be the paddle, but I'm also smoothing it out over a few frames because of the OpenCV jitter. It's a really weird immersive feeling. After a minute or so you really get the sense that the paddle is an extension of you. Very cool, very weird. Matt Ditton does a number of different things: programming, photography, indie games development, tech art, environment modelling, university lecturing, DVA study. To find out more, see thequietvoid.com and his flickr here Face […]
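The frame smoothing Matt describes can be sketched as a simple moving average over the last few detections. This is a hypothetical Python illustration of the idea, not his actual Processing/OpenCV code.

```python
# Hypothetical sketch: average the detected face centre over the last few
# frames before using it as the paddle position, to damp detector jitter.
from collections import deque

class FaceSmoother:
    def __init__(self, window=5):
        self.history = deque(maxlen=window)  # keeps only the newest detections

    def update(self, x, y):
        # Push the newest detection and return the windowed average.
        self.history.append((x, y))
        n = len(self.history)
        return (sum(p[0] for p in self.history) / n,
                sum(p[1] for p in self.history) / n)

smoother = FaceSmoother(window=3)
smoother.update(100, 50)
smoother.update(110, 50)
paddle_x, paddle_y = smoother.update(120, 56)  # averaged over the last 3 frames
```

A small window (a few frames) keeps the paddle responsive while filtering out single-frame detection noise; a larger window would feel laggy in a game like this.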
Posted on: 27/04/2010
Posted in: Processing
- Freelance Interactive Producers at Psyop
- Senior Digital Designer at CLEVER°FRANKE
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- 3D Technologist at INDG
- Creative Director at INDG
- Lead Developer at INDG
- Web Developer at &Associates
- Creative Technologist at Rewind FX
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific