“The Company” is the latest project by Andrea Cuius and Roland Ellis, commissioned by the Bring To Light Festival NYC. A suspended surface of 76 tungsten lamps forms a catenary arch, playing host to live performances and revisiting the sounds of the East River's 19th-century industrial icons.
The piece intends to bring back an atmosphere informed by the area's architectural legacy: a machine delivered to occupy a space that was once a bustling industrial environment. Whether producing sound on its own or reacting to inputs from its surroundings, The Company is a sound-reactive light installation.
The software was developed in Cinder and samples audio in real time. The sound analysis is computed in Ableton Live using a Max For Live patch developed by Henrik Ekeus; it performs the Fast Fourier Transform, beat detection, attack detection and sound filtering, communicating with the custom software over an OSC connection.
Andrea Cuius is a creative coder born in Italy and now living in London. He works with different technologies to create connections between objects, audiences and environments, engaging the audience with data sampled from the environment to create immersive and provocative experiences. Andrea has worked with some of the most prestigious companies and collectives in the UK, such as rAndom International, United Visual Artists and Cinimod Studio, contributing to the development of large-scale art installations and architectural projects.
- Woods – Structured landscape of responsive light by Nocte Created by Nocte, a collaboration between Andrea Cuius-Boscarello and Hannelore Leisek, Woods is a responsive light installation commissioned by Heather Eddington, artistic director of the State of Flux DanceFilm Company, for their Samuel Beckett Theatre Trust Awards 2013 finalist performance A Study of Who. By using a different lighting setup and arrangement for each consecutively revealed element, every scene of the choreography is accentuated in its various settings. The installation, comprising 30 unique handmade redwood anglepoise lamps with classic tungsten light bulbs, gradually emerges from the ground, building a structured landscape of responsiveness and light that takes the spectator through the emotional and physical journey of the performer's flowing display of grief. The angle of the hanging light bulbs and the crossed placement of the lamps along a curve direct the visual impression of the scenery and create an interplay between light and shade. Sequenced to build the setting and follow the motion of the story, the installation provides a consistent spatial response for the viewer. The installation is controlled by custom software developed with Cinder, which implements the effects both to pre-visualise the show on screen and to output the DMX signal that controls the lamps. The system is designed around two computers connected via OSC: the main machine is a generic engine that executes custom effects, while the other typically runs Ableton Live to pre-sequence and perform the show. In addition, a Max4Live module designed by Henrik Ekeus analyses the audio in real time and sends the FFT data back over OSC. Some of the effects are audio-responsive: one connects each lamp to a specific frequency, while another uses the FFT to draw abstract outlines and trigger the lamps enclosed within its shape.
The Ableton Live session view, transport and controls are reflected in the software and are used to trigger effects, tweak parameters and control the other elements of the show. For each effect they developed an individual Max4Live module to add custom parameters and presets. To make different use of the space, each effect has its own behaviour: some were created on-site so they could be better integrated within the venue, others are based on sound, and still others on 3D objects moving within the space to trigger the lights. The team's workflow additionally relied on 3D software such as Cinema 4D, which they used to sketch ideas, finalise designs and export data for the custom software to reconstruct the scene; it also serves as a reference and is used to export other objects used by the effects. Part of the code the team developed is also available on GitHub: an Ableton Live Cinder block, a DMX Usb Pro Cinder block, and an Ableton Live OSX python script. Project […]
- “Difluxe” by Avoka Production – Upsetting the balance of a microcosm / Cinder Created by Avoka Production, DIFLUXE allows us to observe and interact with a world of living beings. On screen, particles wander and form groups similar to schools of fish. The visitor is invited to upset the balance of this microcosm by placing red or blue plexiglass discs on the surface of the screen, revealing the nature of invisible forces: half of the particles are desperate to join the red discs while avoiding the blue, while the rest have the opposite behaviour. Visitors can observe the chaos generated by their actions in the ceaseless movement of particles seeking an equilibrium state. The purpose of the installation is to experience duality in a living system; certain situations create confusion such that the particles never recover their balance. Their behaviours also reflect human nature and our permanent dissatisfaction: each particle is endowed with a free will that evolves over time, making it attracted to blue, attracted to red, or completely independent. Hardware: Mac Mini, Xbox webcam, 42" LCD screen, plexiglass discs and screen border. Software: custom software written in C++ with the Cinder library, including OpenCV, OSC, and a link to SuperCollider; a SuperCollider patch, controlled by the main C++ program via OSC, generates the interactive audio soundtrack. More […]
- Dazzled [Cinder, MaxMSP] Created by David Dalmazzo, Dazzled Project is an attempt to compose a generative particle environment that can simultaneously create structures and sounds. The application uses both MaxMSP and Cinder connected via an OSC bridge, allowing sounds generated in Max to be fed directly into the Cinder app, which generates the visuals. Dalmazzo writes that he would like to program patterns and physics simulations with the aim of composing musical structures that have a direct representation in a formal shape. One of the influences for the project was work by Robert Hodgin such as Solar Rework; in this case, however, the idea is not to have sound-reactive visuals, but visuals that create generative sound and music compositions. David writes that the videos below are just the first part of the project; he is also planning to add rhythmic patterns based on constant rebounds or elastic connections between particles. Dazzled Project was supported by the Generalitat de Catalunya. Project Page David Dalmazzo is a musician and digital visual artist oriented toward interactive audiovisual composition, focused on live performance and dedicated to the investigation of informatics tools that contribute narrative and compositional elements to the scenic arts. See also INSCT [vvvv] by @timpernagel and audionerve.de […]
- egregore [Pure Data, Sound] "Égrégore" means an energy produced by the desires of many individuals toward a common goal. This is the starting point of this audiovisual performance, which aims to exploit group-movement phenomena. "Complex and expressive behaviours are generated and controlled by a computer and transcribed in sound and image. A crowd of particles deploys itself, reorganises, and blends into living structures of varying coherence, evolving from chaotic movement toward a cohesive group. This project is a continuation of chdh's work on audiovisual instruments, but aims to radicalise the search." chdh is a collective founded in 2000 by Cyrille Henry, Nicolas Montgermont and Damien Henry. Their work explores behaviour, image and sound, in particular the possibilities offered by Pure Data and its graphics library Gem. See more screenshots here | chdh.free.fr Supported by: Césaré – national centre of musical creation, Reims (development residency); iMAL – centre for digital cultures and technology, Brussels (development residency). (thanks […]
- Drawing Water [Cinder] Drawing Water by David Wicks is a constructed landscape created by the relationship between rainfall and water consumption in the United States. The application, made in Cinder, uses water consumption data to build images that expose the reality that water is channeled, pumped, and siphoned to locations far from where it falls. Although the paths are imagined, the project is based on real data and attempts to reveal the truth about water resources and use. Drawing Water plays a bit upon the 19th-century theory that "rain follows the plow." At the time of its inception, that theory promoted Westward expansion, under the belief that plowing fields encouraged cloud formation and rainfall: as long as people plowed fields, they believed, water would come to them. Although we now recognize that climatological reality isn't influenced by our farming (in the manner hoped), Americans still live with an illusion of resource availability following need. David will be showing the prints and the interactive version of the application at UCLA as part of the D|MA thesis show on May 12. Drawing Water uses water consumption data provided by the USGS and rainfall data provided by NOAA/NWS; the data is downloaded and parsed with a series of python scripts. For the interactive version, he wrote both a PC and an iPad application: the iPad shows an overall map and timeline used to control where and when you are looking, and the two talk to each other using OSC, aided by oscpack and Hector's OSC Cinderblock. Project Page Work-in-progress video below, looking at Los Angeles and the desert […]
Posted on: 12/10/2011
Posted in: Cinder