Created by Avoka Production, DIFLUXE allows us to observe and interact with a world of living beings. On screen, particles wander, forming groups similar to schools of fish. Visitors are invited to upset the balance of this microcosm by placing red or blue plexiglas discs on the surface of the screen.
By interacting with the installation, visitors reveal the nature of invisible forces. Half of the particles are desperate to join the red discs while avoiding the blue ones; the rest behave in exactly the opposite way. Visitors can observe the chaos generated by their actions in the ceaseless movement of the particles as they search for an equilibrium state.
The purpose of the installation is to experience duality in a living system: certain situations create confusion such that the particles never recover their balance. Their behaviours also mirror human nature and our permanent dissatisfaction. Each particle is endowed with a free will that evolves over time; it may be attracted to blue, attracted to red, or remain completely independent.
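The two-group rule described above, one group attracted to red and repelled by blue, the other inverted, can be sketched as a simple force computation. This is an illustrative guess at the idea, not Avoka's actual code; every name (`dualityForce`, `Vec2`) and the inverse-square falloff are assumptions:

```cpp
#include <cmath>
#include <vector>
#include <cassert>

struct Vec2 { float x, y; };

// Force on a particle of the "red-loving" group: attracted to red
// discs, repelled by blue ones. The opposite group flips the sign.
Vec2 dualityForce(const Vec2& p,
                  const std::vector<Vec2>& redDiscs,
                  const std::vector<Vec2>& blueDiscs,
                  bool lovesRed)
{
    Vec2 f{0.0f, 0.0f};
    auto accumulate = [&](const std::vector<Vec2>& discs, float sign) {
        for (const auto& d : discs) {
            float dx = d.x - p.x, dy = d.y - p.y;
            float dist = std::sqrt(dx * dx + dy * dy) + 1e-6f;
            // normalized direction with inverse-distance falloff
            f.x += sign * dx / (dist * dist);
            f.y += sign * dy / (dist * dist);
        }
    };
    float s = lovesRed ? 1.0f : -1.0f;
    accumulate(redDiscs, s);    // attraction for one group
    accumulate(blueDiscs, -s);  // repulsion, inverted for the other
    return f;
}
```

Summing both forces each frame and flipping the sign per group is enough to reproduce the never-settling tug-of-war the text describes when a red and a blue disc are placed close together.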
Hardware: Mac Mini, Xbox webcam, 42″ LCD screen, plexiglas discs and screen border.
Software: Custom software written in C++ with the Cinder library, including OpenCV, OSC, and a link to SuperCollider. A SuperCollider patch, controlled by the main C++ program via OSC, generates the interactive audio soundtrack.
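Since the SuperCollider patch is driven over OSC, it may help to see what an OSC message actually looks like on the wire. The sketch below hand-encodes a single-float message following the OSC 1.0 spec (NUL-padded address, type-tag string, big-endian float argument); in practice Cinder's OSC support or any OSC library does this for you, and any address such as "/particles/energy" would be chosen by the application:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>
#include <cassert>

// Append a string padded with NULs to a 4-byte boundary (OSC 1.0 rule).
static void appendPadded(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back(0);
    while (buf.size() % 4 != 0) buf.push_back(0);
}

// Encode a single-float OSC message, e.g. address "/a" with value 1.0.
std::vector<uint8_t> encodeOscFloat(const std::string& address, float value) {
    std::vector<uint8_t> buf;
    appendPadded(buf, address);
    appendPadded(buf, ",f");            // type tag: one float argument
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    buf.push_back(uint8_t(bits >> 24)); // big-endian per the spec
    buf.push_back(uint8_t(bits >> 16));
    buf.push_back(uint8_t(bits >> 8));
    buf.push_back(uint8_t(bits));
    return buf;
}
```

The resulting buffer is what gets written into a UDP datagram; SuperCollider's `OSCdef`/`OSCFunc` responders decode the same format on the receiving end.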
More info: avoka.fr
- The Company [Cinder] "The Company" is the latest project by Andrea Cuius and Roland Ellis, commissioned by Bring To Light Festival NYC. A suspended surface of 76 tungsten lamps forms a catenary arch, playing host to live performances and revisiting the sounds of the 19th-century East River industrial icons. The piece intends to bring back an atmosphere informed by the architectural legacy: a machine delivered to occupy a space that was once a bustling industrial environment. By either producing sounds or simply reacting to inputs from the environment, The Company is a sound-reactive light installation. The software was developed in Cinder and works with audio sampled in real time. The sound analysis is computed in Ableton Live using a Max for Live patch developed by Henrik Ekeus; it performs the Fast Fourier Transform, beat detection, attack detection and sound filtering, communicating with the custom software over an OSC connection. Andrea Cuius is a creative coder born in Italy and now living in London; he works with different technologies to create connections between objects, audiences and environments, engaging the audience with data sampled from the environment to create immersive and provoking experiences. Andrea has worked with some of the most prestigious companies and collectives in the UK, such as rAndom International, United Visual Artists and Cinimod Studio, contributing to the development of large-scale art installations and architectural projects. Project Page See also: So.. I was at a party last night [Cinder] and Hyundai i40 Reveal [openFrameworks, […]
- First experiments with Leap Motion and Cinder After months of everyone sharing the Leap Motion demo video, the first Developer Kits are making their way into the hands of those that signed up early. Dofl Yun was one of the few to receive one last week, a Leap Motion Dev Board (v.04), and with some help from Robert Hodgin and Andrew Bell with the setup, he has shared his early progress. For those that do not know, Leap Motion is a 3D depth camera much like the Microsoft Kinect, except that it provides much better precision. It aims to represent an entirely new way of interacting with your computer. It's more accurate than a mouse, as reliable as a keyboard and more sensitive than a touchscreen. For the first time, you can control a computer in three dimensions with your natural hand and finger movements. Experiment I: Spaceship Racer Prototype This prototype demonstrates controlling the camera's position in 3D space by tracking two hands. -- Experiment II: Mesh Builder This prototype demonstrates how to add points by tracking fingertips and generate triangle meshes in 3D space by connecting those points. The company is estimating that the Leap will ship in early 2013. The experiments below were built with C++ and Cinder. Reference Links Leap Motion: https://leapmotion.com Cinder: http://libcinder.org Here are links to download the prototype apps for Mac (of course you will need a Leap Motion to test them). http://www.thedofl.com/data/spaceship_racer.zip http://www.thedofl.com/data/mesh_creator.zip Follow the progress on Dofl's […]
- 10 Most Exciting New Experiments with Leap Motion Once we take a step back from our screens and look at exciting new opportunities Leap Motion provides, we may discover and begin to describe new ways of computer-human interaction. Here are our top 10 […]
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows realtime manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it at the same time, a form of AR, though not necessarily a typical one. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation on large installations could feel more natural. Likewise, other applications could include a music-creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool – the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
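The "translation of touch input coordinates to the captured video feed" amounts to applying a planar homography recovered from marker tracking. A minimal sketch, assuming a row-major 3x3 matrix H, which in a real system would come from OpenCV's marker detection and homography estimation rather than being hard-coded:

```cpp
#include <array>
#include <cassert>

// Map a touch point (x, y) in the phone's camera image to remote
// screen coordinates via a 3x3 homography H (row-major).
std::array<float, 2> mapTouch(const std::array<float, 9>& H, float x, float y) {
    float X = H[0] * x + H[1] * y + H[2];
    float Y = H[3] * x + H[4] * y + H[5];
    float W = H[6] * x + H[7] * y + H[8];
    return { X / W, Y / W }; // perspective divide back to 2D
}
```

With H re-estimated every frame from the tracked markers, the touch point stays glued to the remote surface even as the phone moves, which is what creates the illusion of directly manipulating a distant screen.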
- Growth II [Cinder] Growth II is the Cinder port of a Processing app created by Job van der Zwan, an art student at Academie Minerva in Groningen. I have included both the Cinder version and the original Processing code for comparison. He does note that the quality of the video below isn't that great, although you can still download the original source video from Vimeo. Please also note that this is all work in progress, with depth blur and more to come. You can follow the development on the Cinder forum. Job writes: The app originally grew out of an idea to demonstrate the workings of evolution with a particle system, or "paintblobs". The idea was that the "brushstrokes" would evolve in color according to very simple fitness criteria, with red accounting for vitality, green for fertility and blue for genetic stability. Along the way I got distracted and added all kinds of simple aesthetic things, like smooth movement, a fade effect, the paintblobs growing as they live longer without being eaten by other blobs, etc. By now the original idea is all but invisible and it's just me tweaking settings, resulting in different behaviour of the simulation as a whole. Cinder is beginning to gain momentum. I constantly hear from artists and developers that they are playing with it. Although they are still keeping their experiments to themselves, expect a lot more Cinder projects to appear on the web shortly. As with any other new framework, the learning curve can be somewhat steep. If you are interested in playing with Cinder, check out this great tutorial by Robert Hodgin aka […]
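Job's fitness criteria (red for vitality, green for fertility, blue for genetic stability) suggest an inheritance step along the following lines. The formula is purely illustrative and not the original app's logic; `inherit` and the damping rule are invented for the sketch:

```cpp
#include <algorithm>
#include <cassert>

struct Color { float r, g, b; }; // r: vitality, g: fertility, b: stability

// Offspring inherits the parent's color plus a mutation whose
// magnitude is damped by the blue channel ("genetic stability"):
// a fully stable lineage (b == 1) does not drift at all.
Color inherit(const Color& parent, float mutation) {
    float m = mutation * (1.0f - parent.b);
    auto clamp01 = [](float v) { return std::min(1.0f, std::max(0.0f, v)); };
    return { clamp01(parent.r + m), clamp01(parent.g + m), clamp01(parent.b + m) };
}
```

In the real simulation the mutation would be a small random value per channel; tying its magnitude to blue is one plausible reading of "genetic stability" as a fitness criterion.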
- Woods – Structured landscape of responsive light by Nocte Created by Nocte, a collaboration between Andrea Cuius-Boscarello and Hannelore Leisek, Woods is a responsive light installation commissioned by Heather Eddington, artistic director of the State of Flux DanceFilm Company, for their Samuel Beckett Theatre Trust Awards 2013 finalist performance A Study of Who. By using different lighting setups and dispositions for each consecutively revealed element, every scene of the choreography is accentuated in its various settings. The installation, comprising 30 unique handmade redwood anglepoise lamps with classic tungsten light bulbs, gradually emerges from the ground, building a structured landscape of responsive light and taking the spectator through the emotional and physical journey of the performer's flowing display of grief. The angle of the hanging light bulbs and the crossed placement of the lamps in a curved arrangement, directing the visual impression of the scenery, create an interplay between light and shade. The sequenced installation, building the setting and following the motion of the story, provides a consistent spatial response for the viewer. The installation is controlled by custom software developed with Cinder. It implements the effects, both pre-visualising the show on screen and outputting the DMX signal that controls the lamps. The system is designed with two computers connected over OSC. The main machine is a generic engine that executes custom effects. The other machine typically runs Ableton Live to pre-sequence and perform the show. In addition, they are using a Max for Live module, designed by Henrik Ekeus, to analyse the audio in real time and send back the FFT data over OSC. Some of the effects are audio responsive; one effect connects each lamp to a specific frequency. Another effect uses the FFT to draw abstract outlines and trigger the lamps enclosed within its shape.
The Ableton Live session view, transport and controls are reflected in their software and are used to trigger effects, tweak parameters and control the other elements of the show. For each effect they developed individual Max for Live modules to add custom parameters and presets. To make different use of the space, each effect has its own behaviour. Some of the effects were created specifically on-site so the team could assimilate them better within the venue. Others are based on sound, and others again on 3D objects moving within the space to ultimately trigger the lights. The team's workflow additionally relied on 3D software such as Cinema 4D, which they used to sketch ideas, finalise designs and export data for the custom software to reconstruct the scene. It is also used as a reference and to export other objects used by the effects. Part of the code the team developed is also available on Github: Ableton Live Cinder block, DMX Usb Pro Cinder block, Ableton Live OSX Python script Project […]
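The "one lamp per frequency" effect can be imagined as slicing the incoming FFT into equal bands, one per lamp, and converting each band's average magnitude into a DMX level. A hedged sketch of that mapping; the bin slicing and the 0-255 scaling are entirely assumed, not taken from Nocte's software:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>
#include <cassert>

// Average each lamp's slice of FFT magnitude bins into a 0-255 DMX
// channel value (DMX512 channels are single bytes), clamping at 255.
std::vector<uint8_t> fftToDmx(const std::vector<float>& fft, size_t lampCount) {
    std::vector<uint8_t> dmx(lampCount, 0);
    size_t binsPerLamp = fft.size() / lampCount;
    for (size_t i = 0; i < lampCount; ++i) {
        float sum = 0.0f;
        for (size_t b = 0; b < binsPerLamp; ++b)
            sum += fft[i * binsPerLamp + b];
        float avg = sum / float(binsPerLamp);
        dmx[i] = uint8_t(std::min(255.0f, avg * 255.0f));
    }
    return dmx;
}
```

With 30 lamps, a 512-bin FFT gives each lamp a band of 17 bins; low frequencies light the first lamps and high frequencies the last, so the arch visibly sweeps with the spectrum of the music.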
- Silent [C++, Cinder] Created by Chandler McWilliams, Silent is a two-minute video made by combining frames from five classic silent films: Metropolis, Faust, Nosferatu, Holy Mountain, and The Dragon Painter, put to the music of Charles Ives' Hallowe'en. The frames are chosen by custom software that compares data from each film's soundtrack with data from Ives' music. Made using Cinder, a C++ framework, the custom software analyses each film and records the audio (FFT) data and timecode for each frame. The final video is generated by processing an input soundtrack, in this case Hallowe'en, and finding the frames of film whose audio best fits that of the soundtrack. Silent films were chosen as the source material because of the tight connection between their narrative, visuals, and musical score. By using the soundtrack as the central driver of the visual imagery, Silent inverts these relationships. This reversal allows forms typically associated with music (repetition, rhythm, movement) to express themselves visually. Cinder is a C++ framework developed by the Barbarian Group. For more information, including Chandler's other projects, see brysonian.com See also Cymatic Ripple [C++, […]
- Dazzled [Cinder, MaxMSP] Created by David Dalmazzo, the Dazzled Project is an attempt to compose a generative particle environment that can create structures and sounds at the same time. The application uses both MaxMSP and Cinder via an OSC bridge, allowing sounds generated in Max to be fed directly into the Cinder app, which generates the visuals. I would like to program patterns and physics simulations with the aim of composing musical structures that have a direct representation in a formal shape. One of the influences for this project was some of Robert Hodgin's examples, like Solar Rework. But in this case the idea is not to have sound-reactive visuals, but visuals that create generative sound and music compositions. David writes that the videos below are just the first part of the project. He is also planning to add rhythmic patterns based on constant rebounds or elastic connections between particles. The Dazzled Project was supported by the Generalitat de Catalunya. Project Page David Dalmazzo is a musician and digital visual artist oriented towards interactive audiovisual composition, focused on live performance and dedicated to the investigation of computational tools that contribute narrative and compositional elements to the scenic arts. See also INSCT [vvvv] by @timpernagel and audionerve.de […]
Posted on: 23/03/2012
Posted in: Cinder