
Nike Flyknit ‘Fit’ Installation by Universal Everything / Milan Design Week 2013



In 2012 Universal Everything were invited to help celebrate the release of Nike's special-edition Flyknit collection at the design week in Milan. The installation, seen here, is a four-sided video cube capturing a hive of activity: living threads swarm across the screens, responding to human presence. As you enter you hear the hum of a factory process, with sounds derived from macro samples of threads, interlocking repetitions, and peaks of brightness causing visual reactions around the space.

Like all of Universal Everything's work, the project thrives on collaboration: it is the work of Matt Pyke, Dylan Griffith, Chris Perry, Mike Tucker, Andreas Müller and Simon Pyke, produced by Keri Elmsly and Captain Blyth. The concept of a cube sitting at the center of the room came very early in the project, when the team first visited the space in Milan. Each side of the cube acts as a mirror with a distinct pattern that is transposed onto the visitor's silhouette.


At the beginning of the project the team used Houdini to quickly prototype various visual styles, which were eventually used to create short films for various Nike stores and the poster. For the installation they agreed to use Kinects and openFrameworks to read the movement and posture of visitors in the space; since skeleton tracking is unreliable in large groups, they instead chose to use silhouettes created from the depth map. The UE stack on site included openFrameworks 0.7.4 with ofxOpenNI for Kinect integration. They built an initial particle system class with a variety of parameters and a fairly simple set of rules: particles would be spawned in the silhouette, or move towards a point in the silhouette; particles would live for a certain amount of time; and velocity would be influenced by Perlin noise and the silhouette.
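The rules described could be sketched roughly as follows. This is a minimal, framework-free illustration, not UE's actual openFrameworks code: the names are hypothetical, the silhouette is a simple boolean mask standing in for the Kinect depth map, and a cheap deterministic function stands in for Perlin noise.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdlib>
#include <vector>

// Hypothetical sketch of the particle rules described above: particles spawn
// inside a silhouette mask, live for a fixed time, and have their velocity
// nudged by a noise field.
struct Particle {
    float x, y;    // position
    float vx, vy;  // velocity
    float life;    // seconds remaining; the particle dies at 0
};

// Stand-in for a Kinect depth-map silhouette: true where a visitor is present.
using Silhouette = std::vector<std::vector<bool>>;

// Cheap deterministic pseudo-noise standing in for Perlin noise.
float noise2d(float x, float y) {
    return std::sin(x * 12.9898f + y * 78.233f);
}

struct ParticleSystem {
    std::vector<Particle> particles;

    // Rule: spawn a particle at a random point inside the silhouette.
    void spawnInSilhouette(const Silhouette& sil) {
        for (int tries = 0; tries < 100; ++tries) {
            int px = std::rand() % static_cast<int>(sil[0].size());
            int py = std::rand() % static_cast<int>(sil.size());
            if (sil[py][px]) {
                particles.push_back({(float)px, (float)py, 0.f, 0.f, 2.0f});
                return;
            }
        }
    }

    // One simulation step: noise nudges velocity; particles age and die.
    void update(float dt) {
        for (auto& p : particles) {
            p.vx += noise2d(p.x * 0.1f, p.y * 0.1f) * dt;
            p.vy += noise2d(p.y * 0.1f, p.x * 0.1f) * dt;
            p.x += p.vx;
            p.y += p.vy;
            p.life -= dt;
        }
        particles.erase(
            std::remove_if(particles.begin(), particles.end(),
                           [](const Particle& p) { return p.life <= 0.f; }),
            particles.end());
    }
};
```

In the real piece these rules were layered and reparameterised per "look"; the sketch shows only the common skeleton of spawn, lifetime and noise-driven motion.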

Early Houdini sketches

Final openFrameworks render

Houdini render for print

Since they would be using Mac Minis with modest Intel HD 4000 GPUs, they opted for a graphically simple aesthetic, using only basic GL_LINES to draw each particle. Surprisingly, Mike Tucker explains, the CPUs were fast enough that they could simulate around 10k particles per box with little optimisation required. They gradually added more rules to the particles for each look, each new rule adding a bit of personality. For instance, one look had a cross-hatching pattern that would emerge over the silhouette; another had frantic particles targeting random spots in the silhouette. Using ofxCv, Andreas Müller implemented optical flow on the depth camera; the RGB camera was virtually useless because the space was a dark room. The data was not ideal, but usable for some of the looks in conjunction with the Perlin noise. By tweaking parameters in the prototype, they were able to establish distinct looks for each side of the cube.

Each established look grew into an individual class; they established color schemes, played with blending and gradients, and added a few more tweaks to the behaviours. They scanned the music for "bang" moments and managed those with ofxTimeline. Each side of the cube ran on a standard Core i7 Mac Mini and a 1920×1200 projector. One machine acted as the host, driving the music and sending OSC messages for the "bang" moments to the other three machines. There were a few hardware concerns (networking and USB extenders for the Kinects, and even one crash bug on launch day), but everything went smoothly in the end. LogMeIn was loaded onto each machine so the team could remotely SSH in or share the desktop from London, and a simple OS X daemon monitored the app, running it on startup and restarting it automatically after a crash. Nike had a fabrication team for the show, and the team delivered their cube designs to be built.
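The host's "bang" broadcasts would in practice go through an OSC library such as openFrameworks' ofxOsc; as a hedged sketch of what actually travels over the wire, an argument-less OSC message is just the address and an empty type-tag string, each null-terminated and padded to a multiple of four bytes (the address "/bang" here is illustrative, not taken from the project):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hedged sketch of the OSC 1.0 wire format for an argument-less "bang"
// message: address string plus "," type-tag string, each null-padded to a
// 4-byte boundary. The project used ofxOsc rather than hand-rolled encoding.
std::vector<uint8_t> encodeOscBang(const std::string& address) {
    std::vector<uint8_t> out;
    auto appendPadded = [&out](const std::string& s) {
        for (char c : s) out.push_back(static_cast<uint8_t>(c));
        out.push_back(0);                              // terminating null
        while (out.size() % 4 != 0) out.push_back(0);  // pad to 4 bytes
    };
    appendPadded(address);  // e.g. "/bang"
    appendPadded(",");      // empty type-tag string: no arguments
    return out;             // ready to send in a UDP datagram
}
```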


Simon Pyke (a long-time UE collaborator) wrote a soundtrack for the piece, very minimal with distinct peaks every 10–15 seconds. The visuals react to the music and ramp up during the peak moments, creating more abstract versions of the existing looks.
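One simple way to drive that ramp-up, sketched here as an assumption rather than UE's method, is to map the timeline's known peak times to an intensity envelope that rises as a peak approaches and decays after it:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical sketch (not UE's code): an intensity envelope over known
// "bang" times. Intensity ramps linearly up to 1.0 over `attack` seconds
// before each peak and decays back to 0 over `release` seconds after it.
float intensityAt(float t, const std::vector<float>& bangTimes,
                  float attack = 2.0f, float release = 4.0f) {
    float intensity = 0.0f;
    for (float bang : bangTimes) {
        float dt = t - bang;
        float v = 0.0f;
        if (dt < 0.0f && -dt < attack)       v = 1.0f + dt / attack;   // ramp up
        else if (dt >= 0.0f && dt < release) v = 1.0f - dt / release;  // decay
        intensity = std::max(intensity, v);
    }
    return intensity;  // 0.0 between peaks, 1.0 exactly at a peak
}
```

The looks could then interpolate toward their more abstract variants as this value approaches 1.0.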

The exhibition will evolve over time as it travels to New York, Tokyo and London through October.

Project Page


Interactive Piece: Concept and Creative Direction – Matt Pyke; Art Direction – Dylan Griffith; 3D Animation – Chris Perry; Developers – Mike Tucker, Andreas Müller; Sound Design – Simon Pyke; Executive Producer – Keri Elmsly; Producer – Captain Blyth; Film and Photography – James Medcraft

Print & Animation Pieces: Concept and Creative Direction – Matt Pyke; Art Direction – Dylan Griffith; 3D Animation – Chris Perry; Sound Design – Simon Pyke; Executive Producer – Keri Elmsly; Producer – Captain Blyth