Building on their previous installation at Coffee Kitchen, Kimchi and Chips created ‘Link’, their latest interactive installation for Design Korea 2010, where people record their stories into a cityscape of cardboard boxes.
Link was created for the event as an interpretation of ‘Convergence’, the theme of the exhibition. The team presented a convergence of complex, fast-moving technologies with low-tech, everyday materials. Furthermore, the audience is invited to take part and “can store their memories inside boxes”.
The installation comprises a number of components. To match the projections to the boxes, the team developed an iPad mapping application that lets users interactively align the projected images in the room. The app was built using openFrameworks and libmysql (see the video demo at the bottom of the post). The iPad interface for adding information was also built with openFrameworks, with two-way communication over OSC. The main mapping playback was created in VVVV with custom plugins for threaded video playback and recording (up to 80 videos playing simultaneously whilst 2 videos are being recorded), with MySQL as the database; around 3,000 recordings in total were taken during the exhibition. The team also used Adobe Flash for designing the animations.
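Two-way OSC communication like that between the iPad app and the desktop application comes down to exchanging small binary messages. As a hedged illustration (the address pattern and argument below are made up for the example, not taken from the project), a minimal OSC 1.0 encoder for a message carrying a single float looks like this:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Pad a string with NULs so its length (including the terminator)
// is a multiple of 4, as the OSC 1.0 spec requires.
static void appendPadded(std::vector<uint8_t>& buf, const std::string& s) {
    for (char c : s) buf.push_back(static_cast<uint8_t>(c));
    size_t pad = 4 - (s.size() % 4);   // always at least one NUL terminator
    for (size_t i = 0; i < pad; ++i) buf.push_back(0);
}

// Encode an OSC message with one float argument, e.g. "/box/12/level" 0.5f.
// OSC floats are 32-bit big-endian; shifting the bit pattern out byte by
// byte keeps the encoder independent of host endianness.
std::vector<uint8_t> encodeOscFloat(const std::string& address, float value) {
    std::vector<uint8_t> buf;
    appendPadded(buf, address);
    appendPadded(buf, ",f");           // type tag string: one float32
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    buf.push_back(static_cast<uint8_t>(bits >> 24));
    buf.push_back(static_cast<uint8_t>(bits >> 16));
    buf.push_back(static_cast<uint8_t>(bits >> 8));
    buf.push_back(static_cast<uint8_t>(bits));
    return buf;
}
```

In practice an openFrameworks project would use an addon such as ofxOsc rather than hand-rolling this, but the wire format underneath is the same.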
Hardware included 3 servers, each with a quad-core Core i7 (8 threads), an Nvidia GeForce 460 GTX and 8GB RAM (for caching video playback), plus 2 x Triplehead2Go, 2 x PlayStation Eye and 6 x 3000lm projectors.
See the videos below, including the making-of at the bottom of the post.
Kimchi and Chips are a cross-disciplinary art & design studio based in London and Seoul. They create installations, products and services that bridge the gaps between people and people, people and technology, and people and nature. They are Elliot Woods, a media artist and technical designer, and Mimi Son, a user-centred interaction designer and visual artist.
- Voyagers [openFrameworks] Created by The Light Surgeons for the National Maritime Museum in London, the installation "Voyagers" engages with England's long-standing relationship to the sea, featuring thematic images and film from the museum's collection animated atop a continually flowing ocean of typography across an abstract, wave-shaped structure. Together with a number of other projects, the installation opens to the public tomorrow. We got a chance to take a sneak peek earlier today and get some insight into the making, together with what we enjoy most - the debug info and some fantastic behind-the-scenes images. James George from the New York studio Flightphase collaborated with The Light Surgeons to create the custom applications that animate the content in real time. Created using openFrameworks, the applications use a number of different tools to communicate the narratives. The ocean effect of type sweeping across the installation surface is a 3D wave simulation driven by a vector field. The complete simulation is stitched and mapped across seven projectors covering the 20-metre triangulated surface. The image sets were designed by The Light Surgeons to relate to each of the six themes of the museum. openFrameworks parses the layouts and generates animations that cascade down the wave. At the far end of the gallery is a Puffersphere, a spherical display lit by an internal projector. During the course of each cascade of images the Puffersphere collects thematic keywords that relate to the images and prints them onto the surface of the globe. Likewise, the type waves trigger projected content on the sphere as they "hit" its surface. The audio, created by Jude Greenaway, is mixed dynamically by interfacing openFrameworks to SuperCollider over OSC. James used Dan Shiffman's Most Pixels Ever library for synchronizing the applications. He has also released a number of changes to the library that can be found here (github). 
The team also built a way to synchronize parameters over the network using MPE (github), and through developing content for the Puffersphere created a lightweight library for animating the surface of the sphere, which can be found here. Full credits: Design/Direction: The Light Surgeons; Bespoke Software Design: Flightphase; Sound Design: Jude Greenaway; Additional Programming: Timothy Gfrerer; SuperCollider Programming: Michael McCrea; Exhibition Design: Real Studios. [Images: National Maritime Museum, the image sets, the type wave sweep, debug mode showing the oF app UI, wave simulation, keyframe animator] […]
- Hyundai i40 Reveal [openFrameworks, Flash] Created by Hi-ReS! and Nanika, this is a project for the new Hyundai i40, which was revealed at the Geneva Motor Show on 1 March. Starting on Friday, 25 Feb, users could connect through Facebook (or not) to directly participate in the live light reveal. The web application, built in Flash by Theo Tillberg, Mike Tucker and others, allows users to animate light over the car and change views. This is of course sent directly to the warehouse where the car is located, and video feedback is shown back to the users on the site. In a nutshell, the software written in openFrameworks by Andreas Müller was about finding a way to project a piece of video onto an arbitrary shape of lights. For the earlier project for John Lewis, Andreas hardcoded in 4 rectangles around the model of the house, similar to what was done for Ars Electronica by YesYesNo, but in the new system Andreas represents the surface in 3D space, uses that as a view and then gets the lights' positions within the 2D surface by looking them up in an orthographic projection. Once you have the 2D positions of the lights within the surface, the brightness is calculated based on their 3D position. For fun, Andreas also hooked up OSC control using the TouchOSC iPad app, allowing him to create swarm-like light effects with the setup in real time (see video below). The official site for the launch was http://www.hyundai-i40.eu - also see hi-res.net and nanikawa.com […]
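The lookup Andreas describes - treating the light surface as a view and recovering each light's 2D position via an orthographic projection, then sampling video brightness there - might be sketched like this (the types, nearest-pixel sampling and grayscale frame are assumptions for illustration, not the project's code):

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Orthographically project a light's 3D position onto a view plane spanned
// by unit axes (right, up) anchored at origin. The returned (u, v) are the
// light's 2D coordinates within the surface.
struct SurfaceView {
    Vec3 origin, right, up;   // right/up assumed unit-length and orthogonal
    void project(const Vec3& p, float& u, float& v) const {
        Vec3 d{p.x - origin.x, p.y - origin.y, p.z - origin.z};
        u = dot(d, right);
        v = dot(d, up);
    }
};

// Sample a brightness for each light from a grayscale frame (row-major,
// w x h, values 0..1), using nearest-pixel lookup of the projected coords.
std::vector<float> lightBrightness(const SurfaceView& view,
                                   const std::vector<Vec3>& lights,
                                   const std::vector<float>& frame,
                                   int w, int h) {
    std::vector<float> out;
    for (const Vec3& p : lights) {
        float u, v;
        view.project(p, u, v);
        int px = static_cast<int>(u);
        int py = static_cast<int>(v);
        if (px < 0) px = 0; if (px >= w) px = w - 1;
        if (py < 0) py = 0; if (py >= h) py = h - 1;
        out.push_back(frame[py * w + px]);
    }
    return out;
}
```

The appeal of the approach is that the light layout can be arbitrary: once the projection gives each bulb a pixel address, any video content drives the array.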
- Prismatica – Kit Webster uses a 3D crystal surface as a lens "Prismatica is an extension of the visual and perceptual experimentations in my Enigmatica series," Kit Webster writes about his newest installation. Known for his interesting approach to projection mapping (read more about the ingenious Enigmatica series here), the Australian media artist specialises in wrapping things in light. With Prismatica, however, he turned the technique on its head: instead of mapping animated visuals onto a 3D surface, Webster uses a 3D surface as a lens. A formidable formation of clear, pyramid-shaped crystals is affixed to an LCD screen displaying abstract, geometric animations that are "precisely mapped to the vertices of the crystals, illuminating them individually and in formation". The resulting psychedelic kaleidoscope effect is a dynamic interplay of prismatic refractions through the geometry of the crystals, the reflections of surrounding lights and the shifting perspective of the observer. Very effective, very Kit Webster! A quick Q&A with the artist revealed some curious details – and that Prismatica will only get more interesting: Are the geometric animations illuminating the crystals generated in real time? Right now the piece uses a pre-rendered video file running from a media player in order to illuminate the crystals. I am currently designing a new version that uses real-time graphics. Which tools did you use for creating the animations? The animations are created in Flash, After Effects and Premiere Pro. Real-time self-generative animations are currently being tested in vvvv, Processing and openFrameworks and could swing in any direction at this stage. What kind of crystals did you use? Are they custom-made? The screen lens is made up of K9 crystal pyramids, arranged in formation and affixed to the screen. A new unit is in development that is made up of a larger variety of crystal forms, allowing for more complex and intricate geometric arrangements. 
The animations will adapt to these changes accordingly. How did you tune the dynamics between the animations and the crystal refractions? The piece went through a testing process of approximately two months. During this process various animated sequences were fed through the crystals to assess their suitability. I eventually found that the most interesting patterns to use were ones that mimic the displacement effect of the animations reflecting through crystals, a bit like a brief reflective feedback system. Follow Kit Webster here: kitwebster.com | Vimeo | Twitter [Video below: sketch depicting the upcoming version of Prismatica utilizing a series of individually cut crystal […]
- sound:frame Festival Lightrails [vvvv] The interactive audio-visual installation ‘Lightrails’ is a project Vienna's Strukt created together with unheilbar architektur for the Project Space inside the Kunsthalle Wien. 'Lightrails’ is a light sculpture with the intention to re-define and re-interpret the exhibition room. A simple but effective mapping technique was used to create seamless projections on both sides of the object. Light-beams were triggered by the visitors and ran through the room, following the surface created by the sculpture. Each “reflection” of the light-beam was accompanied by sound. The speed and brightness of the beam were directly influenced by the force the visitors used when triggering the beam by stepping on pedals on the floor. This also influenced the volume of the sound effects. The audio signal was played back on a surround sound system that allowed spatial positioning of the sounds and created a truly immersive experience.' The sound design was courtesy of Digitalofen Audiobakery, which also created the ambient sounds pervading the room. Strukt's favourite real-time multipurpose toolkit, vvvv, was used to create the […]
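The force-to-beam coupling described above amounts to a clamped linear remap of the pedal sensor reading onto beam speed, brightness and sound volume. A minimal sketch, with entirely assumed sensor and output ranges:

```cpp
// Map a pedal force reading (arbitrary sensor units, here assumed 0..100)
// onto beam parameters: harder stomps give faster, brighter beams and
// louder reflections. All ranges are illustrative assumptions.
struct BeamParams { float speed, brightness, volume; };

// Linearly remap x from [inMin, inMax] to [outMin, outMax], clamping
// out-of-range input so a stomped sensor can't overdrive the output.
static float mapClamped(float x, float inMin, float inMax, float outMin, float outMax) {
    float t = (x - inMin) / (inMax - inMin);
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
    return outMin + t * (outMax - outMin);
}

BeamParams beamFromForce(float force) {
    return {
        mapClamped(force, 0.0f, 100.0f, 1.0f, 10.0f),  // beam speed, m/s
        mapClamped(force, 0.0f, 100.0f, 0.2f, 1.0f),   // normalized brightness
        mapClamped(force, 0.0f, 100.0f, 0.1f, 1.0f),   // normalized volume
    };
}
```

The non-zero minimums keep a soft step from producing an invisible, inaudible beam, which is one plausible way to make the interaction legible.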
- CAN at Flashbelt 2010 + Contest [Events] For those that might not already be aware, I'll be speaking at this year's Flashbelt conference in Minneapolis, MN, US, June 13-16, 2010. Flashbelt is an annual conference focused on Adobe's Flash technology. Attendance is limited to 400 people in an attempt to keep it personable. The team brings in about 40 great speakers from Europe and North America to give sessions along 3 tracks: Design, Develop and Engage. Several sessions tend to deal with all three aspects, but some are more technical in nature and some are more theoretical. Flashbelt is really about bringing people together to share ideas and find inspiration to make great work; to explore and innovate. Things also don't just revolve around Flash - you'll hear and learn a lot about information visualization, physical interfaces, Arduino and other tools such as openFrameworks. My session is called (as you'd expect) Apps That Inspire. I will be discussing apps, devices, user experience and digital ecosystems via a selection of projects published on CreativeApplications.Net. From desktop and web to mobile and physical, this talk will be about the role creative applications play in both the physical and online domains: why we should challenge the norm, question our relationship to technology and always try to discover new ways to engage with information. Besides, of course, this talk you wouldn't want to miss :) here are a few other particularly interesting things: - 4 great workshops, including Developing Apps for the iPhone - Flash AR workshops - Hype Framework talk by Branden Hall - Jer Thorp talk about HyperCard - Keith Peters on programming art - Joshua Noble on openFrameworks, Arduino and physical interfaces + so much more!! Note that registrations are moving faster than ever and seats are limited, so registering early ensures you'll get a seat - only 400 seats total, and more than half are already taken. If you are a student there are special discounts. 
Contest Thanks to the Flashbelt organisers, we have a ticket worth $658 US to give away to one of our readers. This includes a ticket for both the conference and a workshop. All you have to do is tweet the message about this post by clicking on the link below. One lucky winner will be chosen this Wednesday at 11pm GMT. Rules and information: 1. The event is being held in Minneapolis, MN, USA on June 13-16, 2010. You will need to arrange your own travel and accommodation (not included). If you win the ticket and can't make it to the event, we would appreciate it if you let us know so we can give the ticket to someone else. 2. The competition is open to everyone and anyone, but you must be over 18 years of age. There will be a total of ONE winner for this competition. 3. The winner will be selected at random. 4. The winner will be contacted via email and asked to provide their full name and postal address. If they wish to pass the ticket on to another person, we will need that person's name and postal address. If the winner does not respond by the following Friday (23rd April) we will pick another winner. Good […]
- The Space Beyond Me [openFrameworks, Arduino, Processing] The Arriflex 16 ST body with UV-light source and motorized zoom lens. Julius von Bismarck and Andreas Schmelas have just open-sourced the code of their collaborative project "The Space Beyond Me". The project includes an "Apparatus for reviving spaces that are captured in celluloid" and was exhibited at Transmediale 2010 (Berlin) and several other festivals (right now it can be seen at the Ghent Film Festival in Belgium). The installation is able to construct a representation from celluloid film by combining a modified 16mm camera with a UV-light projector. The device projects a film whilst moving in exactly the same way in which the camera operator moved the camera while shooting the film. What happens if a projector moves while projecting, in exactly the same way in which the camera that recorded the film once moved? What happens is similar to processes in the brain when we perceive our surroundings: virtual rooms or landscapes are composed from flat visual information, constructing a subjective representation of the world. The projector is placed centrally in a round room, the walls of which are painted with phosphorescent paint. The paint emits an afterglow of the image projected onto it, so that the moving camera-projector keeps adding to the image. After the film has played, all scenes of the film are reproduced in their correct location. The film, which originally recorded a spatial setting, has been translated from a time-based medium back into a space. The software for the installation (available to download here) consists of several parts, including an openFrameworks scene-arrangement application, Arduino source code and a Processing app (responsible for parsing the openFrameworks output into an Arduino-compatible PROGMEM-format array). 
The openFrameworks part includes an application for extracting "camera movement" out of a video and an application for arranging "scenes" onto a virtual stage. Project pages: juliusvonbismarck.com | andreas-schmelas.de. The Space Beyond Me from fenomenologie on Vimeo. »The Space Beyond Me« still, Transmediale 2010. Various drafts by […]
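The Processing step that converts the openFrameworks output into an Arduino-compatible PROGMEM array is essentially code generation: movement samples become a constant array compiled into the microcontroller's flash. A rough sketch of that idea (the array name and int16_t sample type are illustrative, not taken from the released source):

```cpp
#include <cstdint>
#include <sstream>
#include <string>
#include <vector>

// Emit movement samples (e.g. pan/tilt steps extracted from the film) as
// an AVR PROGMEM array declaration ready to paste into an Arduino sketch.
std::string toProgmemArray(const std::string& name,
                           const std::vector<int16_t>& samples) {
    std::ostringstream out;
    out << "const int16_t " << name << "[" << samples.size() << "] PROGMEM = {";
    for (size_t i = 0; i < samples.size(); ++i) {
        if (i) out << ", ";
        out << samples[i];
    }
    out << "};";
    return out.str();
}
```

Storing the data in PROGMEM keeps long movement sequences in flash rather than the Arduino's very limited SRAM, which is presumably why the pipeline targets that format.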
- EYJAFJALLAJÖKULL [vvvv, Events, Environment, Inspiration] Last year, onedotzero approached Joanie Lemercier of AntiVJ to be part of one of their events, a festival they organised at EMPAC, upstate New York, with a selection of screenings, installations and live performances. The installation, now on show at the China Millennium Monument Museum of Digital Arts onedotzero event, has received fantastic feedback and high acclaim. Just as the team published a video describing the project (above), we asked Joanie a few questions about the installation. Joanie: The original plan was to fly to EMPAC for a three-week residency, to develop a new project from scratch, which would involve projection mapping onto objects, a soundtrack by minimal techno producer Sleeparchive, and potentially a live performance on the day of the opening. As the video explains, the original idea and schedule fell apart when the volcano erupted, and the three-week residency turned into just 5 days on site to set up the installation and prepare a live performance... 2D/3D mapping Considering the constraints, it was no longer realistic to plan any 3D mapping, or to project onto a complex structure. The tools and workflow Joanie was using at the time wouldn't have worked with only a few days for production, so he decided to use a different technique: projection mapping, but this time on a flat surface - an idea similar to an experiment he did in Bristol back in December 2008 for a gig with dubstep producer Shackleton (video). The idea here was to project a layer of light onto a painted visual, and use this "virtual layer" to create depth effects and enhance the visual by adding colours, animations and motion to the still graffiti. He enjoyed the challenge of using the production tricks he had learnt from architectural mapping projects and playing with the audience's visual perception, to make this 2D visual appear as if it were an actual three-dimensional structure. 
Technically, Joanie explains, this process is almost like "reverse mapping", as all the production can be done on a computer without worrying about projector alignment - he can just trace / draw over a projection of the still image he designed. This, compared to a complex 3D mapping such as a baroque architecture project (video), is a total relief. Minimal vs Organic.. Joanie has been obsessed by geometry and minimalism for years, and most of his work has been very clinical, cold, and more abstract than figurative or realistic. He wanted to start working with more organic shapes, using curves and less angular patterns. Fascinated by the relationship between maths, geometry and nature, he wanted to explore that idea in his work and incorporate visual elements connecting geometric patterns with ocean waves, terrain, mountain relief, and the motion of wind, snow and rain. He started off with a simple grid at 100x100 resolution in an x-y-z space, and set a rule of only moving these points along the Y axis to generate a series of 3D models as a starting point for the project. He started manually, but to make the patterns more interesting he began using different kinds of noise: Perlin noise (link), simplex noise, turbulence and Voronoi. He could then turn his grid into landscapes, oceans, dunes and mountains just by playing with the noise values: scale, offset, contrast... After countless hours of random experiments and renders with different types of shading, he focused on the perspective and the choice of field of view he would use. He experimented with isometric and bird's-eye views, with and without foreshortening, and ended up using a one-point perspective and a 50mm virtual lens to guide the audience's sight to the centre of the piece. The Volcano.. 
Once the technique was chosen and he had played enough with the models, camera and shading, Joanie was so obsessed by the volcano that had ruined the residency (link) that the narrative and content for the piece seemed obvious: the main visual had to be the volcano. He modelled a mountain-like landscape, inspired by pictures and videos of Eyjafjallajökull, and the animations were to tell its story: the early seismic waves recorded at Eyjafjöll at the end of 2009 gradually increased in intensity until, on 20 March 2010, the first small eruption (rated as a 1 on the Volcanic Explosivity Index) began. Beginning on 14 April 2010, the eruption entered a second phase and created an ash cloud that led to the closure of most of Europe's IFR airspace from 15 until 20 April 2010. Consequently, a very high proportion of flights within, to, and from Europe were cancelled, creating the highest level of air travel disruption since the Second World War (source: Wikipedia). Joanie describes the end of the piece as a bit more abstract and futuristic, with waves of light going through the wireframe soil and the volcano. Unfortunately Sleeparchive couldn't come due to the flight cancellations plus visa problems, so he ended up working with field recordings of the eruption and a beautiful track from Robert Henke (Monolake). In the end, the project is a dual 1920x1080 projection, running at 30fps and produced with Cinema 4D. The piece was then edited with Vegas 10, encoded in a GPU-friendly codec and screened through a custom patch made with vvvv. The Future.. Considering the constraints, the project was developed in only a few days, and it has evolved since then. Joanie describes the next steps: to screen the piece at 60 frames per second, and ultimately at 120fps, to reinforce the realistic aspect of the projection. 
He is slowly moving from classic production to realtime tools, and the idea is to turn the piece into a full realtime patch (with physics and multiple point light shaders), "so it might then be possible to screen a new version of the piece actually being mapped onto a real erupting volcano, when the right projectors becomes available". =] Thanks Joanie. EYJAFJALLAJÖKULL is currently on display at China Millennium Monument Museum of Digital Arts - curated by onedotzero. Project Page | AntiVJ | onedotzerox Previously on CAN: AntiVJ [Profile, openFrameworks, Processing] - "projected […]
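The grid experiments Joanie describes - a 100x100 lattice whose points move only along Y, shaped by noise with scale, offset and contrast controls - could be sketched like this, using a crude hash-based value noise as a stand-in for the real Perlin or simplex noise (which would interpolate smoothly between lattice values):

```cpp
#include <cstdint>
#include <vector>

// Deterministic per-cell "value noise" from an integer hash: a crude,
// non-smooth stand-in for Perlin/simplex noise, but enough to show the
// scale / offset / contrast controls. Returns a value in [0, 1].
static float hashNoise(int x, int z) {
    uint32_t h = static_cast<uint32_t>(x) * 374761393u
               + static_cast<uint32_t>(z) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return ((h ^ (h >> 16)) & 0xFFFFFF) / float(0xFFFFFF);
}

// Build a gridSize x gridSize heightfield, displacing only the Y values:
// scale stretches the noise lattice, offset shifts it, and contrast
// expands or flattens the relief around the 0.5 midline.
std::vector<float> displaceGrid(int gridSize, float scale, float offset, float contrast) {
    std::vector<float> y(gridSize * gridSize);
    for (int zi = 0; zi < gridSize; ++zi)
        for (int xi = 0; xi < gridSize; ++xi) {
            float n = hashNoise(int(xi * scale + offset), int(zi * scale + offset));
            y[zi * gridSize + xi] = (n - 0.5f) * contrast + 0.5f;
        }
    return y;
}
```

Sweeping scale, offset and contrast over time is what turns one static grid into oceans, dunes or mountain relief, which matches the workflow described above.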
- AI Controller [iPad] Created by the Aircord lab team, the same group behind the mobile runner app we wrote about a few months back, AI Controller is an iPad application designed to control a Box2D physics engine projected onto a building. It communicates over OSC with an openFrameworks desktop application projecting the image; the window layout is mapped into the iPad oF application, where you can drag the particles and adjust the colour of the projected image. See the movie below for a demo... (Thanks […]
Posted on: 18/01/2011