Created by Andreas Müller, North is a lamp based on a simple idea: wherever you are in the world, there is always another place out there that is important to you, a place you have a relationship with. The lamp visualises this relationship by giving out more light the more directly it points towards that location. For this lamp the location is the (magnetic) north pole, but the idea is that a future version will allow you to specify the lat/long coordinates of any place in the world, where a loved one lives or the place you were born, to use as its point of reference.
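As a rough sketch of that future-version idea, the target bearing from the lamp's location to a user-supplied lat/long could be computed with the standard great-circle initial-bearing formula. This is purely illustrative and not from the project itself; the function name and signature are hypothetical:

```cpp
#include <cmath>

// Hypothetical helper (not Andreas's code): initial compass bearing in
// degrees from point 1 to point 2, both given as lat/long in degrees.
double bearingToDeg(double lat1, double lon1, double lat2, double lon2) {
    const double d = M_PI / 180.0;                 // degrees -> radians
    double dLon = (lon2 - lon1) * d;
    double y = std::sin(dLon) * std::cos(lat2 * d);
    double x = std::cos(lat1 * d) * std::sin(lat2 * d)
             - std::sin(lat1 * d) * std::cos(lat2 * d) * std::cos(dLon);
    double b = std::atan2(y, x) / d;               // radians -> degrees
    return std::fmod(b + 360.0, 360.0);            // normalise to [0, 360)
}
```

The lamp would then compare this bearing against the compass heading instead of a fixed north reference.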
More details + video below.
Technically it’s based around an Arduino Nano and a CMPS03 compass module.
Andreas first tried dimming a normal 250V light bulb with a Velleman K8064 dimmer circuit, which worked well light-wise but was a bit dangerous. Andreas says he may have been imagining it, but he thought he could feel the air in the room change whenever the circuit was running.
Self-preservation drove him to redesign the circuit around 12V LEDs (ultraleds.co.uk is a great place for those), and he ended up with a circuit you don’t have to be quite as careful around (video below).
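The core of the dimming logic, mapping the angular error between the compass heading and the target bearing to an LED brightness, could look something like this. This is a minimal sketch of the idea, not Andreas's actual code; the function name and the linear falloff are assumptions:

```cpp
#include <cmath>

// Hypothetical sketch of the dimming logic: the smaller the angle between
// the lamp's heading and the target bearing, the brighter the LED.
// Returns a PWM duty cycle in [0, 255] (Arduino analogWrite range).
int brightnessForHeading(double headingDeg, double targetDeg) {
    // Smallest angular difference, wrapping around 360 degrees.
    double diff = std::fabs(std::fmod(headingDeg - targetDeg + 540.0, 360.0) - 180.0);
    // Linear falloff: 0 degrees off -> full brightness, 180 degrees off -> dark.
    return static_cast<int>(std::round((1.0 - diff / 180.0) * 255.0));
}
```

On the Arduino the result would simply be fed to `analogWrite()` on the LED driver pin each time the CMPS03 reports a new heading.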
The body of the lamp was laser-cut from Perspex and glued together.
More wonderful work by Andreas is available @ nanikawa.com
Posted on: 08/08/2011