Created by Shanshan Zhou, Adam Ben-Dror and Joss Doggett, Animatronic Lamp is an exploration into the expressive and behavioural potential of robotic computing. Using Processing, Arduino, and OpenCV, the Lamp is given the ability to be aware of its environment and to express a dynamic range of behaviour.
As it negotiates its world, we, the human audience, can see that Lamp shares many traits possessed by animals, generating a range of emotional sympathies. In the end we may ask: is Lamp only a lamp, a useful machine? Perhaps we should put the book aside and meet a new friend.
The components include a hacked webcam, a microphone, a mechanical iris, two servos and a halogen globe embedded in a tiny cavity at the back of the lamp shade. Most of the components were laser-cut, and the design was of course inspired by the famous Anglepoise lamp by George Carwardine and the Pixar lamp. We especially love the reference to “The Most Useless Machine EVER”, which gives the lamp that extra bit of personality.
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows real-time manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it simultaneously, a form of augmented reality in a broad but not strict sense.

I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment at the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface.

Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music-creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI: simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
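The coordinate translation described above can be sketched in a few lines. This is an illustrative stand-in, not the project's actual openFrameworks/OpenCV code: it assumes marker tracking has already produced a 3x3 planar homography `H` mapping phone-screen pixels onto the remote display, and all names here are hypothetical.

```python
def apply_homography(H, x, y):
    """Map screen coordinates (x, y) through the 3x3 homography H."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w  # perspective divide

# Toy example: a pure 2x scale from phone pixels to billboard pixels.
H = [[2.0, 0.0, 0.0],
     [0.0, 2.0, 0.0],
     [0.0, 0.0, 1.0]]

remote_x, remote_y = apply_homography(H, 160, 240)
print(remote_x, remote_y)  # a touch at (160, 240) lands at (320.0, 480.0)
```

In the real system the homography would come from marker detection on each video frame, so the mapping stays correct as the phone moves relative to the billboard.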
- Durr – Shivering bracelet that investigates our perception of time Durr is designed to create a haptic rhythm to make us notice the changing tempo of time and become more aware of both the actions we take and the time we spend on […]
- Vincent & Emily – Two robots in a relationship struggle and emotional conflict Vincent & Emily are two self-willed robots who are in conflict with each other and their surroundings, designed to explore the solitude of a partner relationship and their […]
- Make Longer Cables – Short film by students at HfG Schwäbisch Gmünd Make Longer Cables is a short film including a custom made five-axis robotic arm that is trying to escape monotony by committing […]
- Patch of Sky – Lamps that share, in real-time, the sky above us Created at FABRICA, Patch of Sky is a set of three Internet connected ambient lamps that share, in real-time, the sky above […]
- Longhand Publishers – Design workstations for collaborative mini publications In the former building of the Newspaper BN De Stem, the installation created by Tim Knapen & indianen, allows visitors to collaboratively create mini […]
- Solar Sinter [Objects, Arduino] Amongst the wonderful collection of work currently on show at the Royal College of Art, in the corner on the first floor sits an installation/object by Markus Kayser called Solar Sinter. An MA Design Products student project, Solar Sinter is probably one of the most inspiring projects this year, aiming to raise questions about the future of manufacturing and to trigger dreams of fully utilising the production potential of the world's most efficient energy resource: the sun.

In a world increasingly concerned with questions of energy production and raw-material shortages, this project explores the potential of desert manufacturing, where energy and material occur in abundance. In this experiment, sunlight and sand are used as raw energy and material to produce glass objects through a 3D-printing process that combines natural energy and material with high-tech production technology.

In August 2010 Markus Kayser took his first solar machine, the Sun-Cutter (see video below), to the Egyptian desert in a suitcase. This was a solar-powered, semi-automated, low-tech laser cutter that used the power of the sun to drive it, directly harnessing its rays through a glass ball lens to ‘laser’-cut 2D components using a cam-guided system.

In the deserts of the world two elements dominate: sun and sand. The sun offers energy, and sand offers an unlimited supply of silica in the form of quartz. When silica sand is heated to melting point and then cooled, it solidifies as glass. This process of converting a powdery substance into a solid form via heating is known as sintering, and it has in recent years become a central process in design prototyping, known as 3D printing or SLS (selective laser sintering).
By using the sun’s rays instead of a laser and sand instead of the resins used in modern 3D printers, Markus had the basis of an entirely new solar-powered machine and production process for making glass objects, one that taps into the abundant supplies of sun and sand found in the deserts of the world. The Solar Sinter was completed in mid-May, and later that month Markus took the experimental machine to the Sahara desert near Siwa, Egypt, for a two-week testing period. The machine and the results shown here represent the initial significant steps towards what Markus envisages as a new solar-powered production tool of great potential.

The Solar Sinter uses ReplicatorG software, an open-source 3D-printing program. For more information, see replicat.org.

The project is currently on show at the Royal College of Art graduate exhibition, and I agree it is "a 'must-see' event for anyone interested in twenty-first century art and design". 24 June to 3 July 2011. Royal College of Art, Kensington Gore, London SW7 2EU. Project Page (Thanks to Steffen for pointing it out) Related: Known Unknowns [Processing, Objects] by @comkee + @ranzen at ... Dromolux [Processing, Objects] - Increasing cognitive […]
- lumiBots [Arduino, Objects] [Photo: S.T. Heizmann] What looks like a time-lapse recording of bioluminescent critters roaming the deep sea floor is in fact a swarm of nine autonomous UV-light-emitting robots inhabiting a 1 x 2 metre phosphorescent surface. Created by Mey Lean Kronemann, a Berlin-based media artist with an interest in robotics, these lumiBots (2010-2011) tirelessly trace the fading trails of their peers: an endless pursuit that, much like a computational drawing machine, generates glowing patterns of visual complexity out of a simple system.

Each lumiBot, designed to be as inexpensive and basic as possible, is equipped with an Arduino microcontroller, two light sensors, two click switches for collision detection and a UV LED that activates the glow-in-the-dark sheet. Its movement, neither pre-programmed nor predictable, is based on two simple rules: follow the light (the brighter the better) and turn after a collision. This efficient little set-up can trigger interesting results and surprisingly emotional reactions.

Exhibited at a number of international festivals, the lumiBots delighted audiences by not only following existing paths, but refining them, taking short-cuts or wandering off exploring. "People connect with the lumiBots right away," says Mey in a Skype chat. "Their movement suggests life, life suggests emotions." Easily confused by the light of an opening door, a bright iPhone screen or a camera flash, lumiBots will stir as if alarmed. "Maybe it's their helplessness that makes them so likeable. People find them cute, talk to them and even make out individuals. One might appear to be thinking, another one comes off stubborn, two others seem to feel attached to one another." And really, every now and then two lumiBots engage in a spinning dance, or inseparably continue their journey together after a rough collision.
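The two rules above can be sketched as a tiny steering function. This is a toy Python stand-in for illustration, not the actual Arduino firmware; the sensor readings, turn angles and function names are all assumptions.

```python
import random

def step(left_sensor, right_sensor, collided, heading):
    """Return a new heading (degrees) from two light readings and a bump flag."""
    if collided:
        # Rule 2: turn after a collision (direction chosen at random here).
        return (heading + random.choice([90, -90])) % 360
    # Rule 1: follow the light -- veer toward the brighter sensor.
    if left_sensor > right_sensor:
        return (heading - 10) % 360
    if right_sensor > left_sensor:
        return (heading + 10) % 360
    return heading  # equal brightness: keep going straight

heading = 0
heading = step(left_sensor=0.8, right_sensor=0.3, collided=False, heading=heading)
print(heading)  # veers toward the brighter trail: 350
```

Looping this over a grid of decaying brightness values is enough to reproduce the trail-following behaviour: each bot brightens the cells it crosses, and the "follow the light" rule pulls later bots onto those fading paths.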
Mey's fascination with emergence, swarms and artificial life forms was already evident in her 2006 interactive floor projection schüchterne lichter (timid lights), and it continues to spawn. Her newest species: Klackerlaken (clanking bugs), a swarm of buzzing and glowing insect-like vibrobots made of a cellphone motor, an LED and a battery, all taped to a bottle cap. Developed for a maker workshop for kids at Lab30's Kunstlabor event in Augsburg (October 2011), Mey's Klackerlaken will also infest Berlin's c-base as part of the Transmediale satellite Dorkbot event on January 30th. Go catch some! See more of Mey's work on her website and follow her on Twitter @lumibots. See also What is at stake in animate design? [Theory] and how to make […]
Posted on: 28/11/2012
- Senior Digital Designer at CLEVER°FRANKE
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- Climate Service Data Visualiser at FutureEverything
- Web Developer at &Associates
- Creative Technologist at Rewind FX
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific
- Data Engineer at Seed Scientific
- Design Technologist at Seed Scientific
- Creative Technologist, The ZOO at Google