Created by Mithru Vigneshwara, Aleph of Emotions is an interactive object that lets users view worldwide emotions mined from tweets gathered over 35 days in 2012. Built with openFrameworks, an Arduino, a GPS module and a smartphone (a Samsung device in the video, later an iPod Touch – image above), the camera-like interface lets users point in a particular direction, focus on a place along that direction and click to view a visualisation of the emotions recorded there. The intention is to explore and find patterns in human emotion in relation to space and time.
The project is part of Coded Transformations, an exhibition opening 10 January 2013, 6:30 PM, at the Institute of Contemporary Arts Singapore, also featuring Andreas Schlegel, Dhiya Muhammad, Vladimir Todorovic, Mohamad Riduan, Mithru Vigneshwara and Judith Lee.
See the exhibition’s in-progress images on Andreas’ blog.
- N Building.app [iPhone] N Building is a commercial structure located near Tachikawa station amidst a shopping district. The team at Qosmo, working together with the teradadesign architecture studio, thought of using a QR code (two-dimensional barcode) as the facade of the building. By reading the QR code with your mobile device you can obtain up-to-date shop information, but the fun doesn't end there. Using the iPhone with a specially developed application you can see what is happening inside the building, with people's comments made online appearing in speech bubbles. You can also browse shop information, make reservations and download coupons (see video). The building is detected in real time by its shape, and characters are then superimposed over the live video. Twitter comments are located via GPS tagging. Store information, reservations and other infrastructure are part of the iPhone application, which is not for sale in the iTunes App Store but is available to interested parties on request. The project is a collaboration between teradadesign + Qosmo. For more information see this post by Nao Tokui, CEO of Qosmo Inc. (Thanks […]
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows real-time manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it at the same time, in what is loosely, though not strictly, a form of AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment at the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application: it is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (such as real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates onto the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music-creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool: the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
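The core translation step, from a touch on the phone's preview image to a point on the detected remote surface, can be sketched as a simple normalisation. This is only an illustration in Python, with hypothetical function names; the real system uses marker tracking, and a full implementation would apply a perspective homography rather than the axis-aligned bounding box assumed here:

```python
def touch_to_remote(touch_xy, screen_bbox, remote_res):
    """Map a touch point in camera/preview coordinates onto the
    remote display's own pixel grid.

    touch_xy    -- (x, y) of the touch in the camera image
    screen_bbox -- (left, top, right, bottom) of the detected screen
    remote_res  -- (width, height) of the remote display in pixels

    Simplified: assumes the tracked screen appears axis-aligned in
    the camera frame; a real implementation would use the full
    perspective homography recovered from marker detection.
    """
    x, y = touch_xy
    left, top, right, bottom = screen_bbox
    # Normalise the touch into the 0..1 range within the detected region
    u = (x - left) / (right - left)
    v = (y - top) / (bottom - top)
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None  # touch fell outside the tracked surface
    # Scale up to the remote display's native resolution
    return (u * remote_res[0], v * remote_res[1])
```

With the remote screen detected at (100, 50)–(300, 250) in the preview, a touch at (150, 100) lands a quarter of the way into a 1920x1080 display, i.e. (480, 270).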
- Introspectre and Optocoupler [Objects] In addition to Dromolux, which we wrote about a few months back, Ludwig Zeller has just completed two new pieces shown at this year's RCA Design Interactions end-of-year show. These pieces, part of the "New Needs in an Augmented World" collection, are Introspectre and Optocoupler: products designed for a speculative future information society in which people find it difficult to consume information. Optocoupler The purpose of this device is to relax the user's mind and get away from restless thoughts. Designed as an isolation chamber, the OPTOCOUPLER shows a light stimulation at a frequency that matches the beta waves of the brain; the more intense the external stimulus, the more likely the brain is to adapt to it. In a world of increasing bandwidth and communication we all face the necessity of processing more information, but human evolution proceeds far more slowly than the growth of our information technology. Where the DROMOLUX device helped seniors of the future to maintain or increase their cognitive ability, something that would be painful for them to lose, the OPTOCOUPLER is designed to help relax one's mind, for instance after a day full of information processing. The device uses an array of 3x3 RGB LED matrices and 9 Rainbowduinos. The original firmware was hacked by Ludwig and David Chatting, with initial hints and code from the great Rainbowduino wizards neophob and Scott Christopher. The box also features a headphone socket so additional auditory signals can be used. -- Introspectre Recognising the problem of staying focused on what we do, with Introspectre Ludwig explores our habits of dealing with information in the form of hypertext associations, instant gratification through search engines and permanent access to both. This device also uses neurofeedback, not to calm the user down but to help them focus on a task.
An EEG headset picks up the brainwaves, which are then analysed for the degree of attention the user is paying at any moment. This data is sonified and played back through a speaker that sits next to the user as a desktop device. Ludwig built on earlier work from ITP in New York, which found that the toy headset "MindFlex" by Mattel features an EEG amplifier/processor IC by NeuroSky identical to the one in a developer version they sell, but much cheaper. The headset can be read via serial quite easily, which made it straightforward to get its attention data into Max/MSP. These prototypes were developed only to the point where the concept can be experienced in an interesting way: Ludwig's main focus is to speculate about scenarios of future information societies, so the objects are meant to be visualisations of these ideas, not actual scientific proposals. For more information see […]
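The MindFlex hack mentioned above relies on NeuroSky's ThinkGear serial protocol: each packet begins with two 0xAA sync bytes, followed by a payload length, the payload itself, and a one's-complement checksum, with data code 0x04 carrying the 0–100 attention value. A hedged sketch of a parser in Python (this handles only the common single-byte data codes; multi-byte codes such as raw EEG samples are omitted here):

```python
SYNC = 0xAA

def parse_thinkgear_packet(data: bytes):
    """Parse one ThinkGear packet; return a dict of readings, or
    None if the packet is malformed or the checksum fails."""
    if len(data) < 4 or data[0] != SYNC or data[1] != SYNC:
        return None
    length = data[2]
    payload = data[3:3 + length]
    if len(payload) != length or len(data) < 4 + length:
        return None
    # Checksum is the bitwise inverse of the low byte of the payload sum
    if (~sum(payload)) & 0xFF != data[3 + length]:
        return None
    values = {}
    i = 0
    while i < len(payload):
        code = payload[i]
        if code == 0x02:          # poor-signal quality (0-200)
            values['poor_signal'] = payload[i + 1]
        elif code == 0x04:        # attention (0-100)
            values['attention'] = payload[i + 1]
        elif code == 0x05:        # meditation (0-100)
            values['meditation'] = payload[i + 1]
        # other single-byte codes are skipped; multi-byte codes
        # (>= 0x80, e.g. raw EEG) are not handled in this sketch
        i += 2
    return values
```

Feeding it the bytes AA AA 02 04 32 C9 yields an attention level of 50, the kind of value one could route into Max/MSP.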
- Elevation [Mac, Windows, Processing] Created by Dave Shea, Elevation is a free, open source Mac and Windows application built using Processing that allows you to visualise GPS data in 3D space. You'll need a GPS-equipped phone or device capable of tracking your activity as you run, hike, cycle, skate, ski, snowboard, or whatever other physical activity you choose to map. You'll also need the ability to export that data as XML, in either GPX or KML format. (If you have files in just about any other format, you can probably use GPSBabel to convert them to GPX files and get them working with Elevation.) Due to inconsistent file structures, Elevation may not work with every GPX or KML file; so far it has been tested with files from the iPhone app RunKeeper Pro, Nokia's Sports Tracker, and files converted to GPX using […]
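A GPX track, the format Elevation reads, is plain XML: `trkpt` elements carry `lat`/`lon` attributes and an optional `ele` (elevation) child. A minimal sketch of pulling those points out in Python, purely to illustrate the format (this is not Elevation's actual parser):

```python
import xml.etree.ElementTree as ET

GPX_NS = 'http://www.topografix.com/GPX/1/1'

def parse_gpx(xml_text):
    """Return a list of (lat, lon, elevation) tuples from a GPX 1.1
    document; elevation defaults to 0.0 when the <ele> tag is absent."""
    root = ET.fromstring(xml_text)
    points = []
    for trkpt in root.iter('{%s}trkpt' % GPX_NS):
        ele = trkpt.find('{%s}ele' % GPX_NS)
        points.append((float(trkpt.get('lat')),
                       float(trkpt.get('lon')),
                       float(ele.text) if ele is not None else 0.0))
    return points
```

A two-point track parses to `[(51.5, -0.1, 12.0), (51.6, -0.2, 0.0)]`, giving the latitude, longitude and height triples an application like Elevation would plot in 3D.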
- Paint With Your Feet [openFrameworks] For the launch of the Nike Free Run+ 2 City Pack series, YesYesNo was invited to develop software that would allow runners to create dynamic paintings with their feet using their Nike+ GPS run data. During a two-day workshop at Nike headquarters, the team invited participants to record their runs and then, using custom software, imported the metrics from each run to create visuals based on the speed, consistency and unique style of each person's run. Participants were able to play with the mapping and adjust the composition of their run, which was then output as a high-resolution print for them to take home. YesYesNo also worked with the Innovation Lab at Nike to laser-etch the runner's name, the distance they ran and their run path onto a custom-fabricated shoe box containing a pair of the 'City Pack' shoes from their city of origin. Produced in collaboration with DualForces. The YesYesNo team included Zach Lieberman, Emily Gobeille and Theo Watson. Project […]
- Suki Jarashi [Profile] Suki (jarashi.tv) is a student at IAMAS Institute of Arts and Sciences (Japan) in Studio1 / Interactive Media. Since his website is "under construction", I have selected a few projects below that are particularly exciting. For more projects, until the portfolio is up, check out his Vimeo + his blog. Day of Cool Man Life ..is a video about a device Suki designed: a phone that has its own physical behaviour. For example, when an "important email" is received, elements of the device rotate in a specific way; when the battery is running short, the phone shows a "dying"-like behaviour. It is all very hard to explain ..you must watch the video below. Eye am You Eye am You is an eyeglasses-shaped toy with a camera. Once the camera detects a face via its facial-recognition program, the display on the glasses shows the captured eyes of that face. When the user turns his head toward someone else, the user's eyes become the eyes of that person. The concept of this toy is "We see reflections of ourselves in other people." It is said that a person meets various people and grows through projecting oneself onto others. Though this behaviour is unconscious, the toy visualises it in a symbolic way, making for a new, fun experience. Icha-Icha! Thumb Wrestling Icha-Icha! Finger Wrestling is a concept toy consisting of a pair of finger puppets. While playing finger wrestling, the puppets play sound effects and give vibration feedback, which makes the battle more exciting. Using conductive fabric, contact between the two puppets can be detected to trigger feedback such as sound and vibration, and an LED indicator set in each puppet's face changes its brightness according to the excitement of the play. Icha-Icha! Finger Wrestling is originally designed for couples, focused on giving them precious time together rather than a competition.
Through playing with it, I believe it encourages couples to communicate well with each other […]
- ‘Point Cloud’ – Arduino structure by James Leng breathes weather data Created by James Leng, Point Cloud is an attempt to re-imagine our daily interaction with weather data. Even with modern scientific and technological developments, and even when we can deploy sophisticated monitoring devices to document and observe weather, our analysis and understanding of meteorology remains largely approximate. Weather continues to surprise us and elude our best attempts to predict, control, and harness its various elements. Point Cloud builds on this premise, exploring new ways to interpret and understand weather data. Weather has always had a unique place in our lives because it has a multiplicity that encompasses both the concrete and the indeterminate. It is the intangible context within which we build our lives and our cities, but it is also the physical element against which we create protective shelter. Most of the time it is a pervasive network that we can see but are rarely aware of; yet it can manifest as a spectacle or disaster, come forward and activate our senses, and make us forget our rationality in delight or fear. Point Cloud is a sculptural form defined by a thin wire mesh, driven asynchronously by 8 individual servos controlled via an Arduino. As the whiteness of the hanging structure begins to disappear into the background, the viewer is treated to a constantly morphing swarm of black points dancing through midair. In the current prototype, the speed, smoothness, and direction of rotation are modulated to interpret a live feed of weather data. Instead of displaying static values of temperature, humidity, or precipitation, Point Cloud performs the data, dynamically shifting between stability and turbulence, expansion and contraction. flickr […]
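As a rough illustration of "performing" rather than displaying weather data, one could map a wind reading and a turbulence factor onto eight phase-offset servo speeds, so the servos never move in lockstep. The mapping below is hypothetical, not James Leng's actual code; the numeric ranges are arbitrary choices for the sketch:

```python
import math

def servo_speeds(wind_kmh, turbulence, n_servos=8, t=0.0):
    """Map a wind-speed reading plus a 0-1 'turbulence' factor to
    per-servo rotation speeds in degrees/second.

    Each servo gets a different phase offset so the eight servos
    drift in and out of sync, shifting the structure between
    stability (low turbulence) and chaotic motion (high turbulence).
    """
    # Clamp wind to 0-60 km/h and scale to a 0-180 deg/s base speed
    base = min(max(wind_kmh, 0.0), 60.0) / 60.0 * 180.0
    speeds = []
    for i in range(n_servos):
        phase = 2 * math.pi * i / n_servos
        # Turbulence modulates a time-varying wobble around the base speed
        jitter = turbulence * 60.0 * math.sin(t + phase)
        speeds.append(base + jitter)
    return speeds
```

On the hardware side, these values would be streamed over serial to the Arduino driving the servos; calling the function repeatedly with an advancing `t` animates the swarm.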
Posted on: 04/01/2013
- Senior Digital Designer at CLEVER°FRANKE
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- Climate Service Data Visualiser at FutureEverything
- Web Developer at &Associates
- Creative Technologist at Rewind FX
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific
- Data Engineer at Seed Scientific
- Design Technologist at Seed Scientific
- Creative Technologist, The ZOO at Google