‘Extending the Touchscreen’ is an ITP thesis project by Michael Knuepfel comprising seven devices designed to improve upon or supplement the functionality of capacitive touchscreens: a game controller, digital signet rings, a mechanical stylus, a sound stylus, a pulse generator, dial-a-rama and future devices.
Touchscreens like those found on smartphones and tablets have enabled a new generation of versatile user interfaces. My thesis project, Extending the Touchscreen, aims to further this versatility by using conductive materials to construct a series of physical, mechanical, and electrical devices that touch, interact and communicate directly through the touchscreen interface. My goal in constructing these external devices is to make touchscreen interactions more tactile, physical and potentially more expressive and fun.
Instead of utilising a USB connection to extend the functionality of the iPad and iPhone, the devices rely on software, simple light sensors or magnets to let objects communicate with each other. Some devices are simply used as tools, to unlock the iPhone for example; others allow robot-like objects to behave in certain ways by ‘reading’ information from the iPad’s screen. See the video for more, and follow Michael’s blog for updates.
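The post doesn’t document how the robot-like objects actually read the screen, but the usual trick with a light sensor is to flash a region of the display between black and white and recover bits from the brightness samples. Purely as an illustrative sketch (the frame counts and function names are my assumptions, not Michael’s):

```python
def encode_bits(data, bit_frames=3):
    """Expand each bit of `data` into `bit_frames` screen frames (1 = white, 0 = black)."""
    frames = []
    for byte in data:
        for i in range(7, -1, -1):          # most significant bit first
            bit = (byte >> i) & 1
            frames.extend([bit] * bit_frames)
    return frames

def decode_bits(samples, bit_frames=3):
    """Majority-vote each group of `bit_frames` sensor samples back into bytes."""
    bits = [1 if sum(samples[i:i + bit_frames]) > bit_frames / 2 else 0
            for i in range(0, len(samples), bit_frames)]
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):    # reassemble complete bytes only
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)
```

Repeating each bit over several frames is what makes the decode tolerant of a noisy or dropped sensor sample.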
- Physical Touchscreen Knobs [iPhone, iPad] This is an experiment by DS-Labs (a group of designers at Teague), the same collective that brought you the DIY Soft iPhone and iPad Stylus. What I find especially interesting is that, although quite limited, this little experiment provides very good insight into what is yet to come. Whilst touchscreen tables such as Microsoft Surface incorporate cameras and so are able to recognise objects placed on the surface (reacTIVision), the iPhone and iPad cannot see objects resting on the screen, so they rely purely on touch input. This experiment demos a physical knob used instead of a finger, which suggests interesting possibilities for how applications may behave or be designed on these platforms. Although this is a simple rotary input interface, demoed here to rotate a map, combined with some smart software it could get very interesting. I think a lot of people fail to acknowledge that the future is NOT touchscreen devices alone but devices that combine both physical and touch input, so ‘Physical Touchscreen Knobs’ gets 10 points from me. See video below. Alternatively, with a small knob, it’s natural to grab it such that your fingers actually touch the screen, making it nothing more than a physical prop that provides some tactile feedback. It works great and doesn’t require much in terms of materials or time; just grab a small object and give it a try! […]
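Teague hasn’t published the recognition code, but a knob with two conductive feet registers as two simultaneous touch points, and the rotation is just the angle of the line between them. A hypothetical sketch of that mapping:

```python
import math

def knob_angle(p1, p2):
    """Angle (degrees) of the line between the knob's two contact points."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def rotation_delta(prev_deg, curr_deg):
    """Smallest signed change between two angles, handling the wrap at +/-180."""
    return (curr_deg - prev_deg + 180) % 360 - 180
```

Feeding rotation_delta into, say, a map-rotation handler each frame gives continuous knob behaviour without a jump when atan2 wraps around.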
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows realtime manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile screen purely as an input, the user views the remote content through the phone’s camera and manipulates it simultaneously – a form, though not a typical one, of AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment at the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I’ve always felt that the technology was somewhat limited in its application. It is typically implemented as output, in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation in large installations could feel more natural. Other applications could include a music creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool – the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks, and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
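Teehan+Lax haven’t released their mapping code, but conceptually the marker tracking yields a homography between the phone’s camera frame and the remote screen, and each touch point is pushed through it. A minimal, stdlib-only sketch (in practice the 3x3 matrix would come from something like OpenCV’s findHomography, re-estimated every frame):

```python
def apply_homography(H, pt):
    """Map a touch point (x, y) through a 3x3 homography H (row-major nested lists)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]        # projective scale factor
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Because the homography tracks the marker continuously, touches keep landing on the right spot on the distant surface even as the phone moves.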
- The Well–Sequenced Synthesizer by Luisa Pereira The Well–Sequenced Synthesizer is a series of sequencers created by Luisa Pereira at ITP -- physical interfaces to play with musical […]
- knot(e) by Matthieu Minguet – Sound and visual interface for iPhone (ECAL) Created by Matthieu Minguet, knot(e) is a custom-built device that offers a new way of using the objects we connect to our mobile devices. The prototypes, which take the form of ropes, reinterpret the knot as input data. As an extension of a conventional listening device, it allows the user to generate visual and sound compositions by the simple action of "bending the cable". Knotting the rope amplifies the signal; conversely, the absence of knots and curves returns the application to its original state. Each prototype has its own forms, colours, atmospheres and sound effects. They are polyester sheet halyards from which the core has been replaced with an empty silicone tube. They are attached to two wooden cylinders, with custom plugs and rubber rings at the extremities. Each plug is prepared to receive specific electronic components and is easily detachable. They can be connected to one another for more complex compositions or used one at a time. A series of five flex sensors attached along the tube is connected to a femtoduino board. The board communicates with the iPhone through a softModem, hijacking the audio jack. The audio circuit is split into two ports: the first goes directly to a 3.5mm female audio jack, and the second (microphone) handles iPhone communication. Everything is connected to a 240mAh polymer battery, a switch and a power supply. The software is managed by openFrameworks, iOS Objective-C and Pure Data: openFrameworks for the application logic, Objective-C for the GUI and the softModem audio analysis, and Pure Data for playing audio loops, generating effects and returning audio flux information. This system might also work with the iPod library class, which Matthieu says was not an interesting option to explore.
The main technical challenges for Matthieu were managing the size of the components, and their wiring, within a tiny cylindrical volume, and getting the three languages to communicate. The physical interaction of bending the rope could also be mapped to other applications for altogether different purposes. Software: openFrameworks / Objective-C / Pure Data. Hardware: Arduino / sensors / sailing rope / iPhone. cargocollective.com/matthieuminguet Tutors: Alain Bellet, Gael Hugo, Christophe Guignard. ECAL / University of Art and Design, Lausanne, Switzerland. Bachelor Media & Interaction […]
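Matthieu hasn’t published the sensor-mapping code; as a rough illustration of how five flex readings might become a single audio parameter, here is a sketch in which the 10-bit range and the flat/bent thresholds are my assumptions, not values from the project:

```python
def bend_amounts(readings, flat=200, bent=800):
    """Normalise raw 10-bit flex-sensor readings to 0..1 bend values."""
    span = bent - flat
    return [min(max((r - flat) / span, 0.0), 1.0) for r in readings]

def knot_gain(readings):
    """Average bend across the rope as a rough 'how knotted' gain signal."""
    bends = bend_amounts(readings)
    return sum(bends) / len(bends)
```

A value like this could travel over the softModem link and be forwarded to Pure Data to scale the amplification as knots are tied and untied.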
- Longhand Publishers – Design workstations for collaborative mini publications In the former building of the newspaper BN De Stem, an installation created by Tim Knapen & indianen allows visitors to collaboratively create mini […]
- Tunetrace – iOS app by Ed Burton converts drawings to music Created by Ed Burton, formerly of SodaPlay, and now at Queen Mary, University of London, Tunetrace transforms photographs of drawings into […]
- What the iPad Means to Developers [News, iPhone] Today Apple announced its long-rumored new product, the iPad. Hype had been reaching epic proportions leading up to the launch, though I would say that the device meets expectations without really offering anything truly groundbreaking in terms of technology. The most surprising part of the announcement was probably the price of the device – $499. This is a shock, considering Apple's usual 'luxury' pricing on computers. But, while some people may be disappointed that it's 'basically a big iPhone', I don't think they realize the potential for a whole new breed of multitouch applications, and a slew of new usage scenarios. The simple addition of a larger screen (and a faster processor) allows for much deeper applications that just weren't possible on the iPhone. Like the iPhone, the iPad is a device you might use sitting on the couch, at a coffee shop, or otherwise looking for some quick entertainment. The fact that much of the presentation featured Steve sitting in a comfy chair is pretty telling about how they see people using it. I believe that the biggest app areas for the iPad will be news reading and games (two areas already huge on the iPhone). Creative applications also stand to gain a great deal on the iPad. The launch presentation featured the Brushes app – Apple chose this app out of hundreds of thousands because it's easy to see how much better it is with a larger screen. There's less need to zoom in and out, and you can work on a larger canvas. With a few additional brushes this app could become a very serious tool for artists. Having direct, hands-on contact with the screen makes digital painting much more attractive than trying to paint with a mouse, or even a stylus. Music apps, which typically are pretty processor intensive and have a lot of on-screen controls, will greatly benefit from the extra real estate.
No longer will developers have to sacrifice features simply because there isn't any screen space left. It should be possible to make a basic studio app with a range of instruments, rather than just one-trick ponies. Casual games should be very successful on this platform. One can imagine a scenario where the iPad acts like a board game that players pass to each other to take their turns, or a multitouch game where both people interact with the screen at the same time. One thing game developers need to start doing more is making games that are tap and gesture based, rather than poor imitations of on-screen game controllers. It just doesn't feel right. It also remains to be seen whether people can hold this device (which is supposed to feel fairly heavy and is a lot larger than an iPhone) and interact with the on-screen controls at the same time. Coming up with new types of games will take some creativity, but it's better to go with the flow than to swim against the tide – make your application fit the physical abilities of the platform you are developing for! You'll be more successful if you do. Location-based services will be pretty useless, since I doubt most people will go for the 3G option. There's no camera either, so photo taking, video conferencing and augmented reality are out. The New York Times developed an application to serve their content – will other magazines follow suit? Someone has to figure out how to take these companies into the digital age. In terms of new features offered by the SDK, I can't really go into detail since it's covered by an NDA. However, I will say that Apple has addressed one major gripe of creative app developers: there was no built-in way to exchange files with your desktop computer. This shortcoming should now be solved (and it's used by Apple's own iWork apps). The rest of the new additions mostly focus on new GUI elements which take advantage of the extra screen space.
Photos courtesy of Gizmodo. Video by iLounge. See also Apple iPad: Limited Options for Video Output, Visualists? on […]
- KAIST Mobile Phone Orchestra [iPhone, Processing, Sound] The KAIST Mobile Phone Orchestra (KAMPO) aims to explore the potential of mobile media for music and media art. In addition to suggesting new and innovative mobile performance paradigms through concerts, KAMPO conducts active research and education in music and mobile media, as well as software development. The performances involve five participants equipped with iPhones, each operating a different component of the iPhone app and playing a different instrument. Besides just triggering instrument sets in Ableton Live, the main display application also closes the loop, as in a real orchestra, sending conducting messages back to performers and their devices. The iPhone app used in the performance is made using the Apple iOS framework together with the MoMu Toolkit from Stanford's MoPho (Mobile Phone Orchestra) for some functionality, especially OSC. The app is available for purchase on the AppStore and includes five separate interfaces (button, drawing, mic, accelerometer, compass) and one settings interface. The main display app uses Processing and receives performers' OSC messages to visualize the data. This main application sends conducting messages to the performers' iPhones via OSC, such as a 'Start' message, buttons to press, lines to draw, directions to tilt, levels to blow, and compass directions. Sounds are generated from the main computer running the main application, using Ableton Live via MIDI messages. KAMPO was the thesis (PDF) project of Sihwa Park as well as an AIM (Audio & Interactive Multimedia) Lab project. The team is currently preparing several performances using not only this app but other applications as well. KAMPO Homepage. Director: Woon Seung Yeo. Co-directors: Sihwa Park, SongHee Jung. Performers: AIM Lab […]
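The app's OSC handling comes from the MoMu toolkit; to make concrete what a conducting message looks like on the wire, here is a minimal stdlib-only OSC message encoder (the '/start' address below is invented for illustration, not KAMPO's actual address space):

```python
import struct

def _osc_string(s):
    """OSC string: ASCII bytes + null terminator, padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address, *args):
    """Encode an OSC message with int, float, and string arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"; payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"; payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, str):
            tags += "s"; payload += _osc_string(a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _osc_string(address) + _osc_string(tags) + payload
```

The resulting bytes would go out in a UDP datagram to the Processing display app's listening port.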
Posted on: 11/05/2011