Created by Karl D.D. Willis, Beautiful Modeler is an iPad and desktop openFrameworks application for gestural sculpting that uses the iPad as a multi-touch controller. Each finger controls a single touch point in the model, with multiple layers working to build up 3D volume. Because the controller is connected over the wireless network, it can be moved freely to change the viewing angle of the model using the iPad's accelerometer.
The model itself is presented on the main display rather than on the controller; this prevents occlusion of the model when sculpting with the whole hand. The controller screen does not need to be viewed while sculpting, meaning the controller can be rotated or flipped to sculpt from a range of angles. Currently the model is constructed using metaballs (thanks to Golan's code), but this is just one approach for transforming gestural input into geometry.
Because both Beautiful Modeler and the Beautiful Controller were created using openFrameworks, the finished mesh can be exported as an STL file (thanks to ofxSTL), meaning the sculpted form can be fabricated immediately. In the video above, the positive mesh has been post-processed to create a negative form for fabrication with a plaster-based 3D printer.
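The metaball approach mentioned above can be sketched in a few lines. This is not the project's code (Beautiful Modeler is C++/openFrameworks); it is a minimal Python illustration of the common inverse-square metaball field, where each touch point contributes a blob and the surface is extracted wherever the field crosses a threshold:

```python
def metaball_field(p, balls):
    """Scalar field at point p for a set of metaballs.

    Each ball is ((cx, cy, cz), radius); the field is the sum of
    r^2 / distance^2 contributions, so it peaks near each centre.
    """
    total = 0.0
    for (cx, cy, cz), r in balls:
        d2 = (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
        total += r * r / max(d2, 1e-9)  # clamp to avoid division by zero
    return total

def inside(p, balls, threshold=1.0):
    """A point is inside the blobby surface where the field exceeds the threshold."""
    return metaball_field(p, balls) >= threshold
```

A mesher (e.g. marching cubes) would sample `inside` over a grid to produce the triangle mesh that is then exported as STL.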
- Trace Modeler [openFrameworks] Created by Karl D.D. Willis, Trace Modeler is an application that uses real-time video to create three-dimensional geometry. The silhouette of a foreground object in a video frame is subtracted from the background and used as a two-dimensional slice. At user-defined intervals new slices are captured and displaced along the depth axis. The result is a three-dimensional model defined by silhouette slices over time. Trace Modeler was built using openFrameworks and the OpenCV library to recognize contours in the video image. Source code is available for download here. Project Page (re-discovered via Cedric Kiefer) See also Beautiful Modeler [iPad, […]
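The slice-capture idea above is simple to sketch: each captured silhouette contour becomes a 2D slice displaced along the depth axis by its capture order. A minimal Python illustration (not the actual oF/OpenCV code; `contours` is assumed to be a list of contours, each a list of (x, y) points):

```python
def stack_slices(contours, interval):
    """Place contour i at depth z = i * interval, building a slice stack.

    Returns a list of slices, each a list of (x, y, z) points, which a
    mesher could then skin into a surface.
    """
    slices = []
    for i, contour in enumerate(contours):
        z = i * interval
        slices.append([(x, y, z) for (x, y) in contour])
    return slices
```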
- Björk – Biophilia – Virus [iPhone, iPad, Sound] This week saw the release of 'Virus', the new in-app purchase from Björk's forthcoming 'Biophilia' app-album created in collaboration with Scott Snibbe and M/M (Paris). As expected, the new Virus release does not disappoint. We are handed a mesmerising viral system that draws you into a beautiful interactive musical experience. As always we wanted to know more, so we got in touch with Scott and got some wonderful insight into the development of the app, including early sketches, code/libraries, inspiration images and sketches by Björk and Scott. Read on for details. Virus 'Virus' was engineered from September 2010 through July 2011. The overall Biophilia project, including Virus, was engineered in Cocos2D for ease of transitions between song app experiences. Virus itself is a hybrid of several graphics and simulation models, and was programmed by Scott Snibbe and software engineer Graham McDermott. Scott built the first prototype (up to the images you see below from February). Then Graham worked for several months refining it. At the end Scott added a few tweaks, including the DNA strand simulation, and refined some elements of the physics, interactivity, and textures. 1. The viruses are pressed together using an offscreen "trash compactor" that squeezes in from four sides. 2. Prototype of hand-drawn ink look for Virus. 3. Rough early textures in a textbook style for Virus. 4. Virus textured with Drew Berry prototype textures, on its way to the final look. The core physics engine for cell movement is based on the unrestricted (but undocumented!) library JellyPhysics by "Walaber" (Tim FitzRandolph). The team modified this library and fixed various bugs to adapt it to the application. The cells are pressed together using an off-screen "trash compactor" comprised of four walls that push in from the sides to squeeze all the cells together.
1. Storyboard and concept sketches for Virus, clockwise from upper left: packed cells, singing nuclei, DNA attack the nucleus, DNA strands entering cell walls. 2. A page from Snibbe's notebook with calculations for cell physics. Physics for the nuclei is hand-done, and physics for the simulated DNA strands is accomplished with a custom spring-and-mass physics library Scott has worked on for about twenty years. "Physics engines are a bit like poetry engines in my opinion – to really get the precise behavior you want, you need to implement from scratch, or make significant changes. There are an infinite number of ways to perform simulations, even ones as simple as spring-and-mass." The textures for the cells are layers of custom textures created by Nathan Heigert, a designer in Scott's studio. They are layered together and animated to create a richer, more life-like appearance, and there are specific textures for different scales. Scott points out that because Cocos2D is limited to OpenGL 1.1, the team had to use old OpenGL tricks for the blending modes rather than custom shaders. Rough sketch by Björk of the Virus score, used to explain the song structure during early meetings. Virus graphics and animations were created using Cocos2D sprites, animations, and texture sheets, and produced using Photoshop and After Effects. The audio for Virus and the other apps is created using the FMOD library, a robust audio library for gaming that can support hundreds of simultaneous mixed tracks, precise synchronization, and real-time DSP effects. 1. Pro Tools screenshot of vocal and hang tracks used for Virus' music logic to stretch or compress the duration of the song, and mark transitions during the infection and attack. 2. Page 22 of the traditional musical score for Virus, used for planning and synchronization. Inspiration Images 1. David Goodsell virus illustration – inspirational illustration from talented bio-illustrator David Goodsell. Watercolor on paper.
2. 3D virus model from Drew Berry, creative consultant to the project. 3. Images from video by Drew Berry of cells being infected. 4. Microscopic photograph of stem cells. Thanks to Scott for providing all these details. If you haven't already, make sure you download the free Biophilia app from the AppStore (link below), including both the 'Virus' in-app purchase described here and the Crystalline release we mentioned a few weeks back. Platform: iPhone/iPad (Universal) Version: 1.0 Cost: Free + $1.99 per in-app purchase Developer: Second Wind Ltd Screenshots: Viruses massing for attack of the mother cell. Surrounding cells' nuclei sing to the chorus as viruses mass menacingly on the mother cell. DNA strands gracefully move in for the kill. Viruses and DNA coexist happily in instrument mode, producing gameleste and hang […]
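The spring-and-mass physics Scott describes for the DNA strands can be sketched as one explicit integration step. This is an illustrative Python sketch of the general technique, not code from Snibbe's library; the damping constant and explicit Euler integration are assumptions for the example:

```python
import math

def spring_force(pa, pb, rest_length, k):
    """Hooke's-law force on particle a from a spring connecting it to b."""
    dx, dy = pb[0] - pa[0], pb[1] - pa[1]
    dist = math.hypot(dx, dy) or 1e-9   # avoid division by zero
    mag = k * (dist - rest_length)       # positive = pull together
    return (mag * dx / dist, mag * dy / dist)

def step(positions, velocities, springs, k, rest, dt, damping=0.98):
    """Advance unit-mass particles one timestep with damped explicit Euler."""
    forces = [[0.0, 0.0] for _ in positions]
    for a, b in springs:
        fx, fy = spring_force(positions[a], positions[b], rest, k)
        forces[a][0] += fx; forces[a][1] += fy
        forces[b][0] -= fx; forces[b][1] -= fy   # equal and opposite
    new_p, new_v = [], []
    for (x, y), (vx, vy), (fx, fy) in zip(positions, velocities, forces):
        vx = (vx + fx * dt) * damping
        vy = (vy + fy * dt) * damping
        new_v.append((vx, vy))
        new_p.append((x + vx * dt, y + vy * dt))
    return new_p, new_v
```

Chaining particles with such springs gives exactly the kind of swaying strand behaviour described, and as the quote suggests, the feel depends entirely on the constants and integration choices.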
- Second Surface – Multi-user spatial collaboration system Collaborative drawing in 3D space has a long history at MIT and on CAN, starting in the early 2000s, when a translucent screen was used to observe three-dimensional objects drawn in space (sorry, no link), through to our own GD3D app published on the AppStore in September 2010 (free). There is general interest in populating virtual environments, and we have seen a large number of projects in the past that do just this. Unfortunately this virtual space is very fragmented, each project relying on its own interface and technologies, not to mention the different devices used to navigate it. The Tangible Media Group at MIT developed T(ether), a spatially- and body-aware window, and the latest iteration of the research comes in the form of Second Surface, created by a new team including Shunichi Kasahara (Sony Corporation, MIT Media Lab), Valentin Heun, Austin S. Lee and Hiroshi Ishii (MIT Media Lab). The project aims to create an environment for creative collaboration that can adapt to the everyday environment. Second Surface, a novel multi-user augmented reality system, fosters real-time interaction with user-generated content on top of the everyday environment. This interaction takes place in the physical surroundings of everyday objects such as trees or houses. Second Surface uses image-based AR recognition technology that recognizes a natural image as the target. Based on the AR recognition, the system estimates the pose of the user's device relative to the target object. It then allows users to place three-dimensional drawings, texts, and photos relative to such objects and share these expressions with any other co-located users who use the same software at the same spot. The system can provide an alternate reality space that generates playful and natural interaction in an everyday setup for multiple users.
Our goal is to create a second surface on top of reality, invisible to the naked eye, that could generate a real-time spatial canvas on which everyone could express themselves. We believe that our system can create an interesting new collaborative user experience and encourage playful content generation. We also believe that our system can provide new ways of communicating within everyday environments such as cities, schools and households. Second Surface is a collaborative work between the MIT Media Lab and Sony Corporation; it was introduced at Emerging Technologies, SIGGRAPH Asia 2012, where it was awarded the Emerging Technologies Prize. Both the client mobile application and the server application were built using openFrameworks. More about the project […]
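Once the device pose relative to the image target is estimated, placing shared content "relative to such objects" amounts to transforming points from target-local coordinates through that pose. A hedged Python sketch, assuming the pose is given as a 4×4 row-major matrix (target space to camera space), which is a common but not confirmed representation for this project:

```python
def transform_point(pose, p):
    """Transform point p = (x, y, z) in target-local space by a 4x4 pose matrix.

    Only the top three rows are needed for a rigid transform; the fourth
    column of each row is the translation component.
    """
    x, y, z = p
    return tuple(row[0] * x + row[1] * y + row[2] * z + row[3]
                 for row in pose[:3])
```

Because every co-located user estimates their own pose against the same target image, each can render the same target-anchored drawing from their own viewpoint.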
- AI Controller [iPad] Created by the Aircord lab team, the same group behind the mobile runner app we wrote about a few months back, AI Controller is an iPad application designed to control a Box2D physics simulation projected onto a building. The iPad oF application sends OSC to an openFrameworks desktop application that projects the image; the building's window layout is mapped onto the iPad screen, where you can drag the particles and adjust the colour of the projected image. See movie below for demo... (Thanks […]
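Mapping the building's window layout onto the iPad screen is, at its core, a normalised rectangle-to-rectangle mapping. A minimal Python sketch of that idea (illustrative only; the actual app is openFrameworks, and the names here are hypothetical):

```python
def map_touch(touch, ipad_size, window_rect):
    """Map a touch (x, y) on the iPad screen into a projected window rectangle.

    ipad_size is (width, height); window_rect is (x0, y0, w, h) in the
    projector's coordinate space.
    """
    u = touch[0] / ipad_size[0]   # normalised horizontal position, 0..1
    v = touch[1] / ipad_size[1]   # normalised vertical position, 0..1
    x0, y0, w, h = window_rect
    return (x0 + u * w, y0 + v * h)
```

The resulting coordinates would then be sent over OSC to the desktop application driving the Box2D simulation.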
- Bit Pilot [Mac, Games, iPhone, iPad, oF] Zach Gage has just announced that Bit Pilot 2.0 for iPhone/iPad is about to hit the AppStore. To celebrate, Zach has made a desktop version which is available for download now (link below). It is the first application to use Apple's Magic Trackpad as a wireless controller, and although you can still use your MacBook's trackpad as input (tested and working great), it is with the Magic Trackpad that the game really shines. Bit Pilot for Mac is nearly a full game for free and includes all the features of the original Bit Pilot 1.0. Download (Mac) Bit Pilot version 2.0 for iPhone/iPad launches this evening. It includes Retina display support, 3 new tracks from Sabrepulse, 2 new modes, new taunts, vs. friends ranking through OpenFeint, Game Center support, cumulative score tracking and an in-game achievements screen. See preview videos below. For those that missed our review of Bit Pilot, you can read it here. If you enjoyed Canabalt's fast-paced, heart-pumping action, Bit Pilot will fit right into your collection of games on the iPhone. If I were to give one piece of gameplay advice, it would be: don't panic! Take a deep breath, sit back, relax, thumbs on the screen and enjoy. Bit Pilot is absolutely fantastic! Project Page Platform: iPhone Version: 2.0 Cost: $0.99 Developer: Zach Gage Also, check out the recent interview with Zach Gage at […]
- iOSC [iPhone] iOSC is a remote control application that uses the OSC (Open Sound Control) protocol. Using OSC over your device's built-in Wi-Fi connection, iOSC communicates with other compatible hardware and software nodes on your network. You can also remotely control middleware such as Max/MSP, Processing, ActionScript (FLOSC) and many other applications that support the OSC protocol from your iPhone. Similar to TouchOSC and a few other applications available in the AppStore, iOSC allows you to create a custom interface/controller for your desktop applications or custom-built devices using boards like Arduino. What is unique to iOSC is that you can control multiple computers via a single interface. See movies below for demo. You can also find great video demos and instructions on the app's site. Platform: iPhone Version: 1.01 Cost: […]
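Controlling multiple computers from a single interface is conceptually a fan-out: each control message is duplicated to every registered destination. A minimal Python sketch of that pattern (not iOSC's implementation; in a real app `send_fn` would wrap a UDP socket sending OSC packets):

```python
class OscFanout:
    """Duplicate one control message to several (host, port) destinations."""

    def __init__(self, destinations, send_fn):
        self.destinations = list(destinations)
        self.send_fn = send_fn  # callable(host, port, address, args)

    def send(self, address, *args):
        """Send an OSC-style address pattern plus arguments to every destination."""
        for host, port in self.destinations:
            self.send_fn(host, port, address, args)
```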
- Fabricate Yourself [openFrameworks, Kinect] Fabricate Yourself is a project by Karl D.D. Willis that documented the Tangible, Embedded and Embodied Interaction Conference. Given the tangible theme of the conference, Karl decided to engage the community by capturing and fabricating small 3D models of attendees. Attendees first capture their favorite pose using a Microsoft Kinect. The depth image from the Kinect is processed into a mesh and displayed onscreen in real-time. At any time they can capture the mesh and save it as an STL file. Dovetail joints are automatically added to the side of the 3×3 cm models so they can be snapped together. This allows multiple models to be connected to form a larger overall model, like a jigsaw puzzle. The STL files were printed using a Dimension uPrint 3D printer provided by Stratasys. Created using openFrameworks. Project Page (Thanks Karl) Previously: Beautiful Modeler [iPad, openFrameworks] - Gestural sculpting on ... Trace Modeler [openFrameworks] - Real-time video to create […]
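Processing a depth image into a mesh, as described above, is usually done by treating the depth grid as a height field and emitting two triangles per quad of neighbouring pixels. An illustrative Python sketch of that step (not the project's oF code; a uniform pixel spacing `scale` is assumed):

```python
def depth_to_triangles(depth, scale=1.0):
    """Turn a 2D depth grid into a triangle list (two triangles per pixel quad).

    depth is a list of rows of depth values; each vertex is
    (x, y, z) = (col * scale, row * scale, depth value).
    """
    rows, cols = len(depth), len(depth[0])
    v = lambda r, c: (c * scale, r * scale, depth[r][c])
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            # split the quad (r, c)..(r+1, c+1) into two triangles
            tris.append((v(r, c), v(r, c + 1), v(r + 1, c)))
            tris.append((v(r, c + 1), v(r + 1, c + 1), v(r + 1, c)))
    return tris
```

A triangle list in this form maps directly onto the STL format, which stores the mesh as a flat soup of triangles.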
- Touch Vision Interface [openFrameworks, Arduino, Android] Created by Teehan+Lax Labs, Touch Vision Interface is a combination of software and hardware that allows real-time manipulation of content on a remote device via the touch interface of a mobile device. Instead of using the mobile device's screen purely as an input, the user views the remote content and manipulates it simultaneously, in effect a form of AR. I can still recall the first time I saw an Augmented Reality demo. There was a sense of wonderment from the illusion of 3D models living within the video feed. Of course, the real magic was the fact that the application was not only viewing its surrounding environment, but also understanding it. AR has proven to be an incredible tool for enhancing perception of the real world. Despite this, I've always felt that the technology was somewhat limited in its application. It is typically implemented as output in the form of visual overlays or filters. But could it also be used for user input? We decided to explore that question by pairing the principles of AR (like real-time marker detection and tracking) with a natural user interface (specifically, touch on a mobile phone) to create an entirely new interactive experience. The translation of touch input coordinates to the captured video feed creates the illusion of being able to directly manipulate a distant surface. Peter imagines future applications of this technology both in the living room and in large open spaces. Brands could crowd-source more easily with billboard polls, and group participation on large installations could feel more natural. Likewise, other applications could include a music creation experience where each screen becomes an instrument. The possibilities become even more exciting when considering the most compelling aspect of the tool – the ability to interact with multiple surfaces without interruption. No need to switch devices through a secondary UI – simply touch your target.
You could imagine a wall of digital billboards that users seamlessly paint across with a single gesture. Created using opencv-android, openFrameworks and Python/Arduino for the LED matrix. Touch Vision Interface (Thanks […]
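The "translation of touch input coordinates to the captured video feed" described above is typically done with a planar homography: once the tracked marker gives the mapping between the phone's camera view and the remote screen's plane, each touch point is pushed through it. A hedged Python sketch of applying a 3×3 row-major homography (the technique is a standard computer-vision approach; whether Teehan+Lax used exactly this is an assumption):

```python
def apply_homography(H, p):
    """Map a 2D point p = (x, y) through a 3x3 homography H in row-major form.

    Homogeneous coordinates: the result is divided by the projective
    scale w, which is what lets a flat matrix model perspective.
    """
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

In practice H would be estimated each frame from the tracked marker corners (OpenCV's `findHomography` does this), so the mapping stays correct as the phone moves.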
Posted on: 04/11/2010