Palm Top Theater puts 3D movies in the palm of your hand using the Pepper’s ghost technique. It is also an exhibition built around a device called i3DG, whose name derives from the words “I”, “3D”, and “Gadget”.
i3DG is a playful analog extension for an iPhone or iPod Touch, converting its 2D display into a layered 3D view. Applying the old technique of placing a half-silvered mirror at a 45-degree angle in front of an image in a new context, the project builds on both 3D displays and the iPhone. As a peripheral gadget, i3DG can support a wide range of applications, from 3D videos and animations to accelerometer-based games.
i3DG was invented by Jitsuro Mase, a media artist, and produced by DIRECTIONS, Inc., a Japanese media production company. It received an Honorary Mention at the Ars Electronica Festival 2010, an international competition.
This device lets you enjoy 3D movies right in the palm of your hand, without 3D glasses or other gear. Neither production nor playback of the visuals requires special technology or equipment, such as shooting imagery with two cameras or using complicated CG techniques for image production.
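The layered view works by reflecting different horizontal bands of the screen off stacked angled mirrors, so video for such a device is authored by packing one band per layer into an ordinary 2D frame. A minimal sketch of that authoring step, assuming three equal strips and a frame represented simply as a list of pixel rows (the function name and exact strip layout are illustrative, not the official i3DG format):

```python
def split_into_layers(frame_rows, layers=3):
    """Split a 2D frame (list of pixel rows) into equal horizontal
    strips, one strip per mirror layer of the display."""
    strip_h = len(frame_rows) // layers
    return [frame_rows[i * strip_h:(i + 1) * strip_h]
            for i in range(layers)]

# A stand-in 320-row frame: each "row" is just its index here.
frame = list(range(320))
near, mid, far = split_into_layers(frame)
print(len(near), len(mid), len(far))  # 106 106 106
```

Playback then shows all three strips at once, and the angled half-mirrors place each strip at a different apparent depth.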
Read more about the Pepper’s ghost technique on Wikipedia.
The Palm Top Theater team is also preparing a new exhibition and workshop in Rotterdam. More details here.
(via Chris O’Shea)
- Little Magic Stories [openFrameworks, Kinect] Little Magic Stories is the latest project by Chris O'Shea, which aims to encourage children to use their creativity to bring stories to life. The installation allows them to create a performance from within their imagination, on stage, in front of an audience of family and friends. Chris writes: This is the first version of the project, built to test the idea and develop the system. This story about the seasons was created entirely by the children, with the interactivity in the scenes built by me. Some scenes used motion detection in zones to trigger animations, such as catching Easter eggs, squashing sand castles or launching fireworks. Body tracking and basic physics were used in other scenes. In future, I am planning to use this project in workshops with groups of children to get them excited about storytelling. They will be able to use the system to create their own narratives, as well as drawing the content by hand, before performing to their friends. The system will have improved physics, dynamic animation of objects and animated scene sounds. Chris used the Musion Eyeliner holographic projection system for this project, allowing the graphics to appear alongside the performers. This uses a technique called Pepper’s ghost, and you can see the technical set-up here. An Xbox Kinect camera was also used to track the performers on stage. The software was custom written in C++ and used openFrameworks, openCV and Box2D. Project […]
- ScanLAB – 48 Hours of Exhibition Space Scanning [Events] FABRICATE is an international peer-reviewed conference, with a supporting publication and exhibition, to be held at The Bartlett School of Architecture in London on 15–16 April 2011. Discussing the progressive integration of digital design with manufacturing processes, FABRICATE will bring together pioneers in design and making within architecture, construction, engineering, manufacturing, materials technology and computation. Part of the exhibition is the work of ScanLAB, a research group run by Matthew Shaw and William Trossell at the Bartlett School of Architecture that explores the potential role of 3D scanning in architecture, design and making. In 2010, 48 hours of scanning produced 64 scans of the Slade school's entire exhibition space. These have been compiled to form a complete 3D replica of the temporary show, which has been distilled into a navigable animation and a series of ‘standard’ architectural drawings. The work becomes a confused collage of hours of delicately created lines and forms set within a feature-perfect representation of the exhibition space. Sometimes a model or image stands out as identifiable; more often a sketch merges into a model and an exhibition stand, creating a blurred hybrid of designs and authors. These drawings represent the closest record to an as-built drawing set for the entire exhibition and an ‘as was’ representation of the Bartlett’s year. The 3D model was produced using a Faro Photon 120 laser scanner ($40k). Navigation is enabled by Pointools, generic point-cloud software that can handle some of the largest point-cloud models: multi-billion-point datasets.
scanlabprojects.co.uk For more information on FABRICATE, see http://www.fabricate2011.org Exhibition Private View 6pm – 14th April 2011 Bartlett School of Architecture Gallery Wates House, 22 Gordon Street London WC1H 0QB For tickets, see fabricate2011.org/registration/ (Thanks Ruairi) See also Fragments of time and space recorded with Kinect+SLR on NYC Subway ... and CITY OF HOLES on […]
- How It Is [iPhone, Events] Created by Champagne Valentine, the How It Is app is an interactive interpretation of Miroslaw Balka's new work currently being exhibited at London's Tate Modern gallery. Immerse yourself in the dark and mysterious world of How It Is – the new Unilever Series commission for Tate Modern’s Turbine Hall by Polish artist Miroslaw Balka. Use the custom-made "thumbpad" joystick to explore the 3D world; tapping will trigger more audio and visuals. The interactive 3D sound is central to the experience, so be sure to use headphones and turn it up! more… The app also includes a video interview with the artist and the curator's statement. In addition, taking the app to Tate Modern and opening it there will unlock a secret game level. Interesting work by the Tate, and we can be sure to see many more apps appear in the future promoting Tate events. Get it, it's free + fun! Tate Modern 13 October 2009 – 5 April 2010 Platform: iPhone Version: 1.0 Cost: Free Developer: Tate (via […]
- Music Boxel [iPhone, iPad, openFrameworks] The latest installation/project by Aircord, building on their 3D projected screens, includes a naked-eye 3D display with WebSocket-based multi-user interaction. The setup allows multiple users to scan a QR code, be redirected to a website accessible via the mobile Safari browser, and interact simultaneously with the 3D display by adding voxels that act as triggers on the timeline of a virtual music box. Display/Audio-Visual Program by aircord inc., made with openFrameworks and SuperCollider. Server/Mobile Interface by Uniba Inc., made with node.js, […]
- The Muybridgizer [iPhone, oF, Events] Commissioned by Tate Britain in honour of the eccentric man who proved that ‘horses can fly’, Theo Watson and Emily Gobeille, with Nexus Productions, have created an iPhone app utilizing Muybridge’s technique to give iPhone users the chance to animate their own Muybridge-esque sequences, using the custom-built ‘Muybridgizer’ app. Using the original principle of taking a sequence of still shots (in Muybridge’s time, by setting up a bank of cameras along a racetrack, triggered by a galloping horse), the iPhone camera captures a sequence of images which the app then lets you manipulate frame by frame, treating the images in a vintage style similar to Muybridge’s work and animating them. Sequences can then be shared by email or on Flickr, with any images uploaded to Flickr being featured on the exhibition’s microsite. Of course, this is a must-see exhibition too. Project Page muybridgizer.tate.org.uk Eadweard Muybridge (www Tate) Tate Britain 8 September 2010 – 16 January 2011 British-born artist and programmer Theo Watson met motion graphics director Emily Gobeille while attending the design and technology MFA program at Parsons. They went on to collaborate on a number of projects, including RISE AND FALL, an interactive animated story that users control through a real-world object, in this case a magazine cover. Theo has been influential in the development of openFrameworks and collaborated on the ground-breaking EyeWriter initiative. Emily has been working as a designer for the past nine years. Working in motion graphics, concept development, interaction design and user interaction, her experience spans many disciplines, including web, print, TV, wireless platforms and installations. Clients include U2, Onitsuka Tiger and Nickelodeon, to name a few. Platform: iPhone Version: 1.0 Cost: Free Developer: Tate See also Zoetrope [#iPhone] by Memo […]
- Tentacle [iPhone, Events] Tentacles is an iPhone/iPod touch application designed for participation in a multi-user, location-based game projected into public spaces. Co-developed by the Canadian Film Centre Media Lab, the OCAD Mobile Lab, and the York University Mobile Media Lab, with support from CONCERT, Tentacles can be played indoors or out – projected on walls, in theatres, stadiums, giant outdoor screens, or on the side of a building. Players are immersed in an inky pool of darkness found deep near the ocean floor. Each player controls a squid-like form evoking primitive sea creatures in search of life-sustaining micro-organisms dubbed “tenticules.” As your creature grows in size, players are subject to the presence of other oceanic avatars. Played in a world for all to see, the shared display reveals what happens as the actions of individuals co-exist in a sea of others’. As creatures interact, players must make the decision to “share” or “scare,” for as they collide, a tentacle’s tip is capable of stealing valuable “tenticules,” inhibiting a creature’s ability to grow. The initial prototype for Tentacles will have its public premiere at two outdoor locations as part of the Scotiabank Nuit Blanche festival in Toronto from dusk to dawn tomorrow, October 03, 2009. Look for Tentacles at the CFC Media Lab Prototype exhibition at the Lennox Gallery (12 Ossington Avenue). If you live in Toronto, we would love to hear what you thought of it. For more information see tentacles.ca Platform: iPhone Version: 1.1 Cost: Free Developer: CFC Media […]
- 3D Me [iPhone] 3D Me allows you to create 3D-effect pictures with just a few clicks, no 3D glasses needed. A task that would be quite difficult to accomplish (I tried) without the app: 3D Me helps you align your centre of vision so it is the background that shifts. It makes this easy by letting you look through a transparent version of your first photo as a guideline for taking your second photo. By flicking the image back and forth (as an animated GIF), the illusion of 3D space is created. You can view pics on your iPhone or share them with friends by email or by uploading your pics directly to MySpace. To share 3D Me pics, you upload them to a private drop folder at http://drop.io (a process automated by the app) and then email links to your friends. 3D Me was created with the help of these open source projects: DropKit, libGD (license), libpng (license), ObjectiveFlickr (license), zlib (license). Here are some examples, including a video of the process, created by the team behind the app. Platform: iPhone Version: 1.1 Cost: $1.99 Developer: Rooftop Collective See also QuadAnimator [Flash, […]
- High Arctic by UVA [c++, Events] Photos by John Adrian Currently on display at the National Maritime Museum is High Arctic, an installation created in collaboration between United Visual Artists (UVA) and Cape Farewell. This is an exhibition with no touchscreens, no static photographs, and no panels with text: instead, High Arctic is an immersive, responsive environment. As you approach the entrance you are given an ultraviolet torch, then met by darkness and an overwhelming array of columns of varying height occupying the space. The ultraviolet torches unlock hidden elements, whilst constantly shifting patterns of interactive projections react to approaching visitors. As you "embark on this journey" of discovery, Max Eastley and Henrik Ekeus's generative soundscape flows through the gallery, weaving in the voices of arctic explorers across the centuries... The project began in 2010 when UVA’s Matt Clark travelled with the arts and climate science foundation Cape Farewell to the Arctic archipelago of Svalbard, which lies between mainland Norway and the North Pole. Sailing aboard The Noorderlicht, a 100-year-old Dutch schooner, Matt’s trip brought him into contact with scientists, poets, musicians and polar bears. He saw vast tundra, monochromatic rainbows and huge chunks of ice falling from calving glaciers. Conceived as a response to the expedition, High Arctic uses a combination of sound, light and sculptural forms to create an abstracted arctic landscape for visitors to explore. UVA’s in-house tool (D3) is the main ‘glue’ for this process; however, a multitude of other tools were used to explore the various iterations of the physical and digital build.
These included various scale models built in polystyrene and Lego, 3D renderings, and a full-scale pool mock-up built on the ground floor of the UVA studio. Likewise, various tools were written or integrated into D3’s existing capabilities to produce generative content for the interactive pools: Houdini (Houdini Ocean Toolkit) for producing realistic source wave depth maps, SVG handling, dither pixel shaders, video sprite management, openCV (for contour finding), a port of Memo’s Navier-Stokes fluid algorithm, Box2D, particle systems, and lens and shift correction algorithms. Making the physical sculpture integrate with the digital projection pools was important for creating a more seamless landscape. CAD designs were imported into D3 to allow various physical setups to be tested with generative content before fabrication of the columns. Interaction is handled by ten Basler GigE cameras with various cut and pass filters, plus 250 UV torches. The system builds upon existing D3 libraries for multi-camera 2D and 3D tracking. Lighting is created with Source Fours plus Martin Tripix strips, controlled by D3. The physical build incorporates hundreds of columns with UV-reactive paint, a 40m stretched mirror and a good amount of timber and metal. 58 channels of generative and pre-composed audio are managed by SuperCollider, Max/MSP and Apple Logic, giving a constantly evolving narrative across the room. The installation is run by a cluster of six D3 machines. A mix of custom protocols, web services and OSC integrates the various components. Coding for D3 is in C++, HLSL and Python. Special credit: Luke Malcolm for the bulk of the coding (he worked on the camera tracking system, show synchronisation, and interaction in three of the five pools). It’s 2100 AD and the Arctic landscape we once took for granted has changed forever. How will we choose to remember our Arctic past? Is it possible to travel somewhere that no longer exists?
Set in one of many possible futures, High Arctic conveys the scale, beauty and fragility of our unique Arctic environment through an immersive installation which fills the entire 820m² gallery space. Intended to be a future vision of a receding world, it encourages us to question our relationship with the world around us. Admission: Full £6.00 / Concession £5.00 / Children (age 7+*) £4.00 (*children 0–6 go free) Dates: 14 July 2011 – 13 January 2012 Opening times: every day, 10.00–17.00 (closed 25–26 December) Venue: Special Exhibitions Gallery, Sammy Ofer Wing (National Maritime Museum) United Visual Artists | Cape […]
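The zone-based motion triggers described in the Little Magic Stories excerpt above (catching Easter eggs, squashing sand castles) boil down to counting changed pixels inside a rectangle between successive frames. A minimal sketch, assuming grayscale frames stored as lists of rows of 0–255 values; Chris's actual system used openFrameworks and openCV, so this helper and its thresholds are illustrative only:

```python
def zone_motion(prev, curr, zone, threshold=30, min_pixels=5):
    """Return True if enough pixels inside a rectangular zone
    (x0, y0, x1, y1) changed between two grayscale frames."""
    x0, y0, x1, y1 = zone
    changed = sum(
        1
        for y in range(y0, y1)
        for x in range(x0, x1)
        if abs(curr[y][x] - prev[y][x]) > threshold
    )
    return changed >= min_pixels

# Two tiny 8x8 frames; simulated movement only inside zone (2,2)-(6,6).
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(3, 6):
    for x in range(3, 6):
        curr[y][x] = 255
print(zone_motion(prev, curr, (2, 2, 6, 6)))  # True
```

When the zone fires, the installation would play the corresponding scene animation; zones elsewhere in the frame stay quiet.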
Posted on: 12/11/2010
- Senior Digital Designer at CLEVER°FRANKE
- Interaction Designer at Carlo Ratti Associati
- Creative Technologist at Deeplocal
- HTML / CSS Developer at Resn
- Climate Service Data Visualiser at FutureEverything
- Web Developer at &Associates
- Creative Technologist at Rewind FX
- Coder to collaborate with Agnes Chavez
- Data Scientist at Seed Scientific
- Data Engineer at Seed Scientific
- Design Technologist at Seed Scientific
- Creative Technologist, The ZOO at Google