
Tangibles Worlds – Adding a sense of material and touch to VR

Created by Stella Speziali at ECAL, Tangibles Worlds explores tactile experience as a catalyst for full immersion in VR. It proposes a “black box” interface, an alternate physical universe to the VR experience, extending the immersion beyond sight and sound. At the same time, the project questions the perceptual correlation between what we perceive outside and inside VR, and the mutual relationship between the two.

“In a virtual reality experience, how can we provide a more tangible experience? Is it possible to interact with the virtual universe in some other way than only using the controllers provided?” –– Stella Speziali

Tangibles Worlds is an experiment that seeks to combine virtual reality, which is immaterial and digital, with touch, the sense that allows us to get closest to things. Stella’s project aims to ensure that the user doesn’t limit themselves to simply observing the virtual universe, but interacts with the materials within it and becomes attentive to the feedback of the sensations and feelings that arise through touch.

↑ Stella created a different environment for each of the three boxes.

↑ Early experiments testing different sensors and textures.

The installation is composed of three different boxes. Each box contains an IR distance sensor that detects when a hand is inserted and displays the virtual world attributed to that box; this new virtual world then surrounds the user. A sensor is also placed on each wall inside the boxes; it recognizes the hand and activates an animation inside the virtual world. In addition, Stella mapped the sensors into the virtual world so that a small clue is given to the user, leading them to trigger the animations. The entire tactile and visual experience unfolds automatically, with the visuals delivered through the Oculus Rift.
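To give a rough idea of that detection logic, here is a minimal sketch of how one box’s hand detection might be handled on the Arduino side: an analog IR distance sensor (e.g. a Sharp GP2Y0A21-style sensor) is thresholded to decide whether a hand is inside the box, and a short serial message tells the VR application which world to show. The pin number, threshold and message format are assumptions for illustration, not the project’s actual wiring or protocol.

```cpp
// Minimal sketch (assumed wiring): detect a hand entering one box with an
// analog IR distance sensor and report the event over serial so the VR
// application can switch to that box's virtual world.
const int IR_PIN = A0;       // assumed analog pin for the IR distance sensor
const int THRESHOLD = 300;   // assumed ADC reading above which a hand counts as "inside"
bool handInside = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int reading = analogRead(IR_PIN);   // closer object -> higher reading on Sharp-style sensors
  bool detected = reading > THRESHOLD;

  // Only send a message when the state changes, so the VR side receives
  // discrete "enter" / "leave" events rather than a raw value stream.
  if (detected != handInside) {
    handInside = detected;
    Serial.println(handInside ? "BOX1:ENTER" : "BOX1:LEAVE");
  }
  delay(20);  // ~50 Hz polling is plenty for hand detection
}
```

In practice a little hysteresis or debouncing around the threshold would keep the world from flickering when a hand hovers right at the edge of the sensor’s range.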

The full setup comprises three IR distance sensors, four flex sensors, four capacitive metallic wires, one motion sensor, one LED, four photocell sensors and one Oculus Rift, all controlled by an Arduino Mega.
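As a sketch of how that sensor set might be read on the Mega and streamed to the VR application: the flex sensors and photocells can be read as analog voltage dividers, the motion sensor as a digital input, and the capacitive wires via the common Arduino CapacitiveSensor library. All pin assignments and the CSV message format below are illustrative assumptions, not the project’s actual code.

```cpp
// Illustrative Mega sketch (assumed pins/protocol): read every sensor listed
// above on each pass and stream the values as one CSV line over serial.
#include <CapacitiveSensor.h>

const int IR_PINS[3]   = {A0, A1, A2};        // assumed: three IR distance sensors
const int FLEX_PINS[4] = {A3, A4, A5, A6};    // assumed: four flex sensors (voltage dividers)
const int CELL_PINS[4] = {A8, A9, A10, A11};  // assumed: four photocells (voltage dividers)
const int PIR_PIN = 22;                       // assumed: motion sensor digital output
const int LED_PIN = 23;                       // assumed: the single LED

// Four capacitive wires, sharing pin 30 as the send pin (library convention).
CapacitiveSensor cap1(30, 31), cap2(30, 32), cap3(30, 33), cap4(30, 34);
CapacitiveSensor* capWires[4] = {&cap1, &cap2, &cap3, &cap4};

void setup() {
  Serial.begin(115200);
  pinMode(PIR_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  // IR, flex and photocell sensors are simple analog reads.
  for (int i = 0; i < 3; i++) { Serial.print(analogRead(IR_PINS[i]));   Serial.print(','); }
  for (int i = 0; i < 4; i++) { Serial.print(analogRead(FLEX_PINS[i])); Serial.print(','); }
  for (int i = 0; i < 4; i++) { Serial.print(analogRead(CELL_PINS[i])); Serial.print(','); }

  // Capacitive wires: larger values mean a hand is touching or very close.
  for (int i = 0; i < 4; i++) { Serial.print(capWires[i]->capacitiveSensor(30)); Serial.print(','); }

  // Motion sensor last; the VR application parses one CSV line per update.
  Serial.println(digitalRead(PIR_PIN));

  // The LED could acknowledge activity, e.g. stay lit while motion is detected.
  digitalWrite(LED_PIN, digitalRead(PIR_PIN));

  delay(33);  // roughly 30 updates per second
}
```

On the VR side, the application would only need to split each incoming line on commas and map the values onto the corresponding animations and world transitions.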

Project Page | ecal.ch | ECAL – Bachelor Media & Interaction Design

Project by Stella Speziali, supported by Alain Bellet, Cyril Diagne, Gaël Hugo, Christophe Guignard and Cédric Duchêne (Tutors) and Laura Perrenoud, Tibor Udvari, Romain Cazier and Marc Dubois (Assistants).
