MetaTouch is an XR installation exploring virtual touch: participants receive multisensory feedback in response to their interactions with a series of tactile modules.
In the increasingly embodied context of the metaverse, the lack of tactile feedback hinders positive interaction and empathy. In response, we ask how cross-modal features and multisensory experiences of the physical world could benefit users’ navigation of virtual spaces.
Through a ‘tactile cabinet of curiosities’, participants are invited to explore a bizarre field where the virtual and the physical are deeply integrated. Inspired by Laban Movement Analysis (LMA), a series of hand-movement detection modules, each with different material and sensory-kinetic properties, has been designed to capture and translate real-time touch data into a precise visual-tactile mapping.
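The translation step described above — turning captured touch data into visual parameters — could be sketched as follows. This is a minimal, illustrative mapping only: the sample fields (pressure, duration), the LMA-inspired pairing of ‘weight’ with brightness and ‘time’ with particle spread, and all names and ranges are assumptions for the sake of the example, not the installation’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    """One hypothetical reading from a tactile module."""
    pressure: float   # normalised contact pressure, 0.0-1.0
    duration: float   # contact duration in seconds

def map_touch_to_visual(sample: TouchSample) -> dict:
    """Map a touch sample to visual parameters.

    Loosely inspired by LMA effort qualities: 'weight' (pressure)
    drives brightness, 'time' (sustained vs. sudden contact) drives
    particle spread. Purely illustrative values and names.
    """
    brightness = min(1.0, max(0.0, sample.pressure))  # weight -> light intensity
    spread = min(1.0, sample.duration / 2.0)          # longer touch -> wider spread
    return {"brightness": brightness, "spread": spread}
```

A firm half-second press, for example, would yield moderate brightness and a narrow spread; a light, lingering touch would dim the visuals while widening them.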
By speculating on and playing with tactile rules and expectations in this novel context, MetaTouch questions how visual-tactile experiences can affect embodied interpersonal relationships and social interaction in the virtual realm. MetaTouch seeks opportunities to break down touch stereotypes, rebuilding touch language and etiquette to explore the potential of positive touch experiences in an empathetic and embodied virtual community.