
Multisensory Mixed Reality


With the advancement of virtual and augmented reality hardware, we expect the next generation of immersive technologies to be multisensory. This means our interactions will be enhanced not just visually, but also through other senses such as hearing, touch, smell, and taste. While the hardware for multisensory experiences is still in its early stages, research into how these sensory augmentations affect social perception and behavior should start today.

For instance, VHIL researchers have investigated how people express and interpret emotions through haptic devices, and how haptic and olfactory cues interact to shape eating behavior in VR.

Moreover, we explore multisensory experiences that extend beyond our biological perceptual abilities, such as sharing private touch across individuals and utilizing auditory cues to augment our perception of others. 


For more information

For more information, contact Yujie Tao (yjtao@stanford.edu).