Multimodal Analytics in Virtual Reality (In press)
Abstract
In this chapter, we first delve into the nature of sensory perception and survey the current evidence for how multisensory input, including haptics, scent, and auditory cues, influences the efficacy of VR applications in educational contexts. These features can enhance the educational experience by increasing immersion, presence, or embodiment: distinctive qualities of virtual environments that differentiate them from non-immersive experiences (Kilteni et al., 2012; Lombard & Ditton, 1997; Slater & Wilbur, 1997). Next, we explore dynamic signals originating from the user, ranging from hand and head motion tracking to psychophysiological signals (e.g., eye tracking, pupillary dilation, heart rate) that provide insight into the user's mental state. We also examine how incorporating these diverse data streams may allow educators to create adaptive learning experiences that enhance learner engagement by providing insight into the momentary and long-term efficacy of immersive learning experiences. After discussing two theoretical frameworks of particular relevance within this domain, we transition to a more applied level by discussing the practical implications of this work as well as current obstacles to the widespread adoption of these techniques. Lastly, we offer a roadmap for scholars aiming to integrate these tools into their work, and discuss promising avenues for future research.