Conversational Dynamics in Social Virtual Reality: A Large-Scale, Longitudinal Study of Speech Acts and Nonverbal Behavior
Abstract
This study examines verbal interaction in social virtual reality (VR). We developed the Virtual Reality Interaction Dynamics Scheme (VRIDS), a coding scheme comprising 10 speech acts (e.g., questioning, giving opinions, disagreeing), constructed by integrating existing speech act frameworks with new constructs derived from our data. Analyzing speech from 109 participants in a metaverse classroom over four weeks, we coded 9,738 discourse units. VRIDS introduces novel constructs such as context-dependent commentary on virtual objects, which became more frequent as users grew familiar with the technology over time. Idea sharing also increased over time, indicating enhanced collaboration. By analyzing sequences of speech acts, we identified attractor and repeller states as well as an "echoing" strategy. Additionally, we examined the link between nonverbal behavior and speech acts: head and hand movements increased during questioning and context-dependent commentary but decreased during disagreements. Our database includes transcripts, speech act annotations, and nonverbal behavioral data. In future work, we will train large language models to recognize speech acts in VR.