Stanford University

Perceiving visual emotions with speech

Deng, Z., Bailenson, J.N., Lewis, J.P., & Neumann, U. (2006). Perceiving visual emotions with speech. Proceedings of the 6th International Conference on Intelligent Virtual Agents, California, USA, 21-23 August.



Embodied Conversational Agents (ECAs) with realistic faces are becoming an intrinsic part of many graphics systems employed in HCI applications. A fundamental issue is how people visually perceive the affect of a speaking agent. In this paper we present the first study evaluating the relation between objective and subjective visual perception of emotion as displayed on a speaking human face, using both full video and sparse point-rendered representations of the face. We found that objective machine learning analysis of facial marker motion data is correlated with evaluations made by experimental subjects, and in particular, that the lower face region provides informative cues for visual emotion perception. We also found that affect is captured in the abstract point-rendered representation.
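The correlation between objective and subjective measures described above can be illustrated with a minimal sketch. This is not the paper's actual pipeline; the per-clip classifier scores and human ratings below are invented placeholders, and the Pearson correlation is computed with the standard library only.

```python
# Illustrative sketch (hypothetical data, not from the study): correlate an
# objective per-clip emotion score from a machine classifier with the mean
# subjective rating given by human viewers of the same clips.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-clip values: classifier confidence vs. mean human rating.
objective = [0.91, 0.45, 0.78, 0.33, 0.66]
subjective = [4.6, 2.8, 4.1, 2.1, 3.5]

r = pearson(objective, subjective)
print(f"correlation: {r:.3f}")
```

A correlation near 1.0 in such a comparison would indicate that the objective motion-data analysis and the subjective viewer judgments largely agree.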
