Conference Paper

Perceiving visual emotions with speech

Embodied Conversational Agents (ECAs) with realistic faces are becoming an intrinsic part of many graphics systems employed in HCI applications. A fundamental issue is how people visually perceive the affect of a speaking agent. In this paper we present the first study evaluating the relation between objective and subjective visual perception of emotion as displayed on a speaking human face, using both full video and sparse point-rendered representations of the face. We found that objective machine learning analysis of facial marker motion data is correlated with evaluations made by experimental subjects, and in particular, the lower face region provides informative cues for visual emotion perception. We also found that affect is captured in the abstract point-rendered representation.
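The central comparison in the abstract, between objective machine analysis and subjective viewer ratings, can be illustrated with a correlation computation. The sketch below is purely hypothetical: the per-clip scores, the `pearson` helper, and the variable names are invented for illustration and are not the paper's actual data or method.

```python
# Hypothetical sketch: correlating objective per-clip emotion scores
# (e.g. classifier confidence derived from facial marker motion) with
# mean subjective ratings from human viewers. All numbers below are
# invented for illustration; the study's actual features differ.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented example scores for five speaking-face clips
objective = [0.91, 0.34, 0.78, 0.12, 0.66]   # machine analysis
subjective = [0.88, 0.41, 0.70, 0.20, 0.73]  # mean viewer ratings

r = pearson(objective, subjective)
```

A high positive `r` on real data would correspond to the paper's finding that the objective analysis agrees with subjective perception; the same comparison could be run separately on upper-face and lower-face marker regions.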


Z. Deng
J.N. Bailenson
J.P. Lewis
U. Neumann
Published in
Proceedings of the 6th International Conference on Intelligent Virtual Agents