Tracking Gestures to Detect Gender
Nonverbal behavior is a very important part of human interaction, and how this behavior is tracked and rendered is key to establishing social presence. Tracking nonverbal behavior is useful not only for rendering signals via an avatar, but also for providing cues about interactants. In this paper we describe a novel method of determining identity (i.e., gender) using machine learning with input taken from the Microsoft Kinect. Twelve men and twelve women performed a number of gestures in front of the Kinect. A logistic regression used ten posture and gesture features (e.g., the angle between the shoulders and the neck) to predict gender. When presented with a person it had never seen before, the model was 83% accurate in predicting whether the person was a man or a woman, even from very short (i.e., ten-second) exposures to the test participants. We discuss the usefulness of the current research tool for presence, as well as point out practical applications.
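The evaluation described above (a logistic regression over ten features, tested on a person held out of training) can be sketched as follows. This is a minimal illustration, not the authors' code: the feature values are synthetic placeholders, and scikit-learn's `LeaveOneGroupOut` is assumed as a stand-in for the paper's leave-one-subject-out protocol.

```python
# Hedged sketch of a leave-one-subject-out evaluation of a logistic
# regression gender classifier, as described in the abstract.
# All data below is synthetic; the paper's actual features (e.g., the
# angle between shoulders and neck) are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_subjects, clips_per_subject, n_features = 24, 5, 10

# 24 subjects: 12 men (label 0) and 12 women (label 1), several short
# gesture clips per subject, each summarized by ten features.
groups = np.repeat(np.arange(n_subjects), clips_per_subject)
y = (groups >= 12).astype(int)
X = rng.normal(size=(n_subjects * clips_per_subject, n_features))
X += y[:, None] * 0.8  # inject a separable signal so the demo learns

# Leave-one-subject-out: the model never sees the held-out person during
# training, mirroring the "person it had never seen before" evaluation.
correct = 0
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = LogisticRegression().fit(X[train_idx], y[train_idx])
    correct += (clf.predict(X[test_idx]) == y[test_idx]).sum()

accuracy = correct / len(y)
print(f"leave-one-subject-out accuracy: {accuracy:.2f}")
```

Grouping the split by subject rather than by clip is the key design choice: it prevents clips of the same person from appearing in both training and test sets, which would inflate the reported accuracy.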