Automatic Detection of Nonverbal Behavior Predicts Learning in Dyadic Interactions
Nonverbal behavior can reveal the psychological states of those engaged in interpersonal interaction. Previous research has highlighted the relationship between gesture and learning during instruction. In the current study, we applied readily available computer vision hardware and machine learning algorithms to the gestures of teacher/student dyads (N = 106) during a learning session to automatically distinguish between high- and low-success learning interactions, operationalized as recall of information presented during that session. Models predicted the learning performance of a dyad with accuracies as high as 85.7 percent when tested on dyads not included in the training set. In addition, summed measures of body movement correlated with learning scores. We discuss theoretical and applied implications for learning.
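The reported accuracy depends on holding out entire dyads, so no dyad contributes data to both training and test sets. Below is a minimal sketch of that evaluation scheme using scikit-learn's grouped cross-validation; the classifier choice, feature dimensions, and per-dyad sample counts are placeholders, not the paper's actual pipeline.

```python
# Sketch of dyad-level cross-validation (assumed setup, not the authors' code):
# grouping by dyad ID ensures accuracy reflects generalization to unseen dyads.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_dyads, samples_per_dyad, n_features = 106, 20, 12  # hypothetical shapes

X = rng.normal(size=(n_dyads * samples_per_dyad, n_features))  # gesture features
y = rng.integers(0, 2, size=n_dyads * samples_per_dyad)        # high/low learning label
groups = np.repeat(np.arange(n_dyads), samples_per_dyad)       # dyad IDs

# GroupKFold guarantees no dyad appears in both a train and a test fold.
scores = cross_val_score(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X, y, groups=groups, cv=GroupKFold(n_splits=5),
)
print(f"mean held-out-dyad accuracy: {scores.mean():.3f}")
```

Without grouping, frames from the same dyad could leak across the split and inflate accuracy, which is why the dyad-level holdout described in the abstract matters.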