Drone in Love

Emotional Perception of Facial Expressions on Flying Robots

Venue

ACM Conference on Human Factors in Computing Systems (CHI 2021)

Authors

Viviane Herdel 1,4

Anastasia Kuzminykh 1,2,3

Andrea Hildebrandt 4

Jessica R. Cauchard 1

1 Magic Lab, Department of Industrial Engineering and Management, Ben Gurion University of the Negev, Be'er Sheva, Israel

2 Faculty of Information, University of Toronto, Canada

3 Cheriton School of Computer Science, University of Waterloo, Canada

4 Department of Psychology, Carl von Ossietzky Universität Oldenburg, Germany

Abstract

Drones are rapidly populating human spaces, yet little is known about how these flying robots are perceived and understood by humans. Recent work suggests that their acceptance is predicated on their sociability. This paper explores the use of facial expressions to represent emotions on social drones. We leveraged design practices from ground robotics and created a set of rendered robotic faces that convey basic emotions. We evaluated individuals' responses to these emotional facial expressions on drones in two empirical studies (N=98, N=98). Our results demonstrate that individuals accurately recognize five emotional expressions on drones and make sense of intensities within emotion categories. We describe how participants were emotionally affected by the drone, showed empathy towards it, and created narratives to interpret its emotions. Based on these findings, we formulate design recommendations for social drones and discuss methodological insights on the use of static versus dynamic stimuli in affective robotics studies.

Methodology

To explore how people recognize, interpret, and react to emotional facial expressions on drones, we conducted two empirical studies, both employing a mixed-methods approach.
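For the quantitative side of such a study, recognition is typically scored as the rate at which participants' chosen labels match the displayed emotion. The snippet below is a minimal sketch of that computation, not the authors' analysis code; the data layout (columns shown_emotion and chosen_emotion) is an assumption for illustration.

```python
# Sketch: per-emotion recognition accuracy in a forced-choice labeling study.
# The column names below are hypothetical, not taken from the paper's dataset.
import pandas as pd

EMOTIONS = ["Joy", "Sadness", "Fear", "Anger", "Surprise"]

def recognition_rates(responses: pd.DataFrame) -> pd.Series:
    """Fraction of trials where the chosen label matches the shown emotion."""
    hits = responses["chosen_emotion"] == responses["shown_emotion"]
    return hits.groupby(responses["shown_emotion"]).mean().reindex(EMOTIONS)

# Toy example data (six trials):
toy = pd.DataFrame({
    "shown_emotion":  ["Joy", "Joy", "Fear", "Anger", "Surprise", "Sadness"],
    "chosen_emotion": ["Joy", "Joy", "Fear", "Fear",  "Surprise", "Sadness"],
})
print(recognition_rates(toy))  # e.g., Anger scores 0.0, the rest 1.0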

Key Results

We showed that people can recognize five basic emotions on drones: Joy, Sadness, Fear, Anger, and Surprise, and can discriminate between different emotion intensities. Beyond recognition, people interpret the drone's emotions and create narratives around its state. Participants were further affected by the drone and displayed different responses, including empathy, depending on the valence of the drone's emotion. We conclude with design and methodological recommendations for future research on social drones.
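One way to produce emotion intensities like those participants discriminated between is to interpolate facial-feature parameters between a neutral face and a full-intensity target. The sketch below is a hypothetical parameterization for illustration only; the feature names and values are assumptions, not the faces designed in the paper.

```python
# Hypothetical intensity scaling for rendered robotic faces (not the paper's
# actual face design): linearly blend neutral -> full-intensity parameters.
from dataclasses import dataclass

@dataclass
class Face:
    brow_angle_deg: float  # negative = inner brows lowered (e.g., anger)
    mouth_curve: float     # -1 full frown ... +1 full smile
    eye_openness: float    # 0 closed ... 1 wide open

NEUTRAL = Face(0.0, 0.0, 0.5)

# Assumed full-intensity targets per emotion (illustrative values only).
FULL = {
    "Joy":      Face(10.0,  1.0, 0.6),
    "Sadness":  Face(20.0, -0.8, 0.4),
    "Anger":    Face(-25.0, -0.6, 0.7),
    "Fear":     Face(15.0, -0.3, 0.9),
    "Surprise": Face(18.0,  0.1, 1.0),
}

def expression(emotion: str, intensity: float) -> Face:
    """Interpolate each feature between the neutral and full-intensity face."""
    t = max(0.0, min(1.0, intensity))  # clamp intensity to [0, 1]
    full = FULL[emotion]
    return Face(
        NEUTRAL.brow_angle_deg + t * (full.brow_angle_deg - NEUTRAL.brow_angle_deg),
        NEUTRAL.mouth_curve + t * (full.mouth_curve - NEUTRAL.mouth_curve),
        NEUTRAL.eye_openness + t * (full.eye_openness - NEUTRAL.eye_openness),
    )

print(expression("Joy", 0.5))  # a half-intensity smile
```

Linear blending is only the simplest choice; a real design might ease nonlinearly or vary features independently, but it captures the idea of graded intensities within an emotion category.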

For additional findings, we invite you to read the full paper.

Contribution in a nutshell: 

  • A set of five rendered robotic faces representing Joy, Sadness, Fear, Anger, and Surprise

  • Two user studies (N=196) showing how people recognize, interpret, and are affected by emotions on drones

  • Design recommendations for social drones using emotions and facial features

  • Methodological insights on the use of static vs. dynamic stimuli in affective robotics

bottom of page