The Softer Side of Robots
This research raises the prospect that robots, which are already used to teach second languages, could recognize when students are bored.
Robots could be taught to recognize human emotions from our movements, according to a new study from the Behavioural Science Group at Warwick Business School in the UK. The WBS Behavioural Science Group is one of the world’s leading research centers in the field, with the goal of linking theoretical and policy challenges in the social sciences with experimental methods and results drawn from the natural sciences.
Researchers found that humans could recognize excitement, sadness, aggression, and boredom from the way people moved, even if they could not see their facial expressions or hear their voices. These findings suggest that robots could learn to read the same movement cues, alongside facial expressions and tone of voice, to recognize human internal states.
This research raises the prospect that robots, which are already used to teach second languages, could recognize when students are bored, and customer service robots could identify when people feel angry or stressed. “One of the main goals in the field of human-robot interaction is to create machines that can recognize human emotions and respond accordingly,” says Dr. Charlotte Edmunds of Warwick Business School.
“Our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to recognize a range of emotions and social interactions using movements, poses, and facial expressions. The potential applications are huge.”
The study was conducted by researchers from Warwick Business School, the University of Plymouth, the Donders Centre for Cognition at Radboud University in the Netherlands, and the Bristol Robotics Lab at the University of the West of England. It is published in the journal Frontiers in Robotics and AI.
The team of psychologists and computer scientists filmed pairs of children playing with a robot and a computer built into a table with a touchscreen top. The videos were shown to 284 study participants, who were asked to decide whether the children were excited, bored, or sad. They were also asked if the children were cooperating, competing, or if one of the children had assumed a dominant role in the relationship.
Some participants watched the original videos. A second group saw the footage reduced to stick figures that showed exactly the same movements. Members of both groups agreed on the same emotional labels for the children more often than would be expected by chance.
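The article does not say how the stick-figure clips were produced. As a rough illustration only, reducing video to a moving skeleton can be done with off-the-shelf pose estimation; the sketch below assumes the open-source MediaPipe and OpenCV libraries and a hypothetical input file, not the study's actual pipeline.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_pose = mp.solutions.pose
mp_draw = mp.solutions.drawing_utils

# Hypothetical input file; the study's footage and tooling are not specified.
cap = cv2.VideoCapture("children_playing.mp4")

with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        # Draw the detected skeleton on a black canvas so that only the
        # movement survives, mirroring the study's stick-figure condition.
        canvas = np.zeros_like(frame)
        if results.pose_landmarks:
            mp_draw.draw_landmarks(canvas, results.pose_landmarks,
                                   mp_pose.POSE_CONNECTIONS)
        cv2.imshow("stick figure", canvas)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cap.release()
cv2.destroyAllWindows()
```

One simplification worth flagging: MediaPipe's Pose solution tracks a single person per frame, so handling the pairs of children in the study would require a multi-person model.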
The researchers then trained a machine-learning algorithm to label the clips, identifying the type of social interaction, the emotions on display, and the strength of each child's internal state, allowing it to judge which of the two children felt more sad or more excited.
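The article does not name the model or the features the team used. Purely as an illustration of the training setup described, the sketch below fits a generic scikit-learn classifier to placeholder pose-derived features and emotion labels; every name and number in it is an assumption, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the real clips: one feature vector per
# clip (e.g. summary statistics of joint positions and velocities taken
# from the stick-figure skeletons) and one human-assigned emotion label.
rng = np.random.default_rng(0)
n_clips, n_features = 200, 64  # arbitrary illustrative sizes
X = rng.normal(size=(n_clips, n_features))
y = rng.choice(["excited", "bored", "sad"], size=n_clips)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# A generic off-the-shelf classifier; the paper's actual algorithm may differ.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

Rating the strength of each child's internal state, as the researchers did, would swap the classifier for a regressor or a pairwise ranking model; the shape of the pipeline stays the same.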
“Robot delivery services are already being trialed, but people tend to attack or vandalize them, often because they feel threatened,” says Dr. Edmunds.
“The aim is to create a robot that can react to human emotions in difficult situations and get itself out of trouble without having to be monitored or told what to do.”