Robots are slowly extending their influence across all areas of our lives: as toys in the form of Furbies, as automated vacuum cleaners, or as devices to assist elderly or disabled people, for example. How humans and robots interact with each other is a key aspect that roboticists must consider, and one of the goals of robotics research is to develop machines that can fill the role of companions, forming long-term relationships with their human users.
Although users often engage easily and enthusiastically at first, many studies show that it is difficult to sustain interest in a robot once the initial novelty has worn off. However, research from the University of Duisburg-Essen indicates that humans really do feel for robots on an emotional level, and this finding could prove useful in developing robots capable of truly helping humans.
As social psychologists, Astrid Rosenthal-von der Pütten and her colleagues are interested in the social effects of artificial entities, such as embodied conversational agents or robots. As she explained to ScienceOmega.com, this particular topic came to their attention by chance.
"One lunchtime my colleague Frank P. Schulte mentioned a video he had seen of some guys trying to answer the question, ‘How much punishment can a robot dinosaur take?’," said Rosenthal-von der Pütten, a research associate in the Department of Social Psychology at Duisburg-Essen. "We had a little laugh and talked about it, then searched for the video online after lunch. I found myself caught between two feelings: I was amused by the idea of the video but at the same time felt bad for the little dinosaur robot with the heartbreaking cries. That was when we decided to do some research on this topic."
An unintuitive response?
Research on emotions is not always easy from a methodological point of view. People may have difficulty verbalising subtle social aspects of interactions or their precise emotional state, so objective measures are widely used to investigate emotion.
"People might find it strange to report on their emotions in human-robot interactions because it seems unintuitive that one would experience emotions towards an object," explained Rosenthal-von der Pütten. "We utilised more objective measures linked to emotion, namely physiological arousal and brain activity associated with emotional processing."
In the first of two studies carried out by Rosenthal-von der Pütten and colleagues, including Professors Nicole Krämer and Matthias Brand, participants’ physiological arousal was measured as they watched videos of a robotic dinosaur toy being treated affectionately and violently. Afterwards, the 40 volunteers were asked to report their emotional state. Higher levels of arousal and more negative feelings were recorded after the violent video.
The second study, in collaboration with the Erwin L. Hahn Institute for Magnetic Resonance Imaging, used functional magnetic resonance imaging (fMRI) to monitor brain activity in 14 participants. Again there were affectionate and abusive conditions, but this time the videos showed a robot, a human and an inanimate object being subjected to this treatment.
"We did not find large differences in brain activation when comparing the human and robot stimuli," Rosenthal-von der Pütten commented. "Even though we assumed that the robot stimuli would trigger emotional processing, we expected these processes to be considerably weaker than for human stimuli. It seems that both stimuli undergo the same emotional processing.
"Separate analyses of the response to the violent videos and the affectionate videos showed differences with regard to the violent videos. We found more brain activity in areas associated with emotional processing when participants observed the human being abused in contrast to the robot being abused."
Building long-term relationships
Since this study only addressed the immediate reaction to stimuli, further investigation will be needed to find out what happens after this short-term information processing. According to Rosenthal-von der Pütten, the fact that our brains react similarly to humans and robots being treated affectionately does not necessarily mean that we are bound to experience affection for robots.
"In previous research we observed that during a long-term interaction with an assistive robot, some participants showed signs of bonding and relationship-building with the robot, while others treated it like a piece of technology," she continued. "There are still other, higher cognitive, processes involved that might or might not lead to relationship-building in any way, shape or form."
The knowledge gained from this and related studies will have implications for the usage of robots in everyday life.
"When we know that on a neural basis people react similarly to robots and fellow humans we might either be more careful in employing robots or might exploit this in areas where interactions with robots are mandatory," Rosenthal-von der Pütten stated.
The researchers will present the work at the International Communication Association’s annual conference, taking place in London from 16-21 June. In the meantime, Rosenthal-von der Pütten and her co-authors are considering the limitations of the study and how future developments in robotics will affect their ability to conduct research on the social effects of robots.
"Pleo is priced at around 300 euros, so the possibility of it breaking during video recording was a financial risk we could cope with," she said. "It would certainly be interesting to include mechanical, humanoid or even android robots, but they are expensive and the risk of damage is very high in the kind of violent interactions that we staged.
"The general trend, however, is that humanoid platforms – like the NAO robot from Aldebaran Robotics – are becoming more available and affordable, and so we are looking forward to studying robots with other appearances. Moreover, there is the possibility of changing the paradigm and working with ‘social pain’, like ostracism or verbal abuse."