Interactive robots can emotionally manipulate people, say scientists who found that we have a strong tendency to ascribe human-like attributes to machines.
Researchers from the University of Duisburg-Essen in Germany asked 89 volunteers to interact with a human-like robot under the guise of helping it become more intelligent.
At the end of the interaction, the volunteers were asked to turn off the robot. However, the robot was programmed to beg them not to do so.
The robot also displayed bodily actions meant to bolster its request, 'Tech Xplore' reported.
Some volunteers served as controls: they were asked to turn off the robot but did not experience any begging from the humanoid.
Forty-three of the volunteers were confronted with the choice between complying with the researchers' request and heeding the robot's.
Thirteen volunteers chose to heed the robot's wishes, and the rest took longer to turn it off than volunteers in the control group.
The findings indicate that humans have such a strong tendency to anthropomorphize robots that we can fall prey to emotional manipulation, researchers said.
Each volunteer was interviewed after their interaction with the robot; those who had refused to turn it off were asked why.
The researchers report that many of the volunteers refused simply because the robot asked. Others said they felt sorry for the robot or worried about doing something wrong.
(This story has not been edited by Devdiscourse staff and is auto-generated from a syndicated feed.)