In today’s war zones, robots are routinely used to inspect and defuse hazards or recover objects, on the understanding that the loss of a robot is a far more acceptable outcome than the death of a soldier. However, as robots become valued members of the team, there is a growing tendency to regard them as colleagues rather than machines.
Mark Billinghurst, Professor of Human-Computer Interaction at the University of South Australia, has collaborated with Dr. James Wen and other members of the United States Air Force Academy (USAFA) to investigate these relationships and their effects on team productivity and efficiency on the front line.
Their study shows that for robots to be fully integrated into a human-machine team (HMT), they must first be recognized as teammates. To that end, considerable effort has gone into making robots more ‘human-like’ by altering their physical appearance and capabilities.
While humanizing robots strengthens the working relationship between soldiers and their machines, it also raises the perceived value of robot team members in the minds of military personnel, leading to a heightened emotional response when the robot is placed under threat.
Using a simulation-based application, the researchers observed the emotional responses of two groups of participants who completed a range of simulated tasks with either a personified or a non-personified robot.
The study found that teams working with a personified robot were 12 percent less likely to put their robot at risk of destruction than teams working with a non-personified robot. The teams working with personified robots were also more sensitive to the robot’s health and to the possibility of seeing it ‘killed’ in action.
This is the first study to measure how empathy can change behavior when potential harm is introduced in a simulation.
Prof Billinghurst says the results directly demonstrate how emotional bonds can influence decision making in the field.
“We have evidence to show that teams working with a personified robot are considerably more mindful about limiting damage and harm towards it, but this can have significant consequences,” Prof Billinghurst said.
“Participants who restricted their use of the robots, or chose not to use them at all, achieved the same overall results as the teams that did, but at the cost of increased self-sacrifice in the form of working harder to gain the same outcome.”
For most of us, an emotional attachment to a robot is harmless. Sharing a bond with your Google Home speaker or Roomba vacuum can be calming and pleasurable, but empathy shown by a soldier towards a military robot can affect performance on the front line.
Prof Billinghurst said that rather than deploying the robot, participants working with a personified robot increased their own workload, were willing to take more personal risks, and would hesitate before putting the robot in danger, affecting their decision making under pressure.
“Such hesitation and empathic responses in these situations could have dangerous consequences for military personnel,” he added.
Where split-second decisions can mean the difference between life and death, it will become increasingly important to monitor soldiers working alongside robots.
Military robots are expected to be deployed more frequently in the future, requiring further research, training, and evaluation on the topic.
The research also has implications for a wide range of other human-robot collaborative tasks in non-military settings, such as in hospitals, on the factory floor, or even in the home.