When it comes to robots, humans can be a little too trusting. In a series of experiments at Georgia Tech that simulated a building fire, people ignored the emergency exits and followed instructions from a robot — even though they’d been told it might be faulty.
The study involved 42 volunteers who were asked to follow a “guidance robot” through an office to a conference room. They weren’t told the true nature of the test.
The robot sometimes led participants to the wrong room, where it circled a couple of times before exiting. At other times the robot stopped moving, and a researcher told the participants it had broken down.
You might expect those malfunctions to have dented people’s trust in the robot, especially in a life-or-death situation. But apparently not.
Source: Computerworld