He explained that if appropriate interfaces are not designed, human operators may experience greater frustration and place less trust in vehicles as they become more autonomous. The lack of transparency in the robot’s motivations can be confusing. For example, telling a user that the robot will follow them doesn’t necessarily explain how the user can expect the robot to deal with dynamic obstacles. Will the robot plan its motion in a map (which works well in static environments), or will it emphasize reactive obstacle avoidance (which works well in dynamic environments)? The human does not need to understand how the robot reasons, but the human does need to be able to predict robot behavior at some level.
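To make the distinction concrete, reactive obstacle avoidance is often sketched as a potential-field blend: the robot follows the heading a map-based planner suggests, but nearby obstacles push that heading away. The snippet below is a minimal, hypothetical illustration of that idea; the function name, parameters, and weighting are illustrative assumptions, not 5D Robotics code.

```python
import math

def reactive_heading(planned_heading, obstacles, safe_dist=1.0):
    """Blend a planner's heading (radians) with reactive repulsion from
    nearby obstacles. Each obstacle is a (distance, bearing) pair in the
    robot's frame. A simple potential-field sketch, not a real product API."""
    # Start with the direction the map-based planner wants to go.
    x = math.cos(planned_heading)
    y = math.sin(planned_heading)
    for dist, bearing in obstacles:
        if dist < safe_dist:
            # Push away from close obstacles, more strongly when nearer.
            weight = (safe_dist - dist) / safe_dist
            x -= weight * math.cos(bearing)
            y -= weight * math.sin(bearing)
    return math.atan2(y, x)
```

With no nearby obstacles the robot simply follows the planned heading; an obstacle close on the left deflects the heading to the right. Purely reactive schemes like this handle moving obstacles well but can get trapped in dead ends that a map-based planner would avoid, which is exactly why the user cannot predict behavior without knowing which strategy the robot emphasizes.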
If we want humans and unmanned systems to work as a team, we need a way for them to develop a shared understanding. Bruemmer explained that if we do this successfully, we give the human and the robot a way to understand and compensate for each other’s limitations. Roboticists often try to hide robot limitations; Bruemmer thinks they would do better to state them clearly. These views are shared by Dr. Curtis Nielsen, who works with Bruemmer as the Chief Engineer at 5D. His doctoral dissertation studied the benefit of providing a 3D visual representation that supports shared understanding. Through a collaboration involving Scott Hartley (another researcher at 5D) and colleagues at the Idaho National Laboratory, a series of experiments indicated that people are more likely to use a low-performing robot that they can predict and understand than a high-performing robot that is complex and unpredictable. Research continues at 5D to ensure that the benefits of autonomous robot behavior can be achieved without the drawbacks of increased complexity, confusion, and distrust.
Bruemmer described an incident in which a lack of trust hindered the use of unmanned vehicles operating in a radioactive environment. A vehicle refused to go through a door that, to the human operators observing by video, looked entirely navigable. The operators repeatedly attempted to steer the vehicle through the door manually, an action the robot stubbornly resisted. This kind of “fight for control” characterizes Human-Robot Interaction (HRI) when the human is not given an appropriate window into the “mind” of an autonomous robot.