Robot body language affects human responses

Courtney Chin Apr 6, 2009

As technology aimed at improving our lives becomes more advanced and integrated into our routines, machines not only need to be simple to use, but also easy to interact with. One category of machinery that has been the subject of a growing body of research is robotics.
Bilge Mutlu, a Carnegie Mellon Ph.D. candidate in the Human-Computer Interaction Institute, has taken this research one step further to find ways in which humans can better interact with robots.

Originally trained in product design in Turkey and the United States, Mutlu also has a master’s degree from Carnegie Mellon’s School of Design.
His interest in robots stemmed from his desire to make appliances more intelligent, but he soon discovered the vast potential robots have to impact society.
Mutlu works in collaboration with the world’s largest human-robot interaction lab — the Japan-based Advanced Telecommunications Research Institute International (ATR). Mutlu also works with the chief researchers of ATR, Takayuki Kanda and Hiroshi Ishiguro. Together, the two teams have been designing social mechanisms for robots and sharing information. At last month’s Human-Robot Interaction Conference in San Diego, Calif., Mutlu, along with Kanda and Ishiguro, as well as other ATR scientists, won the “Best Paper Award” for their paper “Footing in Human-Robot Conversations: How Robots Might Shape Participant Roles Using Gaze Cues.”

Mutlu has conducted several experiments testing how perceptive humans are to a robot’s behavior, and has found that the more human-like a robot acts, the better subjects interact with it; he says the subjects gain “social and cognitive benefits.”
“Robots have to serve as a central interface for other forms of technology around us and offer us a social interaction paradigm to interact with them,” Mutlu noted. “[Robots can have] a significant impact in the real world through practical applications and through scientific exploration. My work focuses on robots that embody human physical, cognitive, and social capabilities.”

To experiment with how humans can become more attentive to robots, Mutlu and his colleagues ran several tests in which the robots exhibited human idiosyncrasies. In one test, subjects were read a story by a robot. Afterward, the subjects were tested on the number of details they remembered from the story. The more the robot glanced at a subject, the better that subject’s recall of the story’s details.
In another test, the subject was seated across from a robot and a dozen different objects were placed between them.
The robot internally selected an object but did not move.

The subject’s job was to determine what object the robot had chosen by asking questions.
The researchers found that when the robot made subtle eye glances to the object, it took fewer questions for the subjects to correctly guess the right object.
When quizzed afterward about the eye glances, about 75 percent of the subjects responded that they did not even notice them.
This led Mutlu to hypothesize that the signals were subconsciously detected by the subjects.
Another interesting finding was that humans generally performed better in experiments with Ishiguro’s human-like robot, called Geminoid. Geminoid is an android with basic artificial intelligence capabilities programmed to act like a human.

This natural reaction, Mutlu says, is an automatic response to social cues, a phenomenon psychologists call “mindlessness,” in which humans carry out their normal routines without realizing what they are doing.

“Robots have the potential to draw on this automatic propensity to evoke social responses from people,” he added.
Other robots that Mutlu and ATR have developed and worked with include Honda’s ASIMO, a four-foot-tall robot that can carry out straightforward actions like flipping switches or using doorknobs, and ATR’s own Robovie, a robot that can respond to simple questions.
The newest robot Mutlu is working with is called Wakamaru, which, according to Mitsubishi Heavy Industries, is an “independent personality ... designed to live with humans.”

The ASIMO and Wakamaru robots currently reside at Carnegie Mellon, while Geminoid and Robovie are at ATR in Japan.
According to Kanda, early applications of Mutlu and ATR’s human-robot research could include robots that entertain people or physically assist them.

“So the primary goal is to establish basic communication capabilities of robots, which also needs some degree of understanding of humans about how they perceive anthropomorphized robots,” Kanda said. “It will give us various possibilities to use functions of robots. One metaphor could be a Windows system, which is to use many applications on a computer; in a robot, such general interface function will be needed, which I want to realize.”

Mutlu says his three overarching goals for the direction of the research are to study more complex social behaviors in humans and design these behaviors for robots, use robots as tools for experimental behavioral research, and develop robotic applications for real-world domains such as education and autism therapy.