Humans can ‘borrow’ robot hands as their own, scientists discover

Researchers from the Italian Institute of Technology and Brown University in the US have found that people can “sense” a robot’s hand as an extension of their body.

They found that performing a joint task with a humanoid robot can cause a person to feel like the robot’s hand is part of their body schema.

This research could lead to better-designed robots for close human interaction, such as in rehabilitation settings.

The body schema process

The project delved into whether the subtle, unconscious mechanisms that govern human-to-human interactions also emerge when a person interacts with a robot.

The human brain uses an unconscious process to create a “body schema” — an internal map of the body and its relationship to the surrounding space. 

This mental map helps us interact with our environment more efficiently, for instance, by allowing us to navigate around obstacles or reach for objects without thinking about it consciously. 

The brain can also incorporate tools, like a tennis racket, into this schema. The researchers wanted to determine if this process could extend to a robot, allowing it to become part of a person’s body schema.

In the experiment, participants worked with a child-sized humanoid robot named iCub.

They sliced a soap bar together, with the human and robot partners taking turns pulling a steel wire.

After the collaborative task, researchers used a visual test called the Posner cueing task to measure if the robot’s hand had become integrated into the participant’s body schema.

The results from 30 volunteers showed that people reacted faster when a visual cue appeared next to the robot’s hand, indicating that their brains treated it as a “near hand.”

This effect was only present in participants who had worked on the task with the robot.
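The near-hand effect described above is a reaction-time difference: cues appearing near the robot's hand are detected faster than cues appearing elsewhere. As a rough illustration (not the study's actual analysis pipeline, and using made-up reaction times), the effect can be computed from per-trial data like this:

```python
# Illustrative sketch only: hypothetical Posner-task trial data,
# not taken from the study. Each trial records where the visual
# cue appeared and the participant's reaction time in milliseconds.
from statistics import mean

trials = [
    ("near_robot_hand", 312), ("near_robot_hand", 298),
    ("near_robot_hand", 305), ("far_from_hand", 341),
    ("far_from_hand", 336), ("far_from_hand", 349),
]

def near_hand_effect(trials):
    """Mean RT advantage (ms) for cues near the hand vs. far away.

    A positive value means participants responded faster to cues
    near the (robot) hand, i.e. the brain treats it as a 'near hand'.
    """
    near = [rt for loc, rt in trials if loc == "near_robot_hand"]
    far = [rt for loc, rt in trials if loc == "far_from_hand"]
    return mean(far) - mean(near)

print(near_hand_effect(trials))
```

With the hypothetical numbers above, cues near the robot's hand are answered about 37 ms faster, which is the kind of difference such a task is designed to detect.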

Robot’s movement style also mattered

The research further revealed that the robot’s movement style significantly influenced this cognitive integration.

The effect was stronger when iCub’s gestures were broad, fluid, and well-synchronized with the human’s.

And physical proximity was a key factor; the closer the robot’s hand was to the person, the more pronounced the “near-hand effect” became.

Based on questionnaires, researchers found that participants who viewed the robot as more “competent and pleasant” showed a stronger cognitive bond and greater integration of the robot’s hand into their body schema.

“Attributing human-like traits or emotions to iCub further boosted the hand’s integration in the body schema; in other words, partnership and empathy enhanced the cognitive bond with the robot,” the researchers noted in the press release dated September 11.

The team’s controlled experiments with a humanoid robot offer new insights into human-machine interaction. 

The findings suggest that understanding psychological factors will be key to designing future robots that are intuitive, effective, and capable of adapting to human cues.

Such advancements are critical for applying robotics in motor rehabilitation, virtual reality, and assistive technologies, where seamless and natural interaction is paramount.

The findings were published in the journal iScience.