New research on eye-hand co-ordination could help prosthetic limbs feel more real
By conducting experiments using virtual reality (VR) technology, researchers found a correlation between eye-hand co-ordination and measures of embodiment that could help prosthetic limbs function more like a natural limb.
New University of Alberta research could enhance eye-hand co-ordination patterns, making prosthetic limbs feel more seamless. The research is led by Craig Chapman, an associate professor in the Faculty of Kinesiology, Sport, and Recreation, and Jacqueline Hebert, a professor in the Division of Physical Medicine and Rehabilitation, in collaboration with Ewen Lavoie, a resident physician and former U of A PhD student.
By conducting experiments using VR technologies, such as headsets and tracking gloves, the researchers found a connection between eye-hand co-ordination and embodiment measures.
Lavoie explained that able-bodied individuals often don’t think about haptic feedback, a form of sensory feedback based on touch, in their everyday actions. In reality, however, there is a “dynamic interplay” between motor outputs and sensory inputs.
“Using that information together makes us successful in interacting with objects.”
“Someone with an upper limb prosthetic … [doesn’t] get that touch feedback from their hand, so they’re having to change their visual behaviour … because there’s nothing telling them that there’s still an object in their hand,” Lavoie said.
This demonstrates how a lack of sensory feeling impacts the eye-hand co-ordination of an individual, changing the way they go about interacting with an object, Lavoie explained.
The researchers recruited able-bodied individuals for a virtual experiment with two different conditions: one where participants actually hold an object and another where they only see it. This allowed them to test how sensory feedback impacts eye-hand co-ordination.
The task used “[is] called the pasta box task, [where] people just move a Kraft Dinner box into some shelves and out of some shelves in a relatively standardized pattern,” Lavoie said.
Study conducted entirely in VR allowed researchers to measure people’s embodiment
Some individuals “were still seeing everything virtually, but [the researchers] overlaid a real world pasta box where a virtual pasta box was, so when they reached out to grasp the virtual pasta box, they actually felt the real box,” Lavoie added.
The other condition “didn’t have the pasta box there at all, the real world one, [but it] still had the visual of the virtual pasta box, and so they’d grasp an object and they wouldn’t feel anything as they moved it,” Lavoie explained.
He said this emulates how many prosthetic limb users function without any haptic feedback.
“What we really wanted to see was how does changing haptic feedback, on or off, affect how humans interact with objects.”
According to Lavoie, the experiment tested how embodied people felt toward the virtual limbs, using a subjective questionnaire in which participants rated their agreement with statements like “the virtual arms felt like they were part of my body” or “I felt like I had control over the virtual arms.”
The researchers found that when people felt the pasta box, they reported higher levels of embodiment, which was unsurprising to them, Lavoie said.
They were surprised, however, to find a “correlation between the eye-hand co-ordination measures and the embodiment measures.”
“We could see that embodiment actually will shift with people moving more towards normalized eye-hand co-ordination measures.”
If researchers can develop prototypes that restore sensation to prosthetic limb users, their eye-hand co-ordination patterns could improve. As a result, feelings of embodiment toward their limb may also increase, Lavoie explained.
“Hopefully, as we start to build these types of prototypes, people can function better and they feel more embodied with that limb.”