By Chuck Seegert, Ph.D.
Researchers from the German Primate Center have recently developed a way to predict hand movements by measuring neuronal activity. Studies with macaques, a genus of Old World monkeys, have identified the neural activity involved in many hand-grasping movements, knowledge that could be applied to controlling robotic prostheses.
A fundamental ability we depend on is grasping objects with precision and strength. The neural mechanisms behind this ability are complex, however, and remain poorly understood. Knowing how movements are planned and then executed in everyday activities could be crucial to developing control systems for robotic replacement limbs.
New research in non-human primates recently revealed how the brain plans these types of movements, according to a recent press release from the German Primate Center (Deutsches Primatenzentrum or DPZ). The DPZ researchers set out to discover how various hand movements are controlled, then used this understanding of brain activity to predict what grip an animal would apply. Working with macaques, the team trained the animals to grasp a variety of objects. To separate the neuronal signals involved in planning a movement from those driven by visual cues, the researchers briefly illuminated an object and then turned off the lights. The macaques waited a few seconds in the dark before picking up the object. During this delay, the macaques' brains planned the movement, and the resulting signals were recorded.
As their studies progressed, the researchers found that three areas of the brain were most important for hand movement, according to a recent study published by the team in The Journal of Neuroscience. These critical brain regions were the anterior intraparietal area (AIP), the ventral premotor area (F5), and the primary motor area (M1). To measure how the neurons were signaling, six electrode arrays, with a total of 192 recording electrodes, were implanted at strategic locations in these regions. The visual aspects of the task were processed mainly in the AIP, while the movements themselves were controlled by F5 and M1.
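The raw output of such electrode arrays is a set of spike trains, one per electrode, which must be reduced to firing rates before any decoding can happen. The sketch below is not the team's code; it simply illustrates, under an assumed data format (one array of spike times per electrode), how spike counts in a time window such as the planning delay become a feature vector of rates.

```python
import numpy as np

def firing_rate_features(spike_times, window_start, window_end):
    """Count each electrode's spikes inside a time window (e.g. the
    planning delay) and convert the counts to rates in spikes/second.

    spike_times: list of 1-D arrays, one per electrode, of spike times
    in seconds. Returns one firing rate per electrode.
    """
    duration = window_end - window_start
    return np.array([
        np.sum((t >= window_start) & (t < window_end)) / duration
        for t in spike_times
    ])

# Hypothetical example: 192 electrodes with random spike trains over 3 s.
rng = np.random.default_rng(0)
spikes = [np.sort(rng.uniform(0.0, 3.0, size=rng.integers(5, 60)))
          for _ in range(192)]

# Rates during a 1-second planning window.
features = firing_rate_features(spikes, window_start=1.0, window_end=2.0)
print(features.shape)
```

Each trial then yields one such 192-dimensional vector, which is what a grip decoder would be trained on.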
After some observation, the team was able to narrow the grips used to pick up the objects down to the 20 most common hand positions, according to the study. Once they had calibrated their instruments to these 20 grips, they could predict which grip the animal would use: prediction accuracy reached 86 percent during the planning phase and up to 96 percent during execution.
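Predicting one of 20 grips from neural activity is, in essence, a multi-class classification problem. The study's actual decoder is not reproduced here; as a minimal illustration, the sketch below trains a nearest-centroid classifier on synthetic firing-rate vectors, where "calibration" means estimating one mean activity pattern per grip.

```python
import numpy as np

rng = np.random.default_rng(1)
n_grips, n_electrodes, trials_per_grip = 20, 192, 15

# Synthetic data: each grip has its own mean firing-rate pattern,
# and individual trials are noisy samples around that pattern.
centroids_true = rng.uniform(5.0, 40.0, size=(n_grips, n_electrodes))
X = (np.repeat(centroids_true, trials_per_grip, axis=0)
     + rng.normal(0.0, 2.0, (n_grips * trials_per_grip, n_electrodes)))
y = np.repeat(np.arange(n_grips), trials_per_grip)

# "Calibration": estimate one centroid per grip from the training trials.
centroids = np.array([X[y == g].mean(axis=0) for g in range(n_grips)])

def decode_grip(rates):
    """Predict the grip whose centroid is closest to the rate vector."""
    dists = np.linalg.norm(centroids - rates, axis=1)
    return int(np.argmin(dists))

# Decode a fresh, noisy trial generated from grip 7's true pattern.
test_trial = centroids_true[7] + rng.normal(0.0, 2.0, n_electrodes)
predicted = decode_grip(test_trial)
print(predicted)
```

With well-separated patterns, even this simple decoder recovers the grip reliably; real neural data are far noisier, which is why the reported accuracies of 86 and 96 percent are notable.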
Once decoded, the macaque neural data could be used to control a robotic hand, according to the press release. This success showed that many different hand configurations could be produced from the neuronal signals generated during the animals' planning and execution. The finding could have great significance for paralyzed patients who have lost the neural connections between brain and limbs.
Controlling robotic prostheses is an active area of research at many institutions, since these technologies promise to improve care across a wide range of clinical applications. For example, robotic control theory was recently used to improve the performance of a prosthetic leg.
Image Credit: Stefan Schaffelhofer, The German Primate Center (DPZ)