Mind over matter: Decoded EEG signals translate into grasp control

EEG plotted on scalp maps, with each map corresponding to a lag as indicated on the horizontal axis. The overall percentage contribution of each lag is shown below each scalp map. Courtesy Jose Contreras-Vidal.

It’s been a busy five years for Jose L. Contreras-Vidal, director of the laboratory for noninvasive brain-machine interface systems at the University of Houston (UH) in the US.

In 2010, his team demonstrated the feasibility of decoding 3D hand movements from scalp electroencephalography (EEG). They took it one step further this year, and have now shown that these decoded EEG representations can be translated into prosthetic hand movements. No surgery is required — an amputee just imagines a grasping movement and decoded brain signals do the rest.

Published in the April issue of Frontiers, Contreras-Vidal’s US National Science Foundation (NSF)-funded breakthrough challenges the notion that EEG cannot provide a usable control signal for brain-machine interfaces. Contreras-Vidal, the Hugh Roy and Lillie Cranz Cullen Distinguished Professor of electrical and computer engineering, co-authored the Frontiers study with UH graduate students Harshavardhan Ashok Agashe, Andrew Young Paek, and Yuhang Zhang.

“It has been thought for quite some time that you needed to have penetrating electrodes to be able to decode enough information for the control of dexterous movements,” says Contreras-Vidal. “We have now shown this is actually possible without the risk of surgery.”

The research promises to improve our understanding of neural representations of movement during skill learning, and during rehabilitation of fine motor control after a brain injury. However, designing and calibrating the neural decoders is data-intensive and computationally expensive, and the longitudinal studies produce datasets running to several terabytes, Contreras-Vidal notes.

Using their lab’s small cluster, the researchers worked offline in MATLAB to build the architecture that enables grasp decoding. They applied independent component analysis (ICA) to reject unwanted artifacts from the EEG data, and used a search procedure over the electrode inputs to find the channel subsets that yielded higher decoding accuracy. To round out the pipeline, the team turned to machine learning, developing a multiple-kernel learning algorithm that screens the EEG data and discriminates between multiple grasp types.
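
The paper details the full pipeline; as a rough illustration of two of the ideas named above, the Python sketch below runs ICA on simulated EEG to remove a high-kurtosis artifact component, then classifies stand-in trial features with a fixed-weight combination of RBF kernels. This is not the authors' code: the data are synthetic, the component-selection rule and kernel weights are illustrative assumptions, and true multiple-kernel learning would learn the weights rather than fix them.

```python
# Minimal sketch (not the study's code) of two stages described above:
# (1) ICA-based artifact rejection and (2) a simplified multiple-kernel
# classifier. Data, component choices, and kernel weights are illustrative.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# --- Simulated EEG: 64 channels x 5000 samples, plus a sparse "blink" artifact
n_channels, n_samples = 64, 5000
eeg = rng.standard_normal((n_channels, n_samples))
blink = np.zeros(n_samples)
blink[::500] = 50.0                       # large, infrequent artifact bursts
eeg += np.outer(rng.standard_normal(n_channels), blink)

# --- (1) ICA: unmix, zero the artifact component, remix ---
ica = FastICA(n_components=20, random_state=0)
sources = ica.fit_transform(eeg.T)        # (samples, components)
kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
bad = np.argmax(kurt)                     # blink-like sources are highly kurtotic
sources[:, bad] = 0.0
eeg_clean = ica.inverse_transform(sources).T

# --- (2) Simplified multiple-kernel SVM over per-trial features ---
# True MKL learns the kernel weights; here they are fixed for brevity.
X = rng.standard_normal((120, 64))        # stand-in per-trial feature vectors
y = rng.integers(0, 4, size=120)          # four hypothetical grasp types
weights, gammas = [0.5, 0.5], [0.01, 0.1]
K = sum(w * rbf_kernel(X, X, gamma=g) for w, g in zip(weights, gammas))
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```

The channel-selection search is omitted here; in practice it could wrap a loop like this, scoring cross-validated accuracy for candidate electrode subsets.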

Volunteer demonstrates real-time closed-loop neuroprosthetic grasping control. Courtesy Jose Contreras-Vidal.


After finishing the computational heavy lifting, they recruited five research subjects between the ages of 20 and 28. These volunteers performed multiple grasping tasks while wearing a cap with electrodes attached to the scalp. Contreras-Vidal’s team captured their brain activity across eight cranial regions using a 64-channel active EEG system. A 56-year-old amputee later demonstrated proof of concept, achieving an 80% success rate over 100 trials.
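
The per-lag scalp maps shown above suggest a decoder in which grasp kinematics at each moment are predicted from EEG amplitudes at several preceding time lags. The sketch below is a minimal stand-in for such a decoder, not the study's: it fits a ridge regression on lagged copies of simulated 64-channel EEG and reports each lag's share of the learned weights, analogous to the percentages printed under the scalp maps. The dimensions, lag count, and ridge penalty are all assumed for illustration.

```python
# Minimal sketch (assumptions flagged above) of a time-lagged linear decoder.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_channels, n_samples = 64, 2000
lags = [0, 1, 2, 3]                         # lags in samples, illustrative

eeg = rng.standard_normal((n_samples, n_channels))
kinematics = rng.standard_normal(n_samples)  # stand-in grasp aperture signal

# Stack lagged copies of every channel into one design matrix.
max_lag = max(lags)
X = np.hstack([eeg[max_lag - l : n_samples - l] for l in lags])
y = kinematics[max_lag:]

model = Ridge(alpha=1.0).fit(X, y)

# Per-lag contribution: summed absolute weights in each lag block,
# analogous to the percentages shown under each scalp map.
w = np.abs(model.coef_).reshape(len(lags), n_channels).sum(axis=1)
print("lag contributions (%):", np.round(100 * w / w.sum(), 1))
```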

To explain their continued success, Contreras-Vidal says his team is guided by the metaphor of the mind as a ‘neural symphony.’ A local neural network in a specific brain area is like a violinist who works with other musicians to deliver the composition, he says. Whereas an invasive, surgically implanted neural interface may ‘listen’ to only one or two players in the symphony, EEG can ‘listen’ to the entire orchestra. What’s more, EEG interfaces can be tuned, like a radio, to the station playing the symphony.

Next up for the Contreras-Vidal ensemble is to run longitudinal studies to understand how stable the EEG signal remains over days and through various mental states. “We need to understand how fatigue, emotional state, attention, or even medication can affect EEG signals and therefore closed-loop neuroprosthetic control,” he says.

Live demonstration of neuroprosthetic grasping control. Courtesy Jose Contreras-Vidal.