BCI Robotic Hand Achieves New Finger-Level Milestone

College of Engineering, Carnegie Mellon University

Robotic systems have the potential to greatly enhance daily living for the over one billion individuals worldwide who experience some form of disability. Brain-computer interfaces (BCIs) present a compelling option by enabling direct communication between the brain and external devices, bypassing traditional muscle-based control.

While invasive BCIs have demonstrated the ability to control robotic systems with high precision, their reliance on risky surgical implantation and ongoing maintenance restricts their use to a limited group of individuals with severe medical conditions.

Carnegie Mellon University professor Bin He has spent over two decades investigating noninvasive BCI solutions, particularly those based on electroencephalography (EEG), that are surgery-free and adaptable across a range of environments. His group has achieved a series of groundbreaking milestones using EEG-based BCIs, including the first successful flight of a drone, the first control of a robotic arm, and the first continuous control of a robotic hand.

As detailed in a new study in Nature Communications, He's lab brings noninvasive EEG-based BCI one step closer to everyday use by demonstrating real-time brain decoding of individual finger movement intentions and control of a dexterous robotic hand at the finger level.

"Improving hand function is a top priority for both impaired and able-bodied individuals, as even small gains can meaningfully enhance ability and quality of life," explained Bin He, professor of biomedical engineering at Carnegie Mellon University. "However, real-time decoding of dexterous individual finger movements using noninvasive brain signals has remained an elusive goal, largely due to the limited spatial resolution of EEG."

In a first-of-its-kind achievement for EEG-based BCI, He's group employed a real-time, noninvasive robotic control system that utilized movement execution and motor imagery of individual finger movements to drive corresponding robotic finger motions. Just by thinking about it, human subjects were able to successfully perform two- and three-finger control tasks. This was accomplished with the assistance of a novel deep-learning decoding strategy and a network fine-tuning mechanism for continuous decoding from noninvasive EEG signals.
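
The study's exact network design is not detailed in this release; the following is a minimal, hypothetical sketch of what a deep-learning EEG decoder with per-subject fine-tuning might look like, assuming a small 1D convolutional network over windows of multichannel EEG. The class name EEGFingerDecoder, the layer sizes, and the head-only fine-tuning choice are all illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch (not the study's actual model): a small 1D CNN that maps
# a window of multichannel EEG to continuous per-finger movement intentions,
# plus a fine-tuning step that adapts the decoder to a new session.
import torch
import torch.nn as nn

class EEGFingerDecoder(nn.Module):
    def __init__(self, n_channels=64, n_fingers=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=15, padding=7),  # temporal filtering
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, padding=4),
            nn.BatchNorm1d(64),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.head = nn.Linear(64, n_fingers)  # continuous output, one value per finger

    def forward(self, x):  # x: (batch, channels, time)
        return self.head(self.features(x).squeeze(-1))

def fine_tune(model, eeg, targets, epochs=5, lr=1e-4):
    """Adapt a pretrained decoder to a new session using a short calibration set."""
    opt = torch.optim.Adam(model.head.parameters(), lr=lr)  # update only the output head
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(eeg), targets)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    model = EEGFingerDecoder()
    eeg = torch.randn(16, 64, 256)   # 16 windows, 64 channels, 256 time samples
    targets = torch.randn(16, 3)     # continuous intention targets for 3 fingers
    fine_tune(model, eeg, targets)
    print(model(eeg[:1]))            # decoded finger intentions for one window
```

In such a setup, the convolutional features would be learned from pooled training data, while the brief per-session fine-tuning compensates for day-to-day variability in EEG; this mirrors, at a sketch level, the idea of continuous decoding with network adaptation described above.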

The goal moving forward is to build on this work to enable more refined finger-level tasks, such as typing.

"The insights gained from this study hold immense potential to elevate the clinical relevance of noninvasive BCIs and enable applications across a broader population," He added. "Our study highlights the transformative potential of EEG-based BCIs and their application beyond basic communication to intricate motor control."

