Advancing Brain–Computer Interfaces for Rehabilitation and Assistive Technologies
2026.03.10
Research
Motor imagery, or imagined limb movements, can power brain–computer interface (BCI) devices, such as prostheses and wheelchairs, supporting rehabilitation for people with neuromusculoskeletal disorders. However, conventional decoding methods often fail to capture the complex spatiotemporal variations in electroencephalography signals. Now, researchers from Chiba University, Japan, have developed a novel Embedding-Driven Graph Convolutional Network that decodes dynamic patterns in brain activity, offering improved adaptability and generalizability to advance next-generation BCI technologies.
Motor imagery electroencephalography (EEG) signals depict changes in brain activity during imagined limb movements. Conventional methods, however, often fail to capture these spatiotemporal variations. Researchers from Chiba University have developed a novel Embedding-Driven Graph Convolutional Network that can decode the spatiotemporal heterogeneity in EEG signals, advancing brain–computer interface technologies.
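To give a rough sense of the idea behind graph-convolutional EEG decoding, the sketch below shows a single generic graph-convolution step in NumPy, treating EEG electrodes as graph nodes and their spatial adjacency as edges. This is a hypothetical illustration of the general technique, not the authors' Embedding-Driven Graph Convolutional Network; the adjacency matrix, feature dimensions, and function names are assumptions for the example.

```python
import numpy as np

def graph_conv(features, adjacency, weights):
    """One standard GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    features  : (channels, in_features)  per-electrode signal descriptors
    adjacency : (channels, channels)     electrode spatial adjacency
    weights   : (in_features, out_features) learnable projection
    """
    a_hat = adjacency + np.eye(adjacency.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))  # degree^-1/2
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt                # symmetric normalization
    return np.maximum(a_norm @ features @ weights, 0.0)     # ReLU activation

# Toy example: 4 EEG channels with 3 features each, projected to 2 features.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)  # hypothetical electrode adjacency
feats = rng.standard_normal((4, 3))          # e.g., band powers per channel
w = rng.standard_normal((3, 2))              # layer weights (learned in practice)
out = graph_conv(feats, adj, w)
print(out.shape)  # (4, 2): updated feature vector for each electrode
```

Each electrode's new features mix information from its spatial neighbors, which is how graph convolutions exploit the electrode layout that conventional channel-independent decoders ignore.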