Towards Real-Time Classification of EEG Motor Imagery with Deep Learning

Authors

  • Aleksandrs Baskakovs, Aarhus University
  • Luke Ring, Aarhus University

Abstract

Brain-Computer Interfaces (BCIs) are widely used to identify a user's intention to control external devices by decoding motor imagery (MI) from electroencephalography (EEG). In recent years, deep learning (DL) has had a substantial impact on MI-EEG-based BCI. Deep learning is particularly attractive for MI-BCI because it requires little to no preprocessing, significantly reducing the latency between a patient's intention and the device's execution of the command, be it a prosthetic or a cursor. This study investigates the feasibility of using low-cost, dry-electrode EEG recordings of motor imagery to train neural networks that classify imagined right-hand fist clenches versus a resting condition, and of performing subsequent real-time online inference. This is important for many kinds of brain-computer interfaces, especially for people with impaired movement. The online pipeline is optimized to minimize latency, with a hard limit of 1 second from capture to classification. A complete end-to-end pipeline is provided; although high classification accuracy was not achieved, the framework establishes a clear path to rapid inference on consumer devices and suggests several avenues for improving the quality and accuracy of results.
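The online setting described above (continuous capture with a hard 1-second budget from capture to classification) can be sketched as a sliding-window inference loop. This is a minimal illustrative sketch, not the authors' implementation: the sample rate, the non-overlapping windowing, and the threshold-based `classify` stand-in for a trained network are all assumptions.

```python
import time
from collections import deque

SAMPLE_RATE_HZ = 250        # assumed headset sampling rate
WINDOW_SECONDS = 1.0        # hard latency budget from capture to classification
WINDOW_SAMPLES = int(SAMPLE_RATE_HZ * WINDOW_SECONDS)

def classify(window):
    """Hypothetical stand-in for a trained network: label one window."""
    # Toy rule: mean amplitude above 0.5 is treated as an imagined fist clench.
    return "fist" if sum(window) / len(window) > 0.5 else "rest"

def run_online(sample_stream):
    """Run sliding-window online inference over a stream of EEG samples."""
    buffer = deque(maxlen=WINDOW_SAMPLES)
    labels = []
    for sample in sample_stream:
        buffer.append(sample)
        if len(buffer) == WINDOW_SAMPLES:
            start = time.perf_counter()
            labels.append(classify(list(buffer)))
            latency = time.perf_counter() - start
            # Enforce the hard real-time limit on inference time.
            assert latency < WINDOW_SECONDS
            buffer.clear()  # non-overlapping windows, for simplicity
    return labels

# Two seconds of simulated single-channel data: rest, then imagined clench.
print(run_online([0.0] * 250 + [1.0] * 250))  # → ['rest', 'fist']
```

A real pipeline would replace `classify` with the trained model's forward pass and would typically use overlapping windows to produce predictions more often than once per second.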


Author Biographies

  • Aleksandrs Baskakovs, Aarhus University

    Department of Linguistics, Cognitive Science and Semiotics


  • Luke Ring, Aarhus University

    Department of Linguistics, Cognitive Science and Semiotics


Published

2023-05-15