Brain-computer interfaces (BCIs) offer a non-verbal, covert way for humans to interact with a machine. They are designed to interpret the user’s brain state so that it can be translated into an action or used for other communication purposes. While most previous BCI studies have focused on motor-imagery or steady-state evoked-potential paradigms, we investigated the feasibility of a BCI system built on a task-based paradigm that could potentially be integrated in a more user-friendly and engaging manner. The user was presented with multiple simultaneous, spatially separated streams of auditory and/or tactile stimuli and directed to detect a pattern in one particular stream. We applied a model-free method to decode the stream-tracking effort from the EEG signal. The results showed that the proposed BCI system could capture attention from most of the participants using multisensory inputs. Successful real-time decoding would allow the user to communicate a “control” signal to the computer via attention.
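The abstract does not detail the model-free decoder, but the general idea of identifying which stimulus stream a listener is tracking can be illustrated with a toy simulation. The sketch below is purely hypothetical (the stream envelopes, mixing weights, and correlation-based scoring are assumptions, not the study's actual method): it generates two competing stimulus envelopes, simulates an EEG trace that tracks the attended stream more strongly, and decodes the attended stream as the one whose envelope correlates best with the EEG.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 64                        # illustrative sampling rate (Hz)
t = np.arange(fs * 10) / fs    # one 10-second trial

# Two competing stimulus-envelope streams (hypothetical stand-ins
# for the spatially separated auditory/tactile streams).
stream_a = rng.standard_normal(t.size)
stream_b = rng.standard_normal(t.size)

# Simulated single-channel EEG: the attended stream (A) is tracked
# more strongly than the ignored one, plus background noise.
eeg = 1.0 * stream_a + 0.3 * stream_b + 2.0 * rng.standard_normal(t.size)

def corr(x, y):
    """Pearson correlation between two equal-length signals."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

# Model-free decoding: pick the stream whose envelope correlates
# most strongly with the recorded EEG.
scores = {"A": corr(eeg, stream_a), "B": corr(eeg, stream_b)}
decoded = max(scores, key=scores.get)
print(decoded)
```

In a real system the same comparison would run on short sliding windows of multichannel EEG, so that the decoded label could serve as the real-time "control" signal described above.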
Talk slides: https://www.microsoft.com/en-us/research/uploads/prod/2019/09/Decoding-Multisensory-Attention-from-Electroencephalography-for-Use-in-a-Brain-Computer-Interface-SLIDES.pdf
Learn more about this and other talks at Microsoft Research: https://www.microsoft.com/en-us/research/video/decoding-multisensory-attention-from-electroencephalography-for-use-in-a-brain-computer-interface/