Decoding Multisensory Attention from Electroencephalography for Use in a Brain-Computer Interface

Brain-computer interfaces (BCIs) offer a non-verbal and covert way for humans to interact with a machine. They are designed to interpret a user’s brain state so that it can be translated into an action or used for other communication purposes. While most previous BCI studies focus on motor imagery or steady-state evoked potential paradigms, we investigated the feasibility of a BCI system using a task-based paradigm that can potentially be integrated in a more user-friendly and engaging manner. The user was presented with multiple simultaneous, spatially separated streams of auditory and/or tactile stimuli and directed to detect a pattern in one particular stream. We applied a model-free method to decode the stream-tracking effort from the EEG signal. The results showed that the proposed BCI system could capture attention for most of the participants using multisensory inputs. Successful decoding in real time would allow the user to communicate a “control” signal to the computer via attention.
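The core decoding idea can be illustrated with a toy sketch: if EEG activity tracks the attended stimulus stream more strongly than the unattended ones, a model-free decoder can simply pick the stream whose stimulus pattern correlates best with the recording. This is a simplified, single-channel illustration with simulated data, not the actual method or data from the talk; the function and variable names are hypothetical.

```python
import numpy as np

def decode_attended_stream(eeg, streams):
    """Toy model-free decoder: return the index of the stimulus stream
    whose pattern correlates most strongly with the (single-channel) EEG.
    No model is fitted; the score is just an absolute Pearson correlation."""
    scores = [abs(np.corrcoef(eeg, s)[0, 1]) for s in streams]
    return int(np.argmax(scores))

# Simulated example: two independent binary stimulus streams, and an "EEG"
# trace that follows the attended stream plus Gaussian noise.
rng = np.random.default_rng(0)
n_samples = 1000
streams = [rng.choice([0.0, 1.0], size=n_samples) for _ in range(2)]
attended = 1
eeg = streams[attended] + 0.5 * rng.standard_normal(n_samples)

print(decode_attended_stream(eeg, streams))  # → 1
```

A real system would of course work with multichannel EEG, band-pass filtering, and a more robust statistic computed over sliding windows so the “control” signal can be emitted in real time, but the correlation-based selection above captures the gist of tracking attention without a trained model.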

Talk slides:

Learn more about this and other talks at Microsoft Research:
