Decoding auditory and tactile attention for use in an EEG-based brain-computer interface
- Winko W. An,
- Hakim Si-Mohammed,
- Nicholas Huang,
- Hannes Gamper,
- Adrian KC Lee,
- Christian Holz,
- David Johnston,
- Mihai Jalobeanu,
- Dimitra Emmanouilidou,
- Ed Cutrell,
- Andrew D. Wilson,
- Ivan Tashev
International Winter Conference on Brain-Computer Interface | Published by IEEE
Brain-computer interface (BCI) systems offer a nonverbal and covert way for humans to interact with a machine. They are designed to interpret a user’s brain state, which can then be translated into action or used for other communication purposes. This study investigates the feasibility of a hands- and eyes-free BCI system based on auditory and tactile attention. Users were presented with multiple simultaneous streams of auditory or tactile stimuli and were directed to detect a pattern in one particular stream. We applied a linear classifier to decode this stream-tracking attention from the EEG signal. The results showed that the proposed BCI system could capture attention from most study participants using multisensory inputs, and showed potential for transfer learning across multiple sessions.
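The abstract's decoding step, a linear classifier applied to EEG features to identify the attended stream, can be illustrated with a minimal sketch. This is not the authors' pipeline: the features below are simulated Gaussian values standing in for attention-modulated EEG responses (e.g., band-power or evoked-response features), and linear discriminant analysis is used as one common choice of linear classifier.

```python
# Hypothetical sketch of linear attention decoding, NOT the paper's actual
# pipeline: features are simulated, and LDA stands in for "a linear classifier".
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_trials, n_features = 200, 32  # e.g. per-channel EEG features per trial

# Two classes: attention to stream A vs. stream B, separated by a small
# mean shift mimicking attention-modulated EEG responses.
X_a = rng.normal(0.0, 1.0, (n_trials, n_features))
X_b = rng.normal(0.5, 1.0, (n_trials, n_features))
X = np.vstack([X_a, X_b])
y = np.array([0] * n_trials + [1] * n_trials)

# Shuffle, then split into train and held-out halves.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
split = len(y) // 2

clf = LinearDiscriminantAnalysis().fit(X[:split], y[:split])
acc = clf.score(X[split:], y[split:])
print(f"held-out decoding accuracy: {acc:.2f}")
```

In a real BCI setting the train/test split would typically be cross-validated across trials or sessions, which is where the transfer-learning question raised in the abstract arises.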