%0 Journal Article
%J Brain-Computer Interfaces
%D 2015
%T Identifying the Attended Speaker Using Electrocorticographic (ECoG) Signals.
%A Dijkstra, K.
%A Peter Brunner
%A Gunduz, Aysegul
%A Coon, W. G.
%A A. L. Ritaccio
%A Farquhar, Jason
%A Gerwin Schalk
%K auditory attention
%K brain-computer interface (BCI)
%K cocktail party
%K electrocorticography (ECoG)
%X People affected by severe neurodegenerative diseases (e.g., late-stage amyotrophic lateral sclerosis (ALS) or locked-in syndrome) eventually lose all muscular control. Thus, they can use neither traditional assistive communication devices that depend on muscle control nor brain-computer interfaces (BCIs) that depend on the ability to control gaze. While auditory and tactile BCIs can provide communication to such individuals, their use typically entails an artificial mapping between the stimulus and the communication intent, which makes these BCIs difficult to learn and use. In this study, we investigated the use of selective auditory attention to natural speech as an avenue for BCI communication. In this approach, the user communicates by directing his/her attention to one of two simultaneously presented speakers. We used electrocorticographic (ECoG) signals in the gamma band (70–170 Hz) to infer the identity of the attended speaker, thereby removing the need to learn such an artificial mapping. Our results from twelve human subjects show that a single cortical location over superior temporal gyrus or premotor cortex is typically sufficient to identify the attended speaker within 10 s and with 77% accuracy (chance level: 50%). These results lay the groundwork for future studies that may determine the real-time performance of BCIs based on selective auditory attention to speech.
%B Brain-Computer Interfaces
%G eng
%U https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4776341/
%R 10.1080/2326263X.2015.1063363

%0 Journal Article
%J Frontiers in Neuroscience
%D 2012
%T Communication and control by listening: towards optimal design of a two-class auditory streaming brain-computer interface.
%A Jeremy Hill
%A Moinuddin, Aisha
%A Häuser, Ann-Katrin
%A Kienzle, Stephan
%A Gerwin Schalk
%K auditory attention
%K auditory event-related potentials
%K brain-computer interface
%K dichotic listening
%K N1 potential
%K P3 potential
%X Most brain-computer interface (BCI) systems require users to modulate brain signals in response to visual stimuli. Thus, they may not be useful to people with limited vision, such as those with severe paralysis. One important approach for overcoming this issue is auditory streaming, in which a BCI system is driven by shifts of attention between two simultaneously presented auditory stimulus streams. Motivated by the long-term goal of translating such a system into a reliable, simple yes-no interface for clinical use, we aimed to answer two main questions. First, we asked which of two previously published variants provides superior performance: a fixed-phase (FP) design, in which the streams have equal period and opposite phase, or a drifting-phase (DP) design, in which the periods are unequal. We found FP to be superior to DP (p = 0.002): average performance levels were 80% and 72% correct, respectively. We were also able to show, in a pilot experiment with one subject, that auditory streaming can support continuous control and neurofeedback applications: by shifting attention between ongoing left and right auditory streams, the subject was able to control the position of a paddle in a computer game.
Second, we examined whether the system depends on eye movements, since eye movements and auditory attention are known to influence each other, and any dependence on the ability to move one's eyes would be a barrier to translation to paralyzed users. We found that, despite instructions, some subjects did make eye movements that were indicative of the direction of attention. However, there was no correlation across subjects between the reliability of the eye-movement signal and the reliability of the BCI system, indicating that our system worked independently of eye movements. Together, these findings are an encouraging step toward BCIs that provide practical communication and control options for the most severely paralyzed users.
%B Frontiers in Neuroscience
%V 6
%8 12/2012
%G eng
%U http://www.ncbi.nlm.nih.gov/pubmed/23267312
%R 10.3389/fnins.2012.00181