ISSN: 1530-888X
Title: A graphical model framework for decoding in the visual ERP-based BCI speller
Published: 01/2011, vol. 23, pp. 160-82
Abstract:
We present a graphical model framework for decoding in the visual ERP-based speller system. The proposed framework allows researchers to build generative models from which the decoding rules are obtained in a straightforward manner. We suggest two models for generating brain signals conditioned on the stimulus events. Both models incorporate letter frequency information but assume different dependencies between brain signals and stimulus events. For both models, we derive decoding rules and perform a discriminative training. We show on real visual speller data how decoding performance improves by incorporating letter frequency information and using a more realistic graphical model for the dependencies between the brain signals and the stimulus events. Furthermore, we discuss how the standard approach to decoding can be seen as a special case of the graphical model framework. The letter also gives more insight into the discriminative approach for decoding in the visual speller system.
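The decoding rule the abstract describes — a generative model of brain signals conditioned on stimulus events, combined with a letter-frequency prior — can be sketched as a Bayes-rule posterior over candidate letters. This is a minimal illustration, not the authors' actual models; the likelihood values and the four-letter alphabet are hypothetical.

```python
import numpy as np

# Hypothetical per-letter evidence: log-likelihood of the observed EEG
# epochs under the generative model, summed over the stimulus events in
# which each candidate letter participated (letters A..D).
log_likelihood = np.array([-1.2, -0.4, -2.0, -0.9])

# Letter-frequency information enters as a prior (e.g. estimated from a
# text corpus); here an invented distribution over A..D.
letter_prior = np.array([0.40, 0.25, 0.20, 0.15])

# Bayes rule: posterior over letters given the brain signals.
log_post = log_likelihood + np.log(letter_prior)
log_post -= np.logaddexp.reduce(log_post)  # normalize in log space
posterior = np.exp(log_post)

decoded = "ABCD"[int(np.argmax(posterior))]
```

The same machinery covers the standard approach as a special case: with a uniform prior, the decision reduces to picking the letter with the highest summed classifier evidence.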
Keywords: Artificial Intelligence; Computer User Training; Discrimination Learning; Electroencephalography; Evoked Potentials; Evoked Potentials, Visual; Humans; Language; Models, Neurological; Models, Theoretical; Reading; Signal Processing, Computer-Assisted; User-Computer Interface; Visual Cortex; Visual Perception
Authors: Martens, S M M; Mooij, J M; Hill, Jeremy; Farquhar, Jason; Schölkopf, B
URL: http://www.ncbi.nlm.nih.gov/pubmed/20964540

ISSN: 1872-8952
Title: Transition from the locked-in to the completely locked-in state: a physiological analysis
Published: 06/2011, vol. 122, pp. 925-33
Abstract:
Objective: To clarify the physiological and behavioral boundaries between the locked-in state (LIS) and the completely locked-in state (CLIS) (no voluntary eye movements, no communication possible) through electrophysiological data, and to secure brain-computer interface (BCI) communication.
Methods: Electromyography from facial muscles and the external anal sphincter (EAS), electrooculography, and electrocorticographic data during different psychophysiological tests were acquired to define electrophysiological differences in an amyotrophic lateral sclerosis (ALS) patient with an intracranially implanted grid of 112 electrodes, over nine months during which the patient passed from the LIS to the CLIS.
Results: At the very end of the LIS there was no facial muscle or external anal sphincter activity, but eye control remained. Eye movements were slow and lasted only for short periods. During CLIS, event-related brain potentials (ERPs) to passive limb movements and auditory stimuli were recorded; vibrotactile stimulation of different body parts elicited no ERP response.
Conclusions: The results presented contradict the commonly accepted assumption that the EAS is the last muscle remaining under voluntary control, and demonstrate complete loss of eye movements in CLIS. The eye muscles were shown to be the last muscle group under voluntary control. The findings suggest that ALS is a multisystem disorder, affecting even afferent sensory pathways.
Significance: Auditory and proprioceptive brain-computer interface (BCI) systems are the only remaining communication channels in CLIS.
Keywords: Adult; Amyotrophic Lateral Sclerosis; Area Under Curve; Brain; Communication Aids for Disabled; Disease Progression; Electroencephalography; Electromyography; Humans; Male; Signal Processing, Computer-Assisted; User-Computer Interface
Authors: Ramos Murguialday, A; Hill, Jeremy; Bensch, M; Martens, S M M; Halder, S; Nijboer, F; Schölkopf, Bernhard; Birbaumer, Niels; Gharabaghi, A
URL: http://www.ncbi.nlm.nih.gov/pubmed/20888292

ISSN: 1741-2552
Title: Overlap and refractory effects in a brain-computer interface speller based on the visual P300 event-related potential
Published: 04/2009, vol. 6, article 026003
Abstract:
We reveal the presence of refractory and overlap effects in the event-related potentials in visual P300 speller datasets, and we show their negative impact on the performance of the system. This finding has important implications for how to encode the letters that can be selected for communication. However, we show that such effects depend on stimulus parameters: an alternative stimulus type based on apparent motion suffers less from refractory effects and leads to improved letter prediction performance.
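One practical consequence of refractory effects is for stimulus scheduling: if the same letter flashes again too soon, the second target response falls inside the refractory period of the first. A simple scheduler that enforces a minimum gap between repeats can be sketched as below; this is an illustrative heuristic, not the encoding studied in the paper, and `min_gap` is a hypothetical parameter.

```python
import random

def flash_sequence(letters, n_rounds, min_gap, rng=None):
    """Order letter flashes so that each letter reappears only after at
    least `min_gap` other flashes, spacing out successive target events."""
    rng = rng or random.Random(0)
    seq = []
    for _ in range(n_rounds):
        round_ = list(letters)
        rng.shuffle(round_)
        # Re-shuffle until the start of this round does not repeat any of
        # the last `min_gap` flashes of the previous round.
        while any(x in seq[-min_gap:] for x in round_[:min_gap]):
            rng.shuffle(round_)
        seq.extend(round_)
    return seq

seq = flash_sequence("ABCDEF", n_rounds=4, min_gap=2)
```

Each letter still flashes exactly once per round, so the evidence collected per letter stays balanced; only the ordering is constrained.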
Keywords: Algorithms; Brain; Cognition; Computer Simulation; Electroencephalography; Event-Related Potentials, P300; Humans; Models, Neurological; Pattern Recognition, Automated; Photic Stimulation; Semantics; Signal Processing, Computer-Assisted; Task Performance and Analysis; User-Computer Interface; Writing
Authors: Martens, S M M; Hill, Jeremy; Farquhar, Jason; Schölkopf, B
URL: http://www.ncbi.nlm.nih.gov/pubmed/19255462