"A graphical model framework for decoding in the visual ERP-based BCI speller." Neural Computation 23: 160-82 (January 2011). ISSN 1530-888X.
Abstract: We present a graphical model framework for decoding in the visual ERP-based speller system. The proposed framework allows researchers to build generative models from which the decoding rules are obtained in a straightforward manner. We suggest two models for generating brain signals conditioned on the stimulus events. Both models incorporate letter frequency information but assume different dependencies between brain signals and stimulus events. For both models, we derive decoding rules and perform discriminative training. We show on real visual speller data how decoding performance improves by incorporating letter frequency information and by using a more realistic graphical model for the dependencies between the brain signals and the stimulus events. Furthermore, we discuss how the standard approach to decoding can be seen as a special case of the graphical model framework. The letter also gives more insight into the discriminative approach for decoding in the visual speller system.
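The decoding rule the abstract describes combines classifier evidence with a letter-frequency prior. A minimal sketch of that idea, not the paper's actual model: the per-letter scores and the small frequency table below are invented for illustration only.

```python
import math

# Hypothetical per-letter evidence: log-likelihoods of the observed EEG
# epochs under "this letter was attended" (e.g. from a trained classifier).
# Values are made up for illustration.
log_likelihood = {"A": -1.2, "E": -1.5, "Q": -1.0, "T": -1.4}

# Approximate English letter frequencies (standard tables).
letter_freq = {"A": 0.082, "E": 0.127, "Q": 0.001, "T": 0.091}

def decode(log_likelihood, prior):
    """Pick the letter maximising log P(signals | letter) + log P(letter)."""
    return max(log_likelihood, key=lambda c: log_likelihood[c] + math.log(prior[c]))

# Without the prior, the rare letter Q wins on raw likelihood alone;
# with the frequency prior, the decision shifts toward common letters.
best_no_prior = max(log_likelihood, key=log_likelihood.get)
best_with_prior = decode(log_likelihood, letter_freq)
```

With the numbers above, `best_no_prior` is `"Q"` while `best_with_prior` is `"E"`, showing how a language prior can override weak evidence for an improbable letter.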
Keywords: Artificial Intelligence; Computer User Training; Discrimination Learning; Electroencephalography; Evoked Potentials; Evoked Potentials, Visual; Humans; Language; Models, Neurological; Models, Theoretical; Reading; Signal Processing, Computer-Assisted; User-Computer Interface; Visual Cortex; Visual Perception
Authors: Martens, S M M; Mooij, J M; Hill, Jeremy; Farquhar, Jason; Schölkopf, B
URL: http://www.ncbi.nlm.nih.gov/pubmed/20964540

"Overlap and refractory effects in a brain-computer interface speller based on the visual P300 event-related potential." Journal of Neural Engineering 6: 026003 (April 2009). ISSN 1741-2552.

Abstract: We reveal the presence of refractory and overlap effects in the event-related potentials in visual P300 speller datasets, and we show their negative impact on the performance of the system. This finding has important implications for how to encode the letters that can be selected for communication. However, we show that such effects are dependent on stimulus parameters: an alternative stimulus type based on apparent motion suffers less from the refractory effects and leads to an improved letter prediction performance.
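The overlap effect this abstract refers to can be illustrated with a toy simulation (an assumption-laden sketch, not the paper's analysis): when the stimulus onset asynchrony is shorter than the duration of the evoked response, successive ERPs superimpose in the recorded epochs. The template shape, latency, and SOA below are invented for illustration.

```python
import math

def erp_template(t, latency=0.3, width=0.1):
    """Toy ERP bump (Gaussian) peaking `latency` seconds after its stimulus."""
    return math.exp(-((t - latency) ** 2) / (2 * width ** 2))

soa = 0.125                            # stimulus onset asynchrony (s), shorter than the ERP
times = [i * 0.01 for i in range(60)]  # sample grid, 0 .. 0.59 s

# Response when two successive stimuli each evoke an ERP: the epochs
# overlap, so the recorded signal is a superposition of both responses.
isolated = [erp_template(t) for t in times]
overlapped = [erp_template(t) + erp_template(t, latency=0.3 + soa) for t in times]

# The superposition distorts the amplitude and shape relative to an
# isolated ERP, which is the kind of effect that degrades per-epoch
# classification in fast stimulus-presentation regimes.
```

Here the peak of the overlapped signal exceeds that of the isolated template, illustrating how responses to neighbouring stimuli contaminate each epoch at short SOAs.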
Keywords: Algorithms; Brain; Cognition; Computer Simulation; Electroencephalography; Event-Related Potentials, P300; Humans; Models, Neurological; Pattern Recognition, Automated; Photic Stimulation; Semantics; Signal Processing, Computer-Assisted; Task Performance and Analysis; User-Computer Interface; Writing
Authors: Martens, S M M; Hill, Jeremy; Farquhar, Jason; Schölkopf, B
URL: http://www.ncbi.nlm.nih.gov/pubmed/19255462