<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Norman, SL</style></author><author><style face="normal" font="default" size="100%">McFarland, DJ</style></author><author><style face="normal" font="default" size="100%">Miner, A</style></author><author><style face="normal" font="default" size="100%">Cramer, SC</style></author><author><style face="normal" font="default" size="100%">Wolbrecht, ET</style></author><author><style face="normal" font="default" size="100%">Jonathan Wolpaw</style></author><author><style face="normal" font="default" size="100%">Reinkensmeyer, DJ</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Controlling pre-movement sensorimotor rhythm can improve finger extension after stroke</style></title><secondary-title><style face="normal" font="default" size="100%">Journal of Neural Engineering</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">BCI</style></keyword><keyword><style  face="normal" font="default" size="100%">Motor control</style></keyword><keyword><style  face="normal" font="default" size="100%">Rehabilitation</style></keyword><keyword><style  face="normal" font="default" size="100%">robot</style></keyword><keyword><style  face="normal" font="default" size="100%">sensorimotor rhythm</style></keyword><keyword><style  face="normal" font="default" size="100%">Stroke</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2018</style></year><pub-dates><date><style  face="normal" font="default" size="100%">08/2018</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://stacks.iop.org/1741-2552/15/i=5/a=056026</style></url></web-urls></urls><volume><style face="normal" 
font="default" size="100%">15</style></volume><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">Objective. Brain–computer interface (BCI) technology is attracting increasing interest as a tool for enhancing recovery of motor function after stroke, yet the optimal way to apply this technology is unknown. Here, we studied the immediate and therapeutic effects of BCI-based training to control pre-movement sensorimotor rhythm (SMR) amplitude on robot-assisted finger extension in people with stroke. Approach. Eight people with moderate to severe hand impairment due to chronic stroke completed a four-week three-phase protocol during which they practiced finger extension with assistance from the FINGER robotic exoskeleton. In Phase 1, we identified spatiospectral SMR features for each person that correlated with the intent to extend the index and/or middle finger(s). In Phase 2, the participants learned to increase or decrease SMR features given visual feedback, without movement. In Phase 3, the participants were cued to increase or decrease their SMR features, and when successful, were then cued to immediately attempt to extend the finger(s) with robot assistance. Main results. Of the four participants that achieved SMR control in Phase 2, three initiated finger extensions with a reduced reaction time after decreasing (versus increasing) pre-movement SMR amplitude during Phase 3. Two also extended at least one of their fingers more forcefully after decreasing pre-movement SMR amplitude. Hand function, measured by the box and block test (BBT), improved by 7.3  ±  7.5 blocks versus 3.5  ±  3.1 blocks in those with and without SMR control, respectively. Higher BBT scores at baseline correlated with a larger change in BBT score. Significance. 
These results suggest that learning to control person-specific pre-movement SMR features associated with finger extension can improve finger extension ability after stroke for some individuals. These results merit further investigation in a rehabilitation context.</style></abstract><issue><style face="normal" font="default" size="100%">5</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Dennis J. McFarland</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Characterizing multivariate decoding models based on correlated EEG spectral features.</style></title><secondary-title><style face="normal" font="default" size="100%">Clinical neurophysiology : official journal of the International Federation of Clinical Neurophysiology</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">brain-computer interface</style></keyword><keyword><style  face="normal" font="default" size="100%">multicollinearity</style></keyword><keyword><style  face="normal" font="default" size="100%">multivariate decoding</style></keyword><keyword><style  face="normal" font="default" size="100%">sensorimotor rhythm</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2013</style></year><pub-dates><date><style  face="normal" font="default" size="100%">07/2013</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/23466267</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">124</style></volume><pages><style face="normal" font="default" size="100%">1297–1302</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" 
size="100%">OBJECTIVE:
Multivariate decoding methods are popular techniques for analysis of neurophysiological data. The present study explored potential interpretative problems with these techniques when predictors are correlated.
METHODS:
Data from sensorimotor rhythm-based cursor control experiments were analyzed offline with linear univariate and multivariate models. Features were derived from autoregressive (AR) spectral analysis of varying model order, which produced predictors that varied in their degree of correlation (i.e., multicollinearity).
RESULTS:
The use of multivariate regression models resulted in much better prediction of target position than univariate regression models. However, with lower-order AR features, interpretation of the spectral patterns of the weights was difficult. This is likely due to the high degree of multicollinearity present with lower-order AR features.
CONCLUSIONS:
Care should be exercised when interpreting the pattern of weights of multivariate models with correlated predictors. Comparison with univariate statistics is advisable.
SIGNIFICANCE:
While multivariate decoding algorithms are very useful for prediction, their utility for interpretation may be limited when predictors are correlated.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Friedrich, Elisabeth V. C.</style></author><author><style face="normal" font="default" size="100%">Dennis J. McFarland</style></author><author><style face="normal" font="default" size="100%">Neuper, Christa</style></author><author><style face="normal" font="default" size="100%">Theresa M Vaughan</style></author><author><style face="normal" font="default" size="100%">Peter Brunner</style></author><author><style face="normal" font="default" size="100%">Jonathan Wolpaw</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">A scanning protocol for a sensorimotor rhythm-based brain-computer interface.</style></title><secondary-title><style face="normal" font="default" size="100%">Biological psychology</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">BCI</style></keyword><keyword><style  face="normal" font="default" size="100%">brain-computer interface</style></keyword><keyword><style  face="normal" font="default" size="100%">scanning protocol</style></keyword><keyword><style  face="normal" font="default" size="100%">sensorimotor rhythm</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style  face="normal" font="default" size="100%">02/2009</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/18786603</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">80</style></volume><pages><style face="normal" font="default"
size="100%">169–175</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">The scanning protocol is a novel brain-computer interface (BCI) implementation that can be controlled with sensorimotor rhythms (SMRs) of the electroencephalogram (EEG). The user views a screen that shows four choices in a linear array with one marked as target. The four choices are successively highlighted for 2.5s each. When a target is highlighted, the user can select it by modulating the SMR. An advantage of this method is the capacity to choose among multiple choices with just one learned SMR modulation. Each of 10 naive users trained for ten 30 min sessions over 5 weeks. User performance improved significantly (p&lt;0.001) over the sessions and ranged from 30 to 80% mean accuracy of the last three sessions (chance accuracy=25%). The incidence of correct selections depended on the target position. These results suggest that, with further improvements, a scanning protocol can be effective. The ultimate goal is to expand it to a large matrix of selections.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Nijboer, Femke</style></author><author><style face="normal" font="default" size="100%">Adrian Furdea</style></author><author><style face="normal" font="default" size="100%">Gunst, Ingo</style></author><author><style face="normal" font="default" size="100%">Mellinger, Jürgen</style></author><author><style face="normal" font="default" size="100%">Dennis J. 
McFarland</style></author><author><style face="normal" font="default" size="100%">Niels Birbaumer</style></author><author><style face="normal" font="default" size="100%">Kübler, Andrea</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">An auditory brain-computer interface (BCI).</style></title><secondary-title><style face="normal" font="default" size="100%">Journal of neuroscience methods</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">auditory feedback</style></keyword><keyword><style  face="normal" font="default" size="100%">brain-computer interface</style></keyword><keyword><style  face="normal" font="default" size="100%">EEG</style></keyword><keyword><style  face="normal" font="default" size="100%">locked-in state</style></keyword><keyword><style  face="normal" font="default" size="100%">motivation</style></keyword><keyword><style  face="normal" font="default" size="100%">sensorimotor rhythm</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2008</style></year><pub-dates><date><style  face="normal" font="default" size="100%">01/2008</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/17399797</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">167</style></volume><pages><style face="normal" font="default" size="100%">43–50</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">Brain-computer interfaces (BCIs) translate brain activity into signals controlling external devices. BCIs based on visual stimuli can maintain communication in severely paralyzed patients, but only if intact vision is available. Debilitating neurological disorders however, may lead to loss of intact vision. 
The current study explores the feasibility of an auditory BCI. Sixteen healthy volunteers participated in three training sessions consisting of 30 2-3 min runs in which they learned to increase or decrease the amplitude of sensorimotor rhythms (SMR) of the EEG. Half of the participants were presented with visual and half with auditory feedback. Mood and motivation were assessed prior to each session. Although BCI performance in the visual feedback group was superior to that of the auditory feedback group, there was no difference in performance at the end of the third session. Participants in the auditory feedback group learned more slowly, but four out of eight reached an accuracy of over 70% correct in the last session, comparable to the visual feedback group. Decreasing performance of some participants in the visual feedback group was related to mood and motivation. We conclude that with sufficient training time an auditory BCI may be as efficient as a visual BCI. Mood and motivation play a role in learning to use a BCI.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Jonathan Wolpaw</style></author><author><style face="normal" font="default" size="100%">Dennis J. McFarland</style></author><author><style face="normal" font="default" size="100%">Neat, G. W.</style></author><author><style face="normal" font="default" size="100%">Forneris, C.
A.</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">An EEG-based brain-computer interface for cursor control.</style></title><secondary-title><style face="normal" font="default" size="100%">Electroencephalography and clinical neurophysiology</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Communication</style></keyword><keyword><style  face="normal" font="default" size="100%">computer control</style></keyword><keyword><style  face="normal" font="default" size="100%">EEG</style></keyword><keyword><style  face="normal" font="default" size="100%">mu rhythm</style></keyword><keyword><style  face="normal" font="default" size="100%">operant conditioning</style></keyword><keyword><style  face="normal" font="default" size="100%">prosthesis</style></keyword><keyword><style  face="normal" font="default" size="100%">sensorimotor rhythm</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">1991</style></year><pub-dates><date><style  face="normal" font="default" size="100%">03/1991</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/1707798</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">78</style></volume><pages><style face="normal" font="default" size="100%">252–259</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">This study began development of a new communication and control modality for individuals with severe motor deficits. We trained normal subjects to use the 8-12 Hz mu rhythm recorded from the scalp over the central sulcus of one hemisphere to move a cursor from the center of a video screen to a target located at the top or bottom edge. 
Mu rhythm amplitude was assessed by on-line frequency analysis and translated into cursor movement: larger amplitudes moved the cursor up and smaller amplitudes moved it down. Over several weeks, subjects learned to change mu rhythm amplitude quickly and accurately, so that the cursor typically reached the target in 3 sec. The parameters that translated mu rhythm amplitudes into cursor movements were derived from evaluation of the distributions of amplitudes in response to top and bottom targets. The use of these distributions was a distinctive feature of this study and the key factor in its success. Refinements in training procedures and in the distribution-based method used to translate mu rhythm amplitudes into cursor movements should further improve this 1-dimensional control. Achievement of 2-dimensional control is under study. The mu rhythm may provide a significant new communication and control option for disabled individuals.</style></abstract></record></records></xml>