<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Cao, Runnan</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Brandmeir, Nicholas J</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Wang, Shuo</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">A human single-neuron dataset for object recognition.</style></title><secondary-title><style face="normal" font="default" size="100%">Sci Data</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Sci Data</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Amygdala</style></keyword><keyword><style  face="normal" font="default" size="100%">Epilepsy</style></keyword><keyword><style  face="normal" font="default" size="100%">Hippocampus</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Neurons</style></keyword><keyword><style  face="normal" font="default" size="100%">Pattern Recognition, Visual</style></keyword><keyword><style  face="normal" font="default" size="100%">Recognition, Psychology</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2025</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2025 Jan 15</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">12</style></volume><pages><style face="normal" font="default" size="100%">79</style></pages><language><style face="normal" font="default" 
size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Object recognition is fundamental to how we interact with and interpret the world around us. The human amygdala and hippocampus play a key role in object recognition, contributing to both the encoding and retrieval of visual information. Here, we recorded single-neuron activity from the human amygdala and hippocampus when neurosurgical epilepsy patients performed a one-back task using naturalistic object stimuli. We employed two sets of naturalistic object images from leading datasets extensively used in primate neural recordings and computer vision models: we recorded 1204 neurons using the ImageNet stimuli, which included broader object categories (10 different images per category for 50 categories), and we recorded 512 neurons using the Microsoft COCO stimuli, which featured a higher number of images per category (50 different images per category for 10 categories). Together, our extensive dataset, offering the highest spatial and temporal resolution currently available in humans, will not only facilitate a comprehensive analysis of the neural correlates of object recognition but also provide valuable opportunities for training and validating computational models.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">1</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Tan, Gansheng</style></author><author><style face="normal" font="default" size="100%">Adams, Josh</style></author><author><style face="normal" font="default" size="100%">Donovan, Kara</style></author><author><style face="normal" font="default" size="100%">Demarest, Phillip</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" 
size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Gorlewicz, Jenna L</style></author><author><style face="normal" font="default" size="100%">Leuthardt, Eric C</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Does vibrotactile stimulation of the auricular vagus nerve enhance working memory? A behavioral and physiological investigation.</style></title><secondary-title><style face="normal" font="default" size="100%">Brain Stimul</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Brain Stimul</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Adult</style></keyword><keyword><style  face="normal" font="default" size="100%">Female</style></keyword><keyword><style  face="normal" font="default" size="100%">Galvanic Skin Response</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Male</style></keyword><keyword><style  face="normal" font="default" size="100%">Memory, Short-Term</style></keyword><keyword><style  face="normal" font="default" size="100%">Pupil</style></keyword><keyword><style  face="normal" font="default" size="100%">Vagus Nerve</style></keyword><keyword><style  face="normal" font="default" size="100%">Vagus Nerve Stimulation</style></keyword><keyword><style  face="normal" font="default" size="100%">Vibration</style></keyword><keyword><style  face="normal" font="default" size="100%">Young Adult</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2024</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2024 Mar-Apr</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">17</style></volume><pages><style face="normal" font="default" size="100%">460-468</style></pages><language><style 
face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;&lt;b&gt;BACKGROUND: &lt;/b&gt;Working memory is essential to a wide range of cognitive functions and activities. Transcutaneous auricular vagus nerve stimulation (taVNS) is a promising method to improve working memory performance. However, the feasibility and scalability of electrical stimulation are constrained by several limitations, such as auricular discomfort and inconsistent electrical contact.&lt;/p&gt;&lt;p&gt;&lt;b&gt;OBJECTIVE: &lt;/b&gt;We aimed to develop a novel and practical method, vibrotactile taVNS, to improve working memory. Further, we investigated its effects on arousal, measured by skin conductance and pupil diameter.&lt;/p&gt;&lt;p&gt;&lt;b&gt;METHOD: &lt;/b&gt;This study included 20 healthy participants. Behavioral response, skin conductance, and eye tracking data were concurrently recorded while the participants performed N-back tasks under three conditions: vibrotactile taVNS delivered to the cymba concha, earlobe (sham control), and no stimulation (baseline control).&lt;/p&gt;&lt;p&gt;&lt;b&gt;RESULTS: &lt;/b&gt;In 4-back tasks, which demand maximal working memory capacity, active vibrotactile taVNS significantly improved the performance metric d′ compared to the baseline but not to the sham. Moreover, we found that the reduction rate of d′ with increasing task difficulty was significantly smaller during vibrotactile taVNS sessions than in both baseline and sham conditions. Arousal, measured as skin conductance and pupil diameter, declined over the course of the tasks. Vibrotactile taVNS rescued this arousal decline, leading to arousal levels corresponding to optimal working memory levels. 
Moreover, pupil diameter and skin conductance level were higher during high-cognitive-load tasks when vibrotactile taVNS was delivered to the concha compared to baseline and sham.&lt;/p&gt;&lt;p&gt;&lt;b&gt;CONCLUSION: &lt;/b&gt;Our findings suggest that vibrotactile taVNS modulates the arousal pathway and could be a potential intervention for enhancing working memory.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">2</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Xie, Tao</style></author><author><style face="normal" font="default" size="100%">Adamek, Markus</style></author><author><style face="normal" font="default" size="100%">Cho, Hohyun</style></author><author><style face="normal" font="default" size="100%">Adamo, Matthew A</style></author><author><style face="normal" font="default" size="100%">Ritaccio, Anthony L</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Kubanek, Jan</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Graded decisions in the human brain.</style></title><secondary-title><style face="normal" font="default" size="100%">Nat Commun</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Nat Commun</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Adult</style></keyword><keyword><style  face="normal" font="default" size="100%">Brain</style></keyword><keyword><style  face="normal" font="default" size="100%">Choice Behavior</style></keyword><keyword><style  face="normal" font="default" size="100%">Decision 
Making</style></keyword><keyword><style  face="normal" font="default" size="100%">Female</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Male</style></keyword><keyword><style  face="normal" font="default" size="100%">Parietal Lobe</style></keyword><keyword><style  face="normal" font="default" size="100%">Uncertainty</style></keyword><keyword><style  face="normal" font="default" size="100%">Young Adult</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2024</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2024 May 21</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">15</style></volume><pages><style face="normal" font="default" size="100%">4308</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Decision-makers objectively commit to a definitive choice, yet at the subjective level, human decisions appear to be associated with a degree of uncertainty. Whether decisions are definitive (i.e., concluding in all-or-none choices), or whether the underlying representations are graded, remains unclear. To answer this question, we recorded intracranial neural signals directly from the brain while human subjects made perceptual decisions. The recordings revealed that broadband gamma activity reflecting each individual's decision-making process, ramped up gradually while being graded by the accumulated decision evidence. Crucially, this grading effect persisted throughout the decision process without ever reaching a definite bound at the time of choice. This effect was most prominent in the parietal cortex, a brain region traditionally implicated in decision-making. 
These results provide neural evidence for a graded decision process in humans and an analog framework for flexible choice behavior.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">1</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Blanpain, Lou T</style></author><author><style face="normal" font="default" size="100%">Cole, Eric R</style></author><author><style face="normal" font="default" size="100%">Chen, Emily</style></author><author><style face="normal" font="default" size="100%">Park, James K</style></author><author><style face="normal" font="default" size="100%">Walelign, Michael Y</style></author><author><style face="normal" font="default" size="100%">Gross, Robert E</style></author><author><style face="normal" font="default" size="100%">Cabaniss, Brian T</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Singer, Annabelle C</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Multisensory flicker modulates widespread brain networks and reduces interictal epileptiform discharges.</style></title><secondary-title><style face="normal" font="default" size="100%">Nat Commun</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Nat Commun</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Brain</style></keyword><keyword><style  face="normal" font="default" size="100%">Cross-Over Studies</style></keyword><keyword><style  face="normal" font="default" size="100%">Electroencephalography</style></keyword><keyword><style  face="normal" font="default" size="100%">Epilepsies, Partial</style></keyword><keyword><style  face="normal" font="default" 
size="100%">Epilepsy</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Temporal Lobe</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2024</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2024 Apr 11</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">15</style></volume><pages><style face="normal" font="default" size="100%">3156</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Modulating brain oscillations has strong therapeutic potential. Interventions that both non-invasively modulate deep brain structures and are practical for chronic daily home use are desirable for a variety of therapeutic applications. Repetitive audio-visual stimulation, or sensory flicker, is an accessible approach that modulates hippocampus in mice, but its effects in humans are poorly defined. We therefore quantified the neurophysiological effects of flicker with high spatiotemporal resolution in patients with focal epilepsy who underwent intracranial seizure monitoring. In this interventional trial (NCT04188834) with a cross-over design, subjects underwent different frequencies of flicker stimulation in the same recording session with the effect of sensory flicker exposure on local field potential (LFP) power and interictal epileptiform discharges (IEDs) as primary and secondary outcomes, respectively. Flicker focally modulated local field potentials in expected canonical sensory cortices but also in the medial temporal lobe and prefrontal cortex, likely via resonance of stimulated long-range circuits. 
Moreover, flicker decreased interictal epileptiform discharges, a pathological biomarker of epilepsy and degenerative diseases, most strongly in regions where potentials were flicker-modulated, especially the visual cortex and medial temporal lobe. This trial met the scientific goal and is now closed. Our findings reveal how multi-sensory stimulation may modulate cortical structures to mitigate pathological activity in humans.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">1</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Cao, Runnan</style></author><author><style face="normal" font="default" size="100%">Wang, Jinge</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Li, Xin</style></author><author><style face="normal" font="default" size="100%">Rutishauser, Ueli</style></author><author><style face="normal" font="default" size="100%">Brandmeir, Nicholas J</style></author><author><style face="normal" font="default" size="100%">Wang, Shuo</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Neural mechanisms of face familiarity and learning in the human amygdala and hippocampus.</style></title><secondary-title><style face="normal" font="default" size="100%">Cell Rep</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Cell Rep</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Amygdala</style></keyword><keyword><style  face="normal" font="default" size="100%">Facial Recognition</style></keyword><keyword><style  face="normal" font="default" 
size="100%">Hippocampus</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Learning</style></keyword><keyword><style  face="normal" font="default" size="100%">Pattern Recognition, Visual</style></keyword><keyword><style  face="normal" font="default" size="100%">Recognition, Psychology</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2024</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2024 Jan 23</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">43</style></volume><pages><style face="normal" font="default" size="100%">113520</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Recognizing familiar faces and learning new faces play an important role in social cognition. However, the underlying neural computational mechanisms remain unclear. Here, we record from single neurons in the human amygdala and hippocampus and find a greater neuronal representational distance between pairs of familiar faces than unfamiliar faces, suggesting that neural representations for familiar faces are more distinct. Representational distance increases with exposures to the same identity, suggesting that neural face representations are sharpened with learning and familiarization. Furthermore, representational distance is positively correlated with visual dissimilarity between faces, and exposure to visually similar faces increases representational distance, thus sharpening neural representations. Finally, we construct a computational model that demonstrates an increase in the representational distance of artificial units with training. 
Together, our results suggest that the neuronal population geometry, quantified by the representational distance, encodes face familiarity, similarity, and learning, forming the basis of face recognition and memory.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">1</style></issue></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Cho, Hohyun</style></author><author><style face="normal" font="default" size="100%">Adamek, Markus</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Novel cyclic homogeneous oscillation detection method for high accuracy and specific characterization of neural dynamics.</style></title><secondary-title><style face="normal" font="default" size="100%">Elife</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Elife</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Brain</style></keyword><keyword><style  face="normal" font="default" size="100%">Electrocorticography</style></keyword><keyword><style  face="normal" font="default" size="100%">Electroencephalography</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Signal Processing, Computer-Assisted</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2024</style></year><pub-dates><date><style  face="normal" font="default" size="100%">2024 Sep 06</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">12</style></volume><language><style face="normal" 
font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Determining the presence and frequency of neural oscillations is essential to understanding dynamic brain function. Traditional methods that detect peaks over 1/f noise within the power spectrum fail to distinguish between the fundamental frequency and harmonics of often highly non-sinusoidal neural oscillations. To overcome this limitation, we define fundamental criteria that characterize neural oscillations and introduce the cyclic homogeneous oscillation (CHO) detection method. We implemented these criteria based on an autocorrelation approach to determine an oscillation's fundamental frequency. We evaluated CHO by verifying its performance on simulated non-sinusoidal oscillatory bursts and validated its ability to determine the fundamental frequency of neural oscillations in electrocorticographic (ECoG), electroencephalographic (EEG), and stereoelectroencephalographic (SEEG) signals recorded from 27 human subjects. Our results demonstrate that CHO outperforms conventional techniques in accurately detecting oscillations. In summary, CHO demonstrates high precision and specificity in detecting neural oscillations in time and frequency domains. The method's specificity enables the detailed study of non-sinusoidal characteristics of oscillations, such as the degree of asymmetry and waveform of an oscillation. 
Furthermore, CHO can be applied to identify how neural oscillations govern interactions throughout the brain and to determine oscillatory biomarkers that index abnormal brain function.&lt;/p&gt;</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Gordon, Evan M</style></author><author><style face="normal" font="default" size="100%">Chauvin, Roselyne J</style></author><author><style face="normal" font="default" size="100%">Van, Andrew N</style></author><author><style face="normal" font="default" size="100%">Rajesh, Aishwarya</style></author><author><style face="normal" font="default" size="100%">Nielsen, Ashley</style></author><author><style face="normal" font="default" size="100%">Newbold, Dillan J</style></author><author><style face="normal" font="default" size="100%">Lynch, Charles J</style></author><author><style face="normal" font="default" size="100%">Seider, Nicole A</style></author><author><style face="normal" font="default" size="100%">Krimmel, Samuel R</style></author><author><style face="normal" font="default" size="100%">Scheidter, Kristen M</style></author><author><style face="normal" font="default" size="100%">Monk, Julia</style></author><author><style face="normal" font="default" size="100%">Miller, Ryland L</style></author><author><style face="normal" font="default" size="100%">Metoki, Athanasia</style></author><author><style face="normal" font="default" size="100%">Montez, David F</style></author><author><style face="normal" font="default" size="100%">Zheng, Annie</style></author><author><style face="normal" font="default" size="100%">Elbau, Immanuel</style></author><author><style face="normal" font="default" size="100%">Madison, Thomas</style></author><author><style face="normal" font="default" size="100%">Nishino, Tomoyuki</style></author><author><style face="normal" font="default" 
size="100%">Myers, Michael J</style></author><author><style face="normal" font="default" size="100%">Kaplan, Sydney</style></author><author><style face="normal" font="default" size="100%">Badke D'Andrea, Carolina</style></author><author><style face="normal" font="default" size="100%">Demeter, Damion V</style></author><author><style face="normal" font="default" size="100%">Feigelis, Matthew</style></author><author><style face="normal" font="default" size="100%">Ramirez, Julian S B</style></author><author><style face="normal" font="default" size="100%">Xu, Ting</style></author><author><style face="normal" font="default" size="100%">Barch, Deanna M</style></author><author><style face="normal" font="default" size="100%">Smyser, Christopher D</style></author><author><style face="normal" font="default" size="100%">Rogers, Cynthia E</style></author><author><style face="normal" font="default" size="100%">Zimmermann, Jan</style></author><author><style face="normal" font="default" size="100%">Botteron, Kelly N</style></author><author><style face="normal" font="default" size="100%">Pruett, John R</style></author><author><style face="normal" font="default" size="100%">Willie, Jon T</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Shimony, Joshua S</style></author><author><style face="normal" font="default" size="100%">Kay, Benjamin P</style></author><author><style face="normal" font="default" size="100%">Marek, Scott</style></author><author><style face="normal" font="default" size="100%">Norris, Scott A</style></author><author><style face="normal" font="default" size="100%">Gratton, Caterina</style></author><author><style face="normal" font="default" size="100%">Sylvester, Chad M</style></author><author><style face="normal" font="default" size="100%">Power, Jonathan D</style></author><author><style face="normal" font="default" size="100%">Liston, 
Conor</style></author><author><style face="normal" font="default" size="100%">Greene, Deanna J</style></author><author><style face="normal" font="default" size="100%">Roland, Jarod L</style></author><author><style face="normal" font="default" size="100%">Petersen, Steven E</style></author><author><style face="normal" font="default" size="100%">Raichle, Marcus E</style></author><author><style face="normal" font="default" size="100%">Laumann, Timothy O</style></author><author><style face="normal" font="default" size="100%">Fair, Damien A</style></author><author><style face="normal" font="default" size="100%">Dosenbach, Nico U F</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">A somato-cognitive action network alternates with effector regions in motor cortex.</style></title><secondary-title><style face="normal" font="default" size="100%">Nature</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Nature</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Animals</style></keyword><keyword><style  face="normal" font="default" size="100%">Brain Mapping</style></keyword><keyword><style  face="normal" font="default" size="100%">Child</style></keyword><keyword><style  face="normal" font="default" size="100%">Cognition</style></keyword><keyword><style  face="normal" font="default" size="100%">Datasets as Topic</style></keyword><keyword><style  face="normal" font="default" size="100%">Foot</style></keyword><keyword><style  face="normal" font="default" size="100%">Hand</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Infant</style></keyword><keyword><style  face="normal" font="default" size="100%">Infant, Newborn</style></keyword><keyword><style  face="normal" font="default" size="100%">Macaca</style></keyword><keyword><style  face="normal" 
font="default" size="100%">Magnetic Resonance Imaging</style></keyword><keyword><style  face="normal" font="default" size="100%">Motor Cortex</style></keyword><keyword><style  face="normal" font="default" size="100%">Mouth</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2023</style></year><pub-dates><date><style  face="normal" font="default" size="100%">05/2023</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">617</style></volume><pages><style face="normal" font="default" size="100%">351-359</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;Motor cortex (M1) has been thought to form a continuous somatotopic homunculus extending down the precentral gyrus from foot to face representations, despite evidence for concentric functional zones and maps of complex actions. Here, using precision functional magnetic resonance imaging (fMRI) methods, we find that the classic homunculus is interrupted by regions with distinct connectivity, structure and function, alternating with effector-specific (foot, hand and mouth) areas. These inter-effector regions exhibit decreased cortical thickness and strong functional connectivity to each other, as well as to the cingulo-opercular network (CON), critical for action and physiological control, arousal, errors and pain. This interdigitation of action control-linked and motor effector regions was verified in the three largest fMRI datasets. Macaque and pediatric (newborn, infant and child) precision fMRI suggested cross-species homologues and developmental precursors of the inter-effector system. A battery of motor and action fMRI tasks documented concentric effector somatotopies, separated by the CON-linked inter-effector regions. 
The inter-effectors lacked movement specificity and co-activated during action planning (coordination of hands and feet) and axial body movement (such as of the abdomen or eyebrows). These results, together with previous studies demonstrating stimulation-evoked complex actions and connectivity to internal organs such as the adrenal medulla, suggest that M1 is punctuated by a system for whole-body action planning, the somato-cognitive action network (SCAN). In M1, two parallel systems intertwine, forming an integrate-isolate pattern: effector-specific regions (foot, hand and mouth) for isolating fine motor control and the SCAN for integrating goals, physiology and body movement.&lt;/p&gt;</style></abstract><issue><style face="normal" font="default" size="100%">7960</style></issue></record></records></xml>