Now showing 1 - 8 of 8
  • 2016-08-15 Journal Article
    Dann, Benjamin; Michaels, Jonathan A.; Schaffelhofer, Stefan; Scherberger, Hansjörg (2016). Uniting functional network topology and oscillations in the fronto-parietal single unit network of behaving primates. eLife 5: e15719. DOI: 10.7554/eLife.15719. PMID: 27525488. License: CC BY 4.0.
    Abstract: The functional communication of neurons in cortical networks underlies higher cognitive processes. Yet, little is known about the organization of the single neuron network or its relationship to the synchronization processes that are essential for its formation. Here, we show that the functional single neuron network of three fronto-parietal areas during active behavior of macaque monkeys is highly complex. The network was closely connected (small-world) and consisted of functional modules spanning these areas. Surprisingly, the importance of different neurons to the network was highly heterogeneous with a small number of neurons contributing strongly to the network function (hubs), which were in turn strongly inter-connected (rich-club). Examination of the network synchronization revealed that the identified rich-club consisted of neurons that were synchronized in the beta or low frequency range, whereas other neurons were mostly non-oscillatory synchronized. Therefore, oscillatory synchrony may be a central communication mechanism for highly organized functional spiking networks.
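The entry above characterizes the functional single-neuron network in graph-theoretic terms (small-world connectivity, hub neurons, a rich-club). As a rough illustration of how such measures can be computed, here is a minimal Python sketch using networkx on a placeholder adjacency matrix; it is not the authors' analysis pipeline, and all data and thresholds are invented for demonstration.

```python
# Minimal sketch: graph measures (small-world, hubs, rich-club) for a functional
# single-neuron network. Assumes a binary matrix `adj` of significant pairwise
# functional connections already exists; here it is random placeholder data.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n = 120                                     # number of recorded neurons (placeholder)
adj = rng.random((n, n)) < 0.08             # placeholder connection matrix
adj = np.triu(adj, 1); adj = adj | adj.T    # symmetric, no self-connections

G = nx.from_numpy_array(adj.astype(int))

# Small-world character: high clustering and short path length vs. a random graph.
clustering = nx.average_clustering(G)
giant = G.subgraph(max(nx.connected_components(G), key=len))
path_len = nx.average_shortest_path_length(giant)

rand = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
rand_giant = rand.subgraph(max(nx.connected_components(rand), key=len))
sigma = (clustering / nx.average_clustering(rand)) / (
    path_len / nx.average_shortest_path_length(rand_giant))
print(f"clustering={clustering:.3f}, path length={path_len:.2f}, small-world sigma={sigma:.2f}")

# Hubs: neurons with unusually high degree; rich-club: how densely the hubs interconnect.
degrees = dict(G.degree())
hub_threshold = np.percentile(list(degrees.values()), 90)
hubs = [node for node, d in degrees.items() if d >= hub_threshold]
rc = nx.rich_club_coefficient(G, normalized=False)    # per-degree rich-club coefficient
print(f"{len(hubs)} hub neurons; rich-club coefficient at k={int(hub_threshold)}: "
      f"{rc.get(int(hub_threshold), float('nan')):.3f}")
```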
  • 2015 Journal Article
    Menz, Veera Katharina; Schaffelhofer, Stefan; Scherberger, Hansjörg (2015). Representation of continuous hand and arm movements in macaque areas M1, F5, and AIP: a comparative decoding study. Journal of Neural Engineering 12(5): 056016. DOI: 10.1088/1741-2560/12/5/056016. License: CC BY 3.0.
    Abstract: Objective. In the last decade, multiple brain areas have been investigated with respect to their decoding capability of continuous arm or hand movements. So far, these studies have mainly focused on motor or premotor areas like M1 and F5. However, there is accumulating evidence that the anterior intraparietal area (AIP) in the parietal cortex also contains information about continuous movement. Approach. In this study, we decoded 27 degrees of freedom representing complete hand and arm kinematics during a delayed grasping task from simultaneously recorded activity in areas M1, F5, and AIP of two macaque monkeys (Macaca mulatta). Main results. We found that all three areas provided decoding performances significantly above chance. In particular, M1 yielded the highest decoding accuracy, followed by F5 and AIP. Furthermore, we provide support for the notion that AIP does not only code categorical visual features of objects to be grasped, but also contains a substantial amount of temporal kinematic information. Significance. This fact could be utilized in future developments of neural interfaces restoring hand and arm movements.
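The study above decodes 27 degrees of freedom of continuous hand and arm kinematics from spiking activity in M1, F5, and AIP. A generic way to illustrate continuous decoding is a cross-validated linear (ridge) decoder on binned firing rates; the paper's actual decoding algorithm and preprocessing are not specified here, so the sketch below is an assumption-laden illustration with synthetic arrays.

```python
# Minimal sketch: decoding continuous joint kinematics from binned firing rates
# with a cross-validated linear (ridge) decoder. Array shapes and names are
# placeholders, not the study's data or method.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n_bins, n_neurons, n_dof = 2000, 150, 27        # time bins, units, joint angles
rates = rng.poisson(3.0, size=(n_bins, n_neurons)).astype(float)   # spike counts per bin
kinematics = rng.standard_normal((n_bins, n_dof))                  # joint-angle trajectories

def decode_cv(X, Y, n_splits=5):
    """Mean correlation between decoded and true kinematics, per degree of freedom."""
    corrs = np.zeros(Y.shape[1])
    for train, test in KFold(n_splits=n_splits, shuffle=False).split(X):
        model = Ridge(alpha=1.0).fit(X[train], Y[train])
        pred = model.predict(X[test])
        for d in range(Y.shape[1]):
            corrs[d] += np.corrcoef(pred[:, d], Y[test][:, d])[0, 1] / n_splits
    return corrs

accuracy = decode_cv(rates, kinematics)
print(f"mean decoding correlation across {n_dof} DOF: {np.nanmean(accuracy):.3f}")
```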
  • 2020 Journal Article
    Michaels, Jonathan A.; Schaffelhofer, Stefan; Agudelo-Toro, Andres; Scherberger, Hansjörg (2020). A goal-driven modular neural network predicts parietofrontal neural dynamics during grasping. Proceedings of the National Academy of Sciences 117(50): 32124–32135. DOI: 10.1073/pnas.2005087117.
  • 2015-03-01 Journal Article
    Schaffelhofer, S.; Sartori, M.; Scherberger, H.; Farina, D. (2015). Musculoskeletal representation of a large repertoire of hand grasping actions in primates. IEEE Transactions on Neural Systems and Rehabilitation Engineering 23(2): 210–220. DOI: 10.1109/tnsre.2014.2364776. PMID: 25350935.
    Abstract: Reach-to-grasp tasks have become popular paradigms for exploring the neural origin of hand and arm movements. This is typically investigated by correlating limb kinematics with electrophysiological signals from intracortical recordings. However, it has never been investigated whether reach and grasp movements could be well expressed in the muscle domain and whether this could bring improvements over current joint-domain task representations. In this study, we trained two macaque monkeys to grasp 50 different objects, which resulted in a high variability of hand configurations. A generic musculoskeletal model of the human upper extremity was scaled and morphed to match the specific anatomy of each individual animal. The primate-specific model was used to perform 3-D reach-to-grasp simulations driven by experimental upper limb kinematics derived from electromagnetic sensors. Simulations enabled extracting joint angles from 27 degrees of freedom and the instantaneous length of 50 musculotendon units. Results demonstrated both a more compact representation and a higher decoding capacity of grasping tasks when movements were expressed in the muscle kinematics domain rather than in the joint kinematics domain. Accessing musculoskeletal variables might improve our understanding of coding in cortical hand-grasping areas, with implications for the development of prosthetic hands.
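The abstract above reports that grasping movements are more compactly represented in the muscle-kinematics domain (50 musculotendon lengths) than in the joint-kinematics domain (27 joint angles). One simple way to quantify "compactness" is to count how many principal components are needed to explain a fixed fraction of variance in each domain; the sketch below shows that comparison on synthetic data, with the 90% criterion chosen arbitrarily rather than taken from the paper.

```python
# Minimal sketch: comparing how "compact" grasping movements are when expressed
# as joint angles (27 DOF) vs. musculotendon lengths (50 units), using PCA.
# Synthetic data and the 90% variance criterion are assumptions for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_samples = 5000
joint_angles = rng.standard_normal((n_samples, 27))      # placeholder joint kinematics
muscle_lengths = rng.standard_normal((n_samples, 50))     # placeholder musculotendon lengths

def components_for_variance(data, threshold=0.90):
    """Number of principal components needed to explain `threshold` of the variance."""
    explained = PCA().fit(data).explained_variance_ratio_
    return int(np.searchsorted(np.cumsum(explained), threshold) + 1)

print("joint domain: ", components_for_variance(joint_angles), "components for 90% variance")
print("muscle domain:", components_for_variance(muscle_lengths), "components for 90% variance")
```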
  • 2015 Journal Article
    Schaffelhofer, Stefan; Agudelo-Toro, Andres; Scherberger, Hansjörg (2015). Decoding a Wide Range of Hand Configurations from Macaque Motor, Premotor, and Parietal Cortices. The Journal of Neuroscience 35(3): 1068–1081. DOI: 10.1523/jneurosci.3594-14.2015.
    Abstract: Despite recent advances in decoding cortical activity for motor control, the development of hand prosthetics remains a major challenge. To reduce the complexity of such applications, higher cortical areas that also represent motor plans rather than just the individual movements might be advantageous. We investigated the decoding of many grip types using spiking activity from the anterior intraparietal (AIP), ventral premotor (F5), and primary motor (M1) cortices. Two rhesus monkeys were trained to grasp 50 objects in a delayed task while hand kinematics and spiking activity from six implanted electrode arrays (total of 192 electrodes) were recorded. Offline, we determined 20 grip types from the kinematic data and decoded these hand configurations and the grasped objects with a simple Bayesian classifier. When decoding from AIP, F5, and M1 combined, the mean accuracy was 50% (using planning activity) and 62% (during motor execution) for predicting the 50 objects (chance level, 2%) and substantially larger when predicting the 20 grip types (planning, 74%; execution, 86%; chance level, 5%). When decoding from individual arrays, objects and grip types could be predicted well during movement planning from AIP (medial array) and F5 (lateral array), whereas M1 predictions were poor. In contrast, predictions during movement execution were best from M1, whereas F5 performed only slightly worse. These results demonstrate for the first time that a large number of grip types can be decoded from higher cortical areas during movement preparation and execution, which could be relevant for future neuroprosthetic devices that decode motor plans.
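The paper above classifies 20 grip types and 50 objects from spiking activity with a "simple Bayesian classifier". The sketch below illustrates the general idea with a Gaussian naive Bayes classifier on per-trial firing-rate vectors; the specific classifier variant and feature construction used in the study may differ, and the data here are synthetic.

```python
# Minimal sketch: classifying grip types from per-trial firing-rate vectors with a
# simple (Gaussian naive) Bayes classifier, as a stand-in for the paper's
# "simple Bayesian classifier". Data shapes and preprocessing are placeholders.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials, n_units, n_grip_types = 1000, 192, 20
X = rng.poisson(5.0, size=(n_trials, n_units)).astype(float)   # per-trial firing rates
y = rng.integers(0, n_grip_types, size=n_trials)                # grip-type labels

acc = cross_val_score(GaussianNB(), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2%} (chance = {1 / n_grip_types:.0%})")
```

With random labels, accuracy stays near the 5% chance level, which is the same chance level the abstract quotes for 20 grip types.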
  • 2016 Journal Article
    Schaffelhofer, Stefan; Scherberger, Hansjörg (2016). Object vision to hand action in macaque parietal, premotor, and motor cortices. eLife 5: e15278. DOI: 10.7554/eLife.15278. PMID: 27458796. License: CC BY 4.0.
    Keywords: hand grasping; motor cortex; neuroscience; parallel recording; parietal cortex; premotor cortex; rhesus macaque; sensorimotor transformation
    Abstract: Grasping requires translating object geometries into appropriate hand shapes. How the brain computes these transformations is currently unclear. We investigated three key areas of the macaque cortical grasping circuit with microelectrode arrays and found cooperative but anatomically separated visual and motor processes. The parietal area AIP operated primarily in a visual mode. Its neuronal population revealed a specialization for shape processing, even for abstract geometries, and processed object features ultimately important for grasping. Premotor area F5 acted as a hub that shared the visual coding of AIP only temporarily and switched to highly dominant motor signals towards movement planning and execution. We visualize these non-discrete premotor signals that drive the primary motor cortex M1 to reflect the movement of the grasping hand. Our results reveal visual and motor features encoded in the grasping circuit and their communication to achieve transformation for grasping.
  • 2021 Journal Article
    Buchwald, Daniela; Schaffelhofer, Stefan; Dörge, Matthias; Dann, Benjamin; Scherberger, Hansjörg (2021). A Turntable Setup for Testing Visual and Tactile Grasping Movements in Non-human Primates. Frontiers in Behavioral Neuroscience 15. DOI: 10.3389/fnbeh.2021.648483.
    Abstract: Grasping movements are some of the most common movements primates do every day. They are important for social interactions as well as picking up objects or food. Usually, these grasping movements are guided by vision, but proprioceptive and haptic inputs contribute greatly. Since grasping behaviors are common and easy to motivate, they represent an ideal task for understanding the role of different brain areas during planning and execution of complex voluntary movements in primates. For experimental purposes, a stable and repeatable presentation of the same object as well as the variation of objects is important in order to understand the neural control of movement generation. This is even more the case when investigating the role of different senses for movement planning, where objects need to be presented in specific sensory modalities. We developed a turntable setup for non-human primates (macaque monkeys) to investigate visually and tactually guided grasping movements with an option to easily exchange objects. The setup consists of a turntable that can fit six different objects and can be exchanged easily during the experiment to increase the number of presented objects. The object turntable is connected to a stepper motor through a belt system to automate rotation and hence object presentation. By increasing the distance between the turntable and the stepper motor, metallic components of the stepper motor are kept at a distance from the actual recording setup, which allows using a magnetic-based data glove to track hand kinematics. During task execution, the animal sits in the dark and is instructed to grasp the object in front of it. A light above the object can be turned on for visual presentation of the objects, while the object can also remain in the dark for exclusive tactile exploration. A red LED, projected onto the object by a one-way mirror, serves as a grasp cue instructing the animal to start grasping the object. By comparing kinematic data from the magnetic-based data glove with simultaneously recorded neural signals, this setup enables the systematic investigation of neural population activity involved in the neural control of hand grasping movements.
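The turntable described above presents one of six objects by rotating it into position via a belt-driven stepper motor. The small arithmetic sketch below computes how many motor steps are needed to bring a given object slot to the front; steps per revolution, microstepping, and belt ratio are hypothetical values, since the abstract does not specify them.

```python
# Minimal sketch: computing stepper-motor steps to rotate a 6-slot object turntable
# to a target slot through a belt drive. All numeric parameters (steps/rev,
# microstepping, belt ratio) are hypothetical, not taken from the paper.
STEPS_PER_REV = 200        # full steps per motor revolution (assumed)
MICROSTEPS = 16            # driver microstepping factor (assumed)
BELT_RATIO = 3.0           # motor revolutions per turntable revolution (assumed)
N_SLOTS = 6                # object positions per turntable (from the abstract)

STEPS_PER_TURNTABLE_REV = STEPS_PER_REV * MICROSTEPS * BELT_RATIO

def steps_to_slot(current_slot: int, target_slot: int) -> int:
    """Steps to advance the turntable from current_slot to target_slot,
    always rotating in one direction to keep the belt consistently tensioned."""
    slots_forward = (target_slot - current_slot) % N_SLOTS
    return round(slots_forward * STEPS_PER_TURNTABLE_REV / N_SLOTS)

# Example: present object 4 while object 1 currently faces the animal.
print(steps_to_slot(1, 4))   # -> 4800 steps with the assumed parameters
```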
  • 2012 Journal Article
    Schaffelhofer, Stefan; Scherberger, Hansjörg (2012). A new method of accurate hand- and arm-tracking for small primates. Journal of Neural Engineering 9(2): 026025. DOI: 10.1088/1741-2560/9/2/026025. License: CC BY-NC-SA 3.0.
    Abstract: The investigation of grasping movements in cortical motor areas depends heavily on the measurement of hand kinematics. Currently used methods for small primates either require a large number of sensors or provide insufficient accuracy. Here, we present both a novel glove based on electromagnetic tracking sensors that can operate at a rate of 100 Hz and a new modeling method that allows 27 degrees of freedom (DOF) of the hand and arm to be monitored using only seven sensors. A rhesus macaque was trained to wear the glove while performing precision and power grips during a delayed grasping task in the dark without noticeable hindrance. During five recording sessions, all 27 joint angles and their positions could be tracked reliably. Furthermore, the field generator did not interfere with electrophysiological recordings below 1 kHz and did not affect single-cell separation. Measurements with the glove proved to be accurate during static and dynamic testing (mean absolute error below 2° and 3°, respectively). This makes the glove a suitable solution for characterizing electrophysiological signals with respect to hand grasping and in particular for brain–machine interface applications.
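The glove's accuracy above is quoted as a mean absolute error below 2° (static) and 3° (dynamic) across the 27 tracked joint angles. The sketch below shows how such a validation can be computed from tracked versus reference angle traces; the arrays are synthetic placeholders, not the study's measurements.

```python
# Minimal sketch: validating tracked joint angles against a reference measurement
# by mean absolute error (MAE), the accuracy measure quoted in the abstract
# (< 2 deg static, < 3 deg dynamic). Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_samples, n_joints = 3000, 27                      # e.g. 30 s at 100 Hz, 27 DOF
reference = rng.uniform(0.0, 90.0, (n_samples, n_joints))           # ground-truth angles (deg)
tracked = reference + rng.normal(0.0, 1.5, (n_samples, n_joints))   # glove estimates (deg)

mae_per_joint = np.mean(np.abs(tracked - reference), axis=0)
print(f"worst joint MAE: {mae_per_joint.max():.2f} deg, "
      f"overall MAE: {mae_per_joint.mean():.2f} deg")
```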