Calapai, Antonino
Preferred name
Calapai, Antonino
Official Name
Calapai, Antonino
Alternative Name
Calapai, A.
Publications (5)
2016 · Journal Article
Calapai, A.; Berger, M.; Niessing, M.; Heisig, K.; Brockhausen, R.; Treue, S.; Gail, A. (2016). A cage-based training, cognitive testing and enrichment system optimized for rhesus macaques in neuroscience research. Behavior Research Methods, 49(1), 35–45. DOI: 10.3758/s13428-016-0707-3 · PMID: 26896242 · License: CC BY 4.0
Abstract: In neurophysiological studies with awake non-human primates (NHP), it is typically necessary to train the animals over a prolonged period of time on a behavioral paradigm before the actual data collection takes place. Rhesus monkeys (Macaca mulatta) are the most widely used primate animal model in systems neuroscience. Inspired by existing joystick- or touchscreen-based systems designed for a variety of monkey species, we built and successfully employed a stand-alone cage-based training and testing system for rhesus monkeys (eXperimental Behavioral Instrument, XBI). The XBI is mobile and easy to handle by both experts and non-experts; animals can work with only minimal physical restraints, yet the ergonomic design successfully encourages stereotypical postures with a consistent positioning of the head relative to the screen. The XBI allows computer-controlled training of the monkeys with a large variety of behavioral tasks and reward protocols typically used in systems and cognitive neuroscience research.

2017 · Journal Article
Berger, M.; Calapai, A.; Stephan, V.; Niessing, M.; Burchardt, L.; Gail, A.; Treue, S. (2017). Standardized automated training of rhesus monkeys for neuroscience research in their housing environment. Journal of Neurophysiology, article jn.00614.2017. DOI: 10.1152/jn.00614.2017 · PMID: 29142094
Abstract: Teaching non-human primates the complex cognitive behavioral tasks that are central to cognitive neuroscience research is an essential and challenging endeavor. For scientific success it is crucial that the animals learn to interpret the often complex task rules and act on them reliably and enduringly. To achieve consistent behavior and comparable learning histories across animals, it is desirable to standardize training protocols. Automating the training can significantly reduce the time invested by the person training the animal, and self-paced training schedules with individualized learning speeds based on automatic updating of task conditions could enhance the animals' motivation and welfare. We developed a training paradigm for across-task unsupervised training (AUT) of successively more complex cognitive tasks, administered through a stand-alone housing-based system optimized for rhesus monkeys in neuroscience research settings (Calapai et al. 2016). The AUT revealed inter-individual differences in long-term learning progress between animals, helping to characterize learning personalities, as well as commonalities, helping to identify easier and more difficult learning steps in the training protocol. Our results demonstrate that (1) rhesus monkeys stay engaged with the AUT over months despite access to water and food outside the experimental sessions, albeit with fewer interactions than under conventional fluid-controlled training; (2) with unsupervised training across sessions and task levels, rhesus monkeys can learn tasks of sufficient complexity for state-of-the-art cognitive neuroscience in their housing environment; and (3) AUT learning progress is primarily determined by the number of interactions with the system rather than mere exposure time.

2022-11-29 · Journal Article
Cabrera-Moreno, J.; Jeanson, L.; Jeschke, M.; Calapai, A. (2022). Group-based, autonomous, individualized training and testing of long-tailed macaques (Macaca fascicularis) in their home enclosure to a visuo-acoustic discrimination task. Frontiers in Psychology, 13. DOI: 10.3389/fpsyg.2022.1047242 · License: CC BY 4.0
Affiliation (all authors): Cognitive Hearing in Primates (CHiP) Group, Auditory Neuroscience and Optogenetics Laboratory, German Primate Center, Leibniz-Institute for Primate Research, Göttingen, Germany
Abstract: In recent years, the utility and efficiency of automated procedures for cognitive assessment in psychology and neuroscience have been demonstrated in non-human primates (NHP). This approach mimics conventional shaping principles of breaking down a final desired behavior into smaller components that can be trained in a staircase manner. When combined with home-cage-based approaches, this could lead to a reduction in human workload, enhancement in data quality, and improvement in animal welfare. However, to our knowledge, there are no reported attempts to develop automated training and testing protocols for long-tailed macaques (Macaca fascicularis), a ubiquitous NHP model in neuroscience and pharmaceutical research. In the current work, we present results from six long-tailed macaques trained with an automated unsupervised training (AUT) protocol introducing the animals to the basics of a two-alternative choice (2AC) task in which they had to discriminate a conspecific vocalization from a pure tone, using images presented on a touchscreen to report their response. We found that the animals (1) consistently engaged with the device across several months; (2) interacted in bouts of high engagement; (3) alternated peacefully to interact with the device; and (4) smoothly ascended from step to step in the visually guided section of the procedure, in line with previous results from other NHPs. However, we also found (5) that the animals' performance remained at chance level once the acoustically guided steps were reached; and (6) that engagement decreased significantly with decreasing performance during the transition from the visually to the acoustically guided sections. We conclude that with an autonomous approach it is possible to train long-tailed macaques in their social group, using computer vision techniques and without dietary restriction, to solve a visually guided discrimination task but not an acoustically guided one. We provide suggestions on what future attempts could take into consideration to successfully instruct acoustically guided discrimination tasks.

2022 · Journal Article (Research Paper)
Calapai, A.; Cabrera-Moreno, J.; Moser, T.; Jeschke, M. (2022). Flexible auditory training, psychophysics, and enrichment of common marmosets with an automated, touchscreen-based system. Nature Communications, 13(1), article 1648. DOI: 10.1038/s41467-022-29185-9 · PMID: 35347139 · License: CC BY 4.0 · Related: EXC 2067: Multiscale Bioimaging; RG Moser (Molecular Anatomy, Physiology and Pathology of Sound Encoding)
Abstract: Devising new and more efficient protocols to analyze the phenotypes of non-human primates, as well as their complex nervous systems, is rapidly becoming of paramount importance. This is because genome-editing techniques, recently adapted to non-human primates, have established new animal models for fundamental and translational research. One aspect in particular, namely cognitive hearing, has been difficult to assess compared to visual cognition. To address this, we devised autonomous, standardized, and unsupervised training and testing of the auditory capabilities of common marmosets with a cage-based, stand-alone, wireless system. All marmosets tested voluntarily operated the device on a daily basis and progressed from naïve to experienced at their own pace and with ease. Through a series of experiments, we show here that the animals autonomously learn to associate sounds with images, to flexibly discriminate sounds, and to detect sounds of varying loudness. The developed platform and training principles combine in-cage training of common marmosets for cognitive and psychoacoustic assessment with an enriched environment that does not rely on dietary restriction or social separation, in compliance with the 3Rs principle.

2020 · Journal Article
Xue, C.; Calapai, A.; Krumbiegel, J.; Treue, S. (2020). Sustained spatial attention accounts for the direction bias of human microsaccades. Scientific Reports, 10(1). DOI: 10.1038/s41598-020-77455-7
Abstract: Small ballistic eye movements, so-called microsaccades, occur even while foveating an object. Previous studies using covert attention tasks have shown that shortly after a symbolic spatial cue specifying a behaviorally relevant location, microsaccades tend to be directed toward the cued location. This suggests that microsaccades can serve as an index of the covert orientation of spatial attention. However, this hypothesis faces two major challenges. First, effects associated with visual spatial attention are hard to distinguish from those associated with the contemplation of foveating a peripheral stimulus. Second, it is unclear whether endogenously sustained attention alone can bias microsaccade directions without a spatial cue on each trial. To address the first issue, we investigated the direction of microsaccades in human subjects while they attended to a behaviorally relevant location and prepared a response eye movement either toward or away from this location. We found that microsaccade directions are biased toward the attended location rather than toward the saccade target. To tackle the second issue, we verbally indicated the location to attend before the start of each block of trials, excluding potential visual cue-specific effects on microsaccades. Our results indicate that sustained spatial attention alone reliably produces the microsaccade direction effect. Overall, our findings demonstrate that sustained spatial attention, even in the absence of saccade planning or a spatial cue, is sufficient to explain the direction bias observed in microsaccades.