Now showing 1–7 of 7
  • 2018 · Journal Article
    Hammerschmidt, W., Kagan, I., Kulke, L., & Schacht, A. (2018). Implicit reward associations impact face processing: Time-resolved evidence from event-related brain potentials and pupil dilations. NeuroImage, 179, 557–569. DOI: 10.1016/j.neuroimage.2018.06.055
  • 2022 · Journal Article
    Kulke, L., Brümmer, L., Pooresmaeili, A., & Schacht, A. (2022). Visual competition attenuates emotion effects during overt attention shifts. Psychophysiology, 59(11). DOI: 10.1111/psyp.14087 (open access, CC BY 4.0)
    Abstract: Numerous different objects are simultaneously visible in a person's visual field, competing for attention. This competition has been shown to affect eye movements and early neural responses toward stimuli, while the role of a stimulus' emotional meaning for mechanisms of overt attention shifts under competition is unclear. The current study combined EEG and eye tracking to investigate effects of competition and emotional content on overt shifts of attention to human face stimuli. Competition prolonged the latency of the P1 component and of saccades, while faces showing emotional expressions elicited an early posterior negativity (EPN). Remarkably, the emotion-related modulation of the EPN was attenuated when two stimuli were competing for attention compared to non-competition. In contrast, no interaction effects of emotional expression and competition were observed on other event-related potentials. This finding indicates that competition can decelerate attention shifts in general and also diminish emotion-driven attention capture, measured through the smaller effects of emotional expression on EPN amplitude. Reduction of the brain's responsiveness to emotional content in the presence of distractors contradicts models that postulate fully automatic processing of emotions.
    Summary: The study investigated overt attention to emotional faces using coregistered eye tracking and EEG, measuring whether emotional expressions affect early and late event-related potentials. It shows that emotion effects are attenuated and delayed when several targets are competing for attention, providing evidence that cortical processing of emotion is not automatic but is influenced by distractors.
    Funding: Göttingen University; H2020 European Research Council; Leibniz ScienceCampus Primate Cognition
  • 2020 · Journal Article
    Kulke, L., Feyerabend, D., & Schacht, A. (2020). A comparison of the Affectiva iMotions facial expression analysis software with EMG for identifying facial expressions of emotion. Frontiers in Psychology, 11. DOI: 10.3389/fpsyg.2020.00329 (open access, CC BY 4.0)
    Abstract: Human faces express emotions, informing others about their affective states. In order to measure expressions of emotion, facial electromyography (EMG) has widely been used, requiring electrodes and technical equipment. More recently, emotion recognition software has been developed that detects emotions from video recordings of human faces. However, its validity and comparability to EMG measures are unclear. The aim of the current study was to compare the Affectiva Affdex emotion recognition software by iMotions with EMG measurements of the zygomaticus major and corrugator supercilii muscles, concerning its ability to identify happy, angry, and neutral faces. Twenty participants imitated these facial expressions while videos and EMG were recorded. Happy and angry expressions were detected by both the software and by EMG above chance, while neutral expressions were more often falsely identified as negative by EMG compared to the software. Overall, EMG and software values correlated highly. In conclusion, the Affectiva Affdex software can identify facial expressions, and its results are comparable to EMG findings.
  • 2019 · Journal Article
    Kulke, L., Bayer, M., Grimm, A.-M., & Schacht, A. (2019). Differential effects of learned associations with words and pseudowords on event-related brain potentials. Neuropsychologia, 124, 182–191. DOI: 10.1016/j.neuropsychologia.2018.12.012, PMID: 30571974
    Abstract: Associated stimulus valence affects neural responses at an early processing stage. However, in the field of written language processing, it is unclear whether semantics of a word or low-level visual features affect early neural processing advantages. The current study aimed to investigate the role of semantic content on reward and loss associations. Participants completed a learning session to associate either words (Experiment 1, N = 24) or pseudowords (Experiment 2, N = 24) with different monetary outcomes (gain-associated, neutral, or loss-associated). Gain-associated stimuli were learned fastest. Behavioural and neural response changes based on the associated outcome were further investigated in separate test sessions. Responses were faster towards gain- and loss-associated than neutral stimuli if they were words, but not pseudowords. Early P1 effects of associated outcome occurred for both pseudowords and words. Specifically, loss association resulted in increased P1 amplitudes to pseudowords, compared to decreased amplitudes to words. Although visual features are likely to explain P1 effects for pseudowords, the inverse effect for words suggests that semantic content affects associative learning, potentially leading to stronger associations.
  • 2018 · Journal Article
    Hammerschmidt, W., Kulke, L., Broering, C., & Schacht, A. (2018). Money or smiles: Independent ERP effects of associated monetary reward and happy faces. PLOS ONE, 13(10), e0206142. DOI: 10.1371/journal.pone.0206142, PMID: 30359397 (open access, CC BY 4.0)
    Abstract: In comparison to neutral faces, facial expressions of emotion are known to gain attentional prioritization, mainly demonstrated by means of event-related potentials (ERPs). Recent evidence indicated that such preferential processing can also be elicited by neutral faces when associated with increased motivational salience via reward. It remains, however, an open question whether impacts of inherent emotional salience and associated motivational salience might be integrated. To this aim, expressions and monetary outcomes were orthogonally combined. Participants (N = 42) learned to explicitly categorize happy and neutral faces as either reward- or zero-outcome-related via an associative learning paradigm. ERP components (P1, N170, EPN, and LPC) were measured throughout the experiment, and separately analyzed before (learning phase) and after (consolidation phase) reaching a pre-defined learning criterion. Happy facial expressions boosted early processing stages, as reflected in enhanced amplitudes of the N170 and EPN, both during learning and consolidation. In contrast, effects of monetary reward became evident only after successful learning and in the form of enlarged amplitudes of the LPC, a component linked to higher-order evaluations. Interactions between expressions and associated outcome were absent in all ERP components of interest. The present study provides novel evidence that acquired salience impacts stimulus processing, but independently of the effects driven by happy facial expressions.
  • 2017 · Preprint
    Hammerschmidt, W., Kagan, I., Kulke, L., & Schacht, A. (2017). Implicit reward associations impact face processing: Time-resolved evidence from event-related brain potentials and pupil dilations. Preprint. DOI: 10.1101/232538
    Abstract: The present study aimed at investigating whether associated motivational salience causes preferential processing of inherently neutral faces similar to emotional expressions by means of event-related brain potentials (ERPs) and changes of the pupil size. To this aim, neutral faces were implicitly associated with monetary outcome, while participants (N = 44) performed a subliminal face-matching task that ensured performance around chance level and thus an equal proportion of gain, loss, and zero outcomes. Motivational context strongly impacted processing of all, even task-irrelevant, stimuli prior to the target face, indicated by enhanced amplitudes of subsequent ERP components and increased pupil size. In a separate test session, previously associated faces as well as novel faces with emotional expressions were presented within the same task but without motivational context and performance feedback. Most importantly, previously gain-associated faces amplified the LPC, although the individually contingent face-outcome assignments were not made explicit during the learning session. Emotional expressions impacted the N170 and EPN components. Modulations of the pupil size were absent in both motivationally associated and emotional conditions. Our findings demonstrate that neural representations of neutral stimuli can acquire increased salience via implicit learning, with an advantage for gain over loss associations.
  • 2021 · Journal Article
    Kulke, L., Brümmer, L., Pooresmaeili, A., & Schacht, A. (2021). Overt and covert attention shifts to emotional faces: Combining EEG, eye tracking, and a go/no-go paradigm. Psychophysiology. DOI: 10.1111/psyp.13838