Schacht, Annekathrin
Preferred name
Schacht, Annekathrin
Official Name
Schacht, Annekathrin
Alternative Name
Schacht, A.
Main Affiliation
Publications (showing 1 - 2 of 2)
2018, Journal Article
Title: Gender Differences in the Recognition of Vocal Emotions
Authors: Lausen, Adi; Schacht, Annekathrin
Journal: Frontiers in Psychology, Volume 9, Article 882
Publisher: Frontiers Media S.A.
ISSN/eISSN: 1664-1078
DOI: 10.3389/fpsyg.2018.00882
PMID: 29922202
URI: https://resolver.sub.uni-goettingen.de/purl?gro-2/59277
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0)
Abstract: The conflicting findings from the few studies conducted on gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, and the number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions, these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than by male actors. The mixed pattern for emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions. They stress the importance of distinguishing these factors to explain recognition ability in the processing of emotional prosody.

2018, Journal Article
Title: Money or smiles: Independent ERP effects of associated monetary reward and happy faces
Authors: Hammerschmidt, Wiebke; Kulke, Louisa; Broering, Christina; Schacht, Annekathrin
Journal: PLOS ONE, Volume 13, Issue 10, Article e0206142
DOI: 10.1371/journal.pone.0206142
PMID: 30359397
URI: https://resolver.sub.uni-goettingen.de/purl?gro-2/59602
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0)
Abstract: In comparison to neutral faces, facial expressions of emotion are known to gain attentional prioritization, as demonstrated mainly by means of event-related potentials (ERPs). Recent evidence indicated that such preferential processing can also be elicited by neutral faces when they are associated with increased motivational salience via reward. It remains an open question, however, whether the impacts of inherent emotional salience and associated motivational salience are integrated. To this aim, expressions and monetary outcomes were orthogonally combined. Participants (N = 42) learned to explicitly categorize happy and neutral faces as either reward- or zero-outcome-related via an associative learning paradigm. ERP components (P1, N170, EPN, and LPC) were measured throughout the experiment and analyzed separately before (learning phase) and after (consolidation phase) a pre-defined learning criterion was reached. Happy facial expressions boosted early processing stages, as reflected in enhanced amplitudes of the N170 and EPN, both during learning and consolidation. In contrast, effects of monetary reward became evident only after successful learning, in the form of enlarged amplitudes of the LPC, a component linked to higher-order evaluations. Interactions between expressions and associated outcome were absent in all ERP components of interest. The present study provides novel evidence that acquired salience impacts stimulus processing, but independently of the effects driven by happy facial expressions.