German football stadiums are well known for their atmosphere, often described as 'electrifying' or 'cracking.' This article focuses on this atmosphere. Using a phenomenological approach, it explores how this emotionality can be understood and how geography matters while attending a match. Atmosphere in this context is conceptualized, drawing on phenomenological work, as a mood-charged space, neither object- nor subject-centered, but rather a medium of perception which cannot not exist. Based on qualitative research done in the home stadium of Hertha BSC in the German Bundesliga, this article shows that the bodily sensations experienced by spectators during a visit to the stadium are synchronized with events on the pitch and with the more or less imposing scenery. The analysis of in situ diaries reveals that spectators experience a comprehensive sense of collectivity. The study presents evidence that the occurrence of these bodily sensations is strongly connected with different aspects of spatiality. This includes sensations of constriction and expansion within the body, an awareness of one's location within the stadium, the influence of the immediate surroundings, and cognitive here/there and inside/outside distinctions.
Diagnosis and treatment of breast cancer is a very emotionally aversive and stressful life event, which can lead to impaired cognitive functioning and mental health. Breast cancer survivors responding with repressive emotion regulation strategies often show less adaptive coping and adverse outcomes. We investigated cognitive functioning and neural correlates of emotion processing using ERPs. Self-report measures of depression, anxiety, and fatigue, as well as hair cortisol as an index of chronic stress, were assessed. Twenty breast cancer survivors (BCS) and 31 carefully matched healthy controls participated in the study. After neuropsychological testing and subjective assessments, participants viewed 30 neutral, 30 unpleasant, and 30 pleasant pictures, and ERPs were recorded. Recognition memory was tested 1 week later. BCS reported stronger complaints about cognitive impairments and more symptoms of depression, anxiety, and fatigue. Moreover, they showed elevated hair cortisol levels. Except for verbal memory, cognitive functioning was predominantly in the normative range. Recognition memory performance was decreased in cancer survivors, especially for emotional contents. In ERPs, survivors showed smaller late positive potential amplitudes for unpleasant pictures relative to controls in a later time window, which may indicate less elaborative processing of this material. Taken together, we found cognitive impairments in BCS in verbal memory, impaired emotional picture memory accuracy, and reduced neural activity when breast cancer survivors were confronted with unpleasant materials. Further studies and larger sample sizes, however, are needed to evaluate the relationship between altered emotion processing and reduced memory in BCS in order to develop new treatment strategies.
Emotions are a central element of human experience. They occur with high frequency in everyday life and play an important role in decision making. However, currently there is no consensus among researchers on what constitutes an emotion and on how emotions should be investigated. This dissertation identifies three problems of current emotion research: the problem of ground truth, the problem of incomplete constructs and the problem of optimal representation. I argue for a focus on the detailed measurement of emotion manifestations with computer-aided methods to solve these problems. This approach is demonstrated in three research projects, which describe the development of methods specific to these problems as well as their application to concrete research questions.
The problem of ground truth describes the practice of presupposing a certain structure of emotions as the a priori ground truth. This determines the range of emotion descriptions and sets a standard for the correct assignment of these descriptions. The first project illustrates how this problem can be circumvented with a multidimensional emotion perception paradigm, which stands in contrast to the emotion recognition paradigm typically employed in emotion research. This paradigm allows the calculation of an objective difficulty measure and the collection of subjective difficulty ratings for the perception of emotional stimuli. Moreover, it enables the use of an arbitrary number of emotion stimulus categories, as compared to the commonly used six basic emotion categories. Accordingly, we collected data from 441 participants using dynamic facial expression stimuli from 40 emotion categories. Our findings suggest an increase in emotion perception difficulty with increasing actor age and provide evidence to suggest that young adults, the elderly, and men underestimate their emotion perception difficulty. While these effects were predicted from the literature, we also found unexpected and novel results. In particular, the increased difficulty on the objective difficulty measure for female actors and observers stood in contrast to reported findings. Exploratory analyses revealed low relevance of person-specific variables for the prediction of emotion perception difficulty, but highlighted the importance of a general pleasure dimension for the ease of emotion perception.
The second project targets the problem of incomplete constructs, which relates to vaguely defined psychological constructs on emotion with insufficient ties to tangible manifestations. The project exemplifies how a modern data collection method such as face tracking can be used to sharpen these constructs, using the example of arousal, a long-standing but fuzzy construct in emotion research. It describes how measures of distance, speed, and magnitude of acceleration can be computed from face tracking data and investigates their intercorrelations. We find moderate to strong correlations among all measures of static information on one hand and all measures of dynamic information on the other. The project then investigates how self-rated arousal is tied to these measures in 401 neurotypical individuals and 19 individuals with autism. Distance to the neutral face was predictive of arousal ratings in both groups. Lower mean arousal ratings were found for the autistic group, but no difference in correlation of the measures and arousal ratings could be found between groups. Results were replicated in a high autistic traits group consisting of 41 participants. The findings suggest a qualitatively similar perception of arousal for individuals with and without autism. No correlations between valence ratings and any of the measures could be found, which emphasizes the specificity of our tested measures for the construct of arousal.
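To make the three classes of measures concrete, the following is a minimal numpy sketch of how distance to the neutral face, speed, and acceleration magnitude might be derived from a landmark time series. The array shapes, the frame rate parameter, and the averaging across landmarks are illustrative assumptions, not the dissertation's actual pipeline.

```python
import numpy as np

def motion_measures(landmarks, neutral, fps=30.0):
    """Sketch of static and dynamic face-tracking measures.

    landmarks: array of shape (T, P, 2) -- T frames, P tracked 2-D points
    neutral:   array of shape (P, 2)    -- landmark positions of the neutral face
    Returns per-frame distance to neutral, speed, and acceleration magnitude,
    each averaged over all landmarks.
    """
    # Static information: Euclidean distance of each landmark to its
    # neutral-face position, averaged per frame.
    dist = np.linalg.norm(landmarks - neutral, axis=2).mean(axis=1)

    dt = 1.0 / fps
    # Dynamic information: frame-to-frame velocity and its magnitude (speed).
    vel = np.diff(landmarks, axis=0) / dt              # shape (T-1, P, 2)
    speed = np.linalg.norm(vel, axis=2).mean(axis=1)

    # Magnitude of acceleration: change of velocity between frames.
    acc = np.diff(vel, axis=0) / dt                    # shape (T-2, P, 2)
    acc_mag = np.linalg.norm(acc, axis=2).mean(axis=1)
    return dist, speed, acc_mag
```

With a single landmark moving at constant velocity, distance grows linearly, speed is constant, and acceleration is zero, which matches the intuition that distance captures static displacement while speed and acceleration capture dynamics.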
The problem of optimal representation refers to the search for the best representation of emotions and the assumption that there is a one-size-fits-all solution. In the third project we introduce partial least squares analysis as a general method to find an optimal representation relating two high-dimensional data sets to each other. The project demonstrates its applicability to emotion research on the question of emotion perception differences between men and women. The method was used with emotion rating data from 441 participants and face tracking data computed on 306 videos. We found quantitative as well as qualitative differences in the perception of emotional facial expressions between these groups. We showed that women’s emotional perception systematically captured more of the variance in facial expressions. Additionally, we could show that significant differences exist in the way that women and men perceive some facial expressions, which could be visualized as concrete facial expression sequences. These expressions suggest differing perceptions of masked and ambiguous facial expressions between the sexes. In order to facilitate use of the developed method by the research community, a package for the statistical environment R was written. Furthermore, to call attention to the method and its usefulness for emotion research, a website was designed that allows users to explore a model of emotion ratings and facial expression data in an interactive fashion.
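The core of a partial least squares analysis of two paired data sets can be sketched in a few lines: center both matrices and take the singular value decomposition of their cross-covariance, so that each pair of singular vectors gives weight profiles that maximize covariance between the two views. This is a generic PLS-SVD sketch in numpy, not the dissertation's R package; the variable names and component count are assumptions.

```python
import numpy as np

def pls_svd(X, Y, n_components=2):
    """PLS via SVD of the cross-covariance matrix.

    X: (n, p) -- e.g. emotion ratings for n stimuli
    Y: (n, q) -- e.g. face-tracking features for the same stimuli
    Returns X-weights (p, k), Y-weights (q, k), and the singular values,
    which quantify the covariance captured by each component pair.
    """
    # Center both views so the product below estimates cross-covariance.
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / (X.shape[0] - 1)          # cross-covariance, shape (p, q)
    # Singular vectors of C are the weight pairs with maximal covariance.
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return U[:, :n_components], Vt[:n_components].T, s[:n_components]
```

On simulated data sharing one latent dimension, the first singular value dominates and its weight vectors recover the shared structure, which is the sense in which PLS finds a low-dimensional representation linking the two data sets.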
Many studies have suggested that emotional stimuli orient and engage attention. There is also evidence that animate stimuli, such as those from humans and animals, cause attentional bias. However, categorical and emotional factors are usually mixed, and it is unclear to what extent human context influences attentional allocation. To address this issue, we tracked participants' eye movements while they viewed pictures with animals and inanimate images (i.e., category) as focal objects. These pictures had either negative or neutral emotional valence, and either human body parts or nonhuman parts were near the focal objects (i.e., context). The picture's valence, arousal, position, size, and most of the low-level visual features were matched across categories. The results showed that nonhuman animals were more likely to be attended to and to be attended to for longer times than inanimate objects. The same pattern held for the human contexts (vs. nonhuman contexts). The effects of emotional valence, category, and context interacted. Specifically, in images with a negative valence, focal animals and objects with human context had comparable numbers of gaze fixations and gaze duration. These results highlighted the attentional bias to animate parts of a picture and clarified that the effects of category, valence, and picture context interacted to influence attentional allocation.
During social interactions, individuals rapidly and automatically judge others’ trustworthiness on the basis of subtle facial cues. To investigate the behavioral and neural correlates of these judgments, we conducted 2 studies: 1 study for the construction and evaluation of a set of natural faces differing in trustworthiness (Study 1: n = 30) and another study for the investigation of event-related potentials (ERPs) in response to this set of natural faces (Study 2: n = 30). Participants of both studies provided highly reliable and nearly identical trustworthiness ratings for the selected faces, supporting the notion that the discrimination of trustworthy and untrustworthy faces depends on distinct facial cues. These cues appear to be processed in an automatic and bottom-up-driven fashion because the free viewing of these faces was sufficient to elicit trustworthiness-related differences in late positive potentials (LPPs) as indicated by larger amplitudes to untrustworthy as compared with trustworthy faces. Taken together, these findings suggest that natural faces contain distinct cues that are automatically and rapidly processed to facilitate the discrimination of untrustworthy and trustworthy faces across various contexts, presumably by enhancing the elaborative processing of untrustworthy as compared with trustworthy faces.
Emotional memories are better remembered than neutral ones, but the mechanisms leading to this memory bias are not well understood in humans yet. Based on animal research, it is suggested that the memory-enhancing effect of emotion is based on central noradrenergic release, which is triggered by afferent vagal nerve activation. To test the causal link between vagus nerve activation and emotional memory in humans, we applied continuous noninvasive transcutaneous auricular vagus nerve stimulation (taVNS) during exposure to emotionally arousing and neutral scenes and tested subsequent, long-term recognition memory after 1 week. We found that taVNS, compared with sham, increased recollection-based memory performance for emotional, but not neutral, material. These findings were complemented by larger recollection-related brain potentials (parietal ERP Old/New effect) during retrieval of emotional scenes encoded under taVNS, compared with sham. Furthermore, brain potentials recorded during encoding also revealed that taVNS facilitated early attentional discrimination between emotional and neutral scenes. Extending animal research, our behavioral and neural findings confirm a modulatory influence of the vagus nerve in emotional memory formation in humans.
Previous research found that memory is not only better for emotional information but also for neutral information that has been encoded in the context of an emotional event. In the present ERP study, we investigated two factors that may influence memory for neutral and emotional items: temporal proximity between emotional and neutral items during encoding, and retention interval (immediate vs. delayed). Forty-nine female participants incidentally encoded 36 unpleasant and 108 neutral pictures (36 neutral pictures preceded an unpleasant picture, 36 followed an unpleasant picture, and 36 neutral pictures were preceded and followed by neutral pictures) and participated in a recognition memory task either immediately (N = 24) or 1 week (N = 25) after encoding. Results showed better memory for emotional pictures relative to neutral pictures. In accordance, enhanced centroparietal old/new differences (500-900 ms) during recognition were observed for unpleasant compared to neutral pictures, most pronounced for the 1-week interval. Picture position effects, however, were only subtle. During encoding, late positive potentials were slightly lower for neutral pictures following unpleasant ones, but only at trend level. To summarize, we could replicate and extend previous ERP findings showing that emotionally arousing events are better recollected than neutral events, particularly when memory is tested after longer retention intervals. Picture position during encoding, however, had only small effects on elaborative processing and no effects on memory retrieval.
I Can See It in Your Face.
(2019)
The purpose of this study was to illustrate that people’s affective valuation of exercise can be identified in their faces. The study was conducted with software for automatic facial expression analysis and it involved testing the hypothesis that positive or negative affective valuation occurs spontaneously when people are reminded of exercise. We created a task similar to an emotional Stroop task, in which participants responded to exercise-related and control stimuli with a positive or negative facial expression (smile or frown) depending on whether the photo was presented upright or tilted. We further asked participants how much time they would normally spend on physical exercise, because we assumed that the affective valuation of those who exercise more would be more positive. Based on the data of 86 participants, regression analysis revealed that those who reported less exercise and a more negative reflective evaluation of exercise initiated negative facial expressions on exercise-related stimuli significantly faster than those who reported exercising more often. No significant effect was observed for smile responses. We suspect that responding with a smile to exercise-related stimuli was the congruent response for the majority of our participants, so that for them no Stroop interference occurred in the exercise-related condition. This study suggests that immediate negative affective reactions to exercise-related stimuli result from a postconscious automatic process and can be detected in the study participants’ faces. It furthermore illustrates how methodological paradigms from social–cognition research (here: the emotional Stroop paradigm) can be adapted to collect and analyze biometric data for the investigation of exercisers’ and non-exercisers’ automatic valuations of exercise.