Many studies have suggested that emotional stimuli orient and engage attention. There is also evidence that animate stimuli, such as those from humans and animals, cause attentional bias. However, categorical and emotional factors are usually mixed, and it is unclear to what extent human context influences attentional allocation. To address this issue, we tracked participants' eye movements while they viewed pictures with animals and inanimate images (i.e., category) as focal objects. These pictures had either negative or neutral emotional valence, and either human body parts or nonhuman parts were near the focal objects (i.e., context). The pictures' valence, arousal, position, size, and most low-level visual features were matched across categories. The results showed that nonhuman animals were more likely to be attended to, and were attended to for longer, than inanimate objects. The same pattern held for human contexts (vs. nonhuman contexts). The effects of emotional valence, category, and context interacted: in images with negative valence, focal animals and objects with human context received comparable numbers of gaze fixations and comparable gaze durations. These results highlight the attentional bias toward animate parts of a picture and clarify that category, valence, and picture context interact to influence attentional allocation.
Older adults often experience hearing difficulties in multitalker situations. Attentional control of auditory perception is crucial in situations where a plethora of auditory inputs compete for further processing. We combined an intensity-modulated dichotic listening paradigm with attentional manipulations to study adult age differences in the interplay between perceptual saliency and attentional control of auditory processing. When confronted with two competing sources of verbal auditory input, older adults modulated their attention less flexibly and were more driven by perceptual saliency than younger adults. These findings suggest that aging severely impairs the attentional regulation of auditory perception.
In addition to sensory decline, age-related losses in auditory perception also reflect impairments in attentional modulation of perceptual saliency. Using an attention- and intensity-modulated dichotic listening paradigm, we investigated electrophysiological correlates of processing conflicts between attentional focus and perceptual saliency in 25 younger and 26 older adults. Participants were instructed to attend to the right or left ear, and perceptual saliency was manipulated by varying the intensities of both ears. Attentional control demand was higher in conditions where attentional focus and perceptual saliency favored opposing ears than in conditions without such conflicts. Relative to younger adults, older adults modulated their attention less flexibly and were more influenced by perceptual saliency. Our results show, for the first time, that in younger adults a late negativity in the event-related potential (ERP) at fronto-central and parietal electrodes was sensitive to perceptual-attentional conflicts during auditory processing (N450 modulation effect). Crucially, the magnitude of the N450 modulation effect correlated positively with task performance. In line with their lower attentional flexibility, the ERP waveforms of older adults lacked both the late negativity and the modulation effect. This suggests that aging compromises the activation of the frontoparietal attentional network when competing and conflicting auditory information is processed.
In humans and in foveated animals, visual acuity is highly concentrated at the center of gaze, so that choosing where to look next is an important example of online, rapid decision-making. Computational neuroscientists have developed biologically inspired models of visual attention, termed saliency maps, which successfully predict where people fixate on average. Using point process theory for spatial statistics, we show that scanpaths nevertheless contain important statistical structure, such as spatial clustering, on top of the distributions of gaze positions. Here, we develop a dynamical model of saccadic selection that accurately predicts the distribution of gaze positions as well as spatial clustering along individual scanpaths. Our model relies on, first, activation dynamics via spatially limited (foveated) access to saliency information and, second, a leaky memory process controlling the re-inspection of target regions. This theoretical framework models a form of context-dependent decision-making, linking neural dynamics of attention to behavioral gaze data.
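The two ingredients named in the abstract (foveated access to saliency and a leaky memory suppressing re-inspection) can be illustrated with a toy simulation. This is only a minimal sketch under assumed parameter names (`leak`, `fovea_radius`, `inhibition` are illustrative choices), not the published model:

```python
import numpy as np

def simulate_scanpath(saliency, n_fixations=10, fovea_radius=2,
                      leak=0.9, inhibition=1.0):
    """Toy scanpath generator: at each step, saliency is accumulated only
    within a foveal window around the current gaze, while a leaky memory
    map inhibits recently fixated locations. Illustrative sketch only."""
    h, w = saliency.shape
    activation = np.zeros_like(saliency)   # saliency-driven activation map
    memory = np.zeros_like(saliency)       # leaky inhibition-of-return map
    y, x = h // 2, w // 2                  # start fixation at image center
    path = [(y, x)]
    yy, xx = np.ogrid[:h, :w]
    for _ in range(n_fixations - 1):
        # foveated access: only saliency near the current gaze feeds in
        window = ((yy - y) ** 2 + (xx - x) ** 2) <= fovea_radius ** 2
        activation = leak * activation
        activation[window] += saliency[window]
        # leaky memory: decay old traces, suppress the current location
        memory = leak * memory
        memory[y, x] += inhibition
        target = activation - memory
        y, x = map(int, np.unravel_index(np.argmax(target), target.shape))
        path.append((y, x))
    return path
```

With a random saliency map, the inhibition term keeps successive fixations from collapsing onto a single location, producing the clustered-but-exploring scanpaths the abstract describes.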
The interruption of learning processes by breaks filled with diverse activities is common in everyday life. We investigated the effects of active computer gaming and passive relaxation (rest and music) breaks on working memory performance. Young adults were exposed to breaks involving (i) eyes-open resting, (ii) listening to music and (iii) playing the video game “Angry Birds” before performing the n-back working memory task. Based on linear mixed-effects modeling, we found that playing the “Angry Birds” video game during a short learning break led to a decline in task performance over the course of the task as compared to eyes-open resting and listening to music, although overall task performance was not impaired. This effect was associated with high levels of daily mind wandering and low self-reported ability to concentrate. These findings indicate that video games can negatively affect working memory performance over time when played in between learning tasks. We suggest further investigation of these effects because of their relevance to everyday activity.
Coupling of attention and saccades when viewing scenes with central and peripheral degradation
(2016)
Degrading real-world scenes in the central or the peripheral visual field yields a characteristic pattern: Mean saccade amplitudes increase with central and decrease with peripheral degradation. Does this pattern reflect corresponding modulations of selective attention? If so, the observed saccade amplitude pattern should reflect more focused attention in the central region with peripheral degradation and an attentional bias toward the periphery with central degradation. To investigate this hypothesis, we measured the detectability of peripheral (Experiment 1) or central targets (Experiment 2) during scene viewing when low or high spatial frequencies were gaze-contingently filtered in the central or the peripheral visual field. Relative to an unfiltered control condition, peripheral filtering induced a decrease of the detection probability for peripheral but not for central targets (tunnel vision). Central filtering decreased the detectability of central but not of peripheral targets. Additional post hoc analyses are compatible with the interpretation that saccade amplitudes and direction are computed in partial independence. Our experimental results indicate that task-induced modulations of saccade amplitudes reflect attentional modulations.
The diagnosis and treatment of breast cancer are emotionally aversive and stressful life events, which can lead to impaired cognitive functioning and mental health. Breast cancer survivors responding with repressive emotion regulation strategies often show less adaptive coping and adverse outcomes. We investigated cognitive functioning and neural correlates of emotion processing using ERPs. Self-report measures of depression, anxiety, and fatigue, as well as hair cortisol as an index of chronic stress, were assessed. Twenty breast cancer survivors (BCS) and 31 carefully matched healthy controls participated in the study. After neuropsychological testing and subjective assessments, participants viewed 30 neutral, 30 unpleasant, and 30 pleasant pictures while ERPs were recorded. Recognition memory was tested 1 week later. BCS reported stronger complaints about cognitive impairments and more symptoms of depression, anxiety, and fatigue. Moreover, they showed elevated hair cortisol levels. Except for verbal memory, cognitive functioning was predominantly in the normative range. Recognition memory performance was decreased in cancer survivors, especially for emotional contents. In ERPs, survivors showed smaller late positive potential amplitudes for unpleasant pictures relative to controls in a later time window, which may indicate less elaborative processing of this material. Taken together, we found impairments in BCS in verbal memory, reduced emotional picture memory accuracy, and reduced neural activity when breast cancer survivors were confronted with unpleasant materials. Further studies with larger sample sizes, however, are needed to evaluate the relationship between altered emotion processing and reduced memory in BCS in order to develop new treatment strategies.
Previous research found that memory is better not only for emotional information but also for neutral information encoded in the context of an emotional event. In the present ERP study, we investigated two factors that may influence memory for neutral and emotional items: temporal proximity between emotional and neutral items during encoding, and retention interval (immediate vs. delayed). Forty-nine female participants incidentally encoded 36 unpleasant and 108 neutral pictures (36 neutral pictures preceded an unpleasant picture, 36 followed an unpleasant picture, and 36 were preceded and followed by neutral pictures) and performed a recognition memory task either immediately (N=24) or 1 week (N=25) after encoding. Results showed better memory for emotional pictures relative to neutral pictures. Accordingly, enhanced centroparietal old/new differences (500-900 ms) during recognition were observed for unpleasant compared to neutral pictures, most pronounced for the 1-week interval. Picture position effects, however, were only subtle. During encoding, late positive potentials were slightly lower for neutral pictures that followed unpleasant ones, but only at trend level. In summary, we replicated and extended previous ERP findings showing that emotionally arousing events are better recollected than neutral events, particularly when memory is tested after longer retention intervals. Picture position during encoding, however, had only small effects on elaborative processing and no effects on memory retrieval.