During reading, rapid eye movements (saccades) shift the reader's line of sight from one word to another for high-acuity visual information processing. While experimental data and theoretical models show that readers aim at word centers, eye-movement (oculomotor) accuracy is low compared to other tasks. As a consequence, distributions of saccadic landing positions indicate large (i) random errors and (ii) systematic over- and undershoot of word centers, which additionally depend on saccade length (McConkie et al., Vision Research, 28(10), 1107-1118, 1988). Here we show that both error components can be simultaneously reduced by reading German texts from right to left (N = 32). We used our experimental data to test a Bayesian model of saccade planning. First, the experimental data are consistent with the model. Second, the model makes specific predictions about the effects of the precision of the prior and the (sensory) likelihood. Our results suggest that a more precise sensory likelihood can explain the reduction of both random and systematic error components.
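The precision-weighting logic behind such Bayesian models of saccade planning can be illustrated with a minimal sketch. The sketch below is a generic Gaussian prior-times-likelihood combination, not the authors' published model, and all parameter values are made up for illustration; it shows why a more precise sensory likelihood shrinks both the systematic error (posterior bias toward the prior) and the random error (posterior variance) at once.

```python
# Hypothetical sketch: Gaussian prior x Gaussian likelihood -> Gaussian posterior.
# Positions are in letters relative to the word center; all values are invented.

def posterior(prior_mean, prior_precision, sensory_mean, sensory_precision):
    """Precision-weighted combination of prior and sensory estimate."""
    post_precision = prior_precision + sensory_precision
    post_mean = (prior_precision * prior_mean
                 + sensory_precision * sensory_mean) / post_precision
    return post_mean, 1.0 / post_precision  # posterior mean and variance

word_center = 0.0   # veridical target
prior_mean = -1.0   # prior biased toward undershoot (short saccades)

# Low sensory precision: strong pull toward the prior, high variance.
m_lo, v_lo = posterior(prior_mean, 1.0, word_center, 1.0)   # (-0.5, 0.5)
# Higher sensory precision: both the bias and the variance shrink.
m_hi, v_hi = posterior(prior_mean, 1.0, word_center, 4.0)   # (-0.2, 0.2)

assert abs(m_hi) < abs(m_lo)  # smaller systematic error
assert v_hi < v_lo            # smaller random error
```

A single manipulation of the likelihood precision thus moves both error components in the same direction, which is the signature the abstract appeals to.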
The interplay between cognitive and oculomotor processes during reading can be explored when the spatial layout of text deviates from the typical display. In this study, we investigate various eye-movement measures during reading of text with experimentally manipulated layout (word-wise and letter-wise mirror-reversed text as well as inverted and scrambled text). While typical findings (e.g., longer mean fixation times, shorter mean saccade lengths) in reading manipulated texts compared to normal texts were reported in earlier work, little is known about how oculomotor targeting, as observed in within-word landing positions, changes under these text layouts. Here we carry out precise analyses of landing positions and find substantial changes in the so-called launch-site effect in addition to the expected overall slow-down of reading performance. Specifically, during reading of our manipulated text conditions with reversed letter order (against the overall reading direction), we find a reduced launch-site effect, while in all other manipulated text conditions we observe an increased launch-site effect. Our results clearly indicate that the oculomotor system is highly adaptive when confronted with unusual reading conditions.
Diagnosis and treatment of breast cancer are emotionally aversive and stressful life events, which can lead to impaired cognitive functioning and mental health. Breast cancer survivors responding with repressive emotion regulation strategies often show less adaptive coping and adverse outcomes. We investigated cognitive functioning and neural correlates of emotion processing using ERPs. Self-report measures of depression, anxiety, and fatigue, as well as hair cortisol as an index of chronic stress, were assessed. Twenty breast cancer survivors (BCS) and 31 carefully matched healthy controls participated in the study. After neuropsychological testing and subjective assessments, participants viewed 30 neutral, 30 unpleasant, and 30 pleasant pictures, and ERPs were recorded. Recognition memory was tested 1 week later. BCS reported stronger complaints about cognitive impairments and more symptoms of depression, anxiety, and fatigue. Moreover, they showed elevated hair cortisol levels. Except for verbal memory, cognitive functioning was predominantly in the normative range. Recognition memory performance was decreased in cancer survivors, especially for emotional contents. In ERPs, survivors showed smaller late positive potential amplitudes for unpleasant pictures relative to controls in a later time window, which may indicate less elaborative processing of this material. Taken together, we found impairments in BCS in verbal memory, decreased emotional picture memory accuracy, and reduced neural activity when breast cancer survivors were confronted with unpleasant materials. Further studies with larger sample sizes, however, are needed to evaluate the relationship between altered emotion processing and reduced memory in BCS in order to develop new treatment strategies.
Coupling of attention and saccades when viewing scenes with central and peripheral degradation
(2016)
Degrading real-world scenes in the central or the peripheral visual field yields a characteristic pattern: Mean saccade amplitudes increase with central and decrease with peripheral degradation. Does this pattern reflect corresponding modulations of selective attention? If so, the observed saccade amplitude pattern should reflect more focused attention in the central region with peripheral degradation and an attentional bias toward the periphery with central degradation. To investigate this hypothesis, we measured the detectability of peripheral (Experiment 1) or central targets (Experiment 2) during scene viewing when low or high spatial frequencies were gaze-contingently filtered in the central or the peripheral visual field. Relative to an unfiltered control condition, peripheral filtering induced a decrease of the detection probability for peripheral but not for central targets (tunnel vision). Central filtering decreased the detectability of central but not of peripheral targets. Additional post hoc analyses are compatible with the interpretation that saccade amplitudes and direction are computed in partial independence. Our experimental results indicate that task-induced modulations of saccade amplitudes reflect attentional modulations.
In humans and in foveated animals, visual acuity is highly concentrated at the center of gaze, so that choosing where to look next is an important example of online, rapid decision-making. Computational neuroscientists have developed biologically inspired models of visual attention, termed saliency maps, which successfully predict where people fixate on average. Using point process theory for spatial statistics, we show that scanpaths nevertheless contain important statistical structure, such as spatial clustering, on top of the distributions of gaze positions. Here, we develop a dynamical model of saccadic selection that accurately predicts the distribution of gaze positions as well as spatial clustering along individual scanpaths. Our model relies, first, on activation dynamics via spatially limited (foveated) access to saliency information and, second, on a leaky memory process controlling the re-inspection of target regions. This theoretical framework models a form of context-dependent decision-making, linking neural dynamics of attention to behavioral gaze data.
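The two ingredients named above can be sketched as a toy simulation. This is a generic illustration, not the authors' published model: the grid size, fovea width, and decay rate are invented, and the saliency map is random. It shows how foveated access to saliency plus a leaky inhibition ("memory") map produces a scanpath that avoids immediately re-inspecting the current target.

```python
import numpy as np

rng = np.random.default_rng(0)
size = 32
saliency = rng.random((size, size))   # stand-in saliency map (values in [0, 1))
inhibition = np.zeros((size, size))   # leaky memory of recently fixated regions
gaze = np.array([size // 2, size // 2])

yy, xx = np.mgrid[0:size, 0:size]
scanpath = []
for _ in range(20):
    # Foveated access: saliency is down-weighted with distance from gaze.
    dist2 = (yy - gaze[0]) ** 2 + (xx - gaze[1]) ** 2
    fovea = np.exp(-dist2 / (2 * 8.0 ** 2))
    activation = saliency * fovea - inhibition
    # Next fixation: the currently most activated location.
    gaze = np.array(np.unravel_index(np.argmax(activation), activation.shape))
    scanpath.append((int(gaze[0]), int(gaze[1])))
    inhibition *= 0.7                    # leaky decay of the memory trace
    inhibition[gaze[0], gaze[1]] += 1.0  # tag the fixated location

# The inhibition term makes an immediate return to the same location impossible
# here, because the just-fixated cell's activation is driven below zero.
assert all(scanpath[i] != scanpath[i + 1] for i in range(len(scanpath) - 1))
```

Because the memory trace decays, previously visited regions become eligible again after a few fixations, which is what allows clustered re-inspections at longer lags.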
In addition to sensory decline, age-related losses in auditory perception also reflect impairments in attentional modulation of perceptual saliency. Using an attention- and intensity-modulated dichotic listening paradigm, we investigated electrophysiological correlates of processing conflicts between attentional focus and perceptual saliency in 25 younger and 26 older adults. Participants were instructed to attend to the right or left ear, and perceptual saliency was manipulated by varying the intensities of the two ears. Attentional control demand was higher in conditions in which attentional focus and perceptual saliency favored opposing ears than in conditions without such conflicts. Relative to younger adults, older adults modulated their attention less flexibly and were more influenced by perceptual saliency. Our results show, for the first time, that in younger adults a late negativity in the event-related potential (ERP) at fronto-central and parietal electrodes was sensitive to perceptual-attentional conflicts during auditory processing (N450 modulation effect). Crucially, the magnitude of the N450 modulation effect correlated positively with task performance. In line with lower attentional flexibility, the ERP waveforms of older adults showed an absence of the late negativity and of the modulation effect. This suggests that aging compromises the activation of the frontoparietal attentional network when processing competing and conflicting auditory information.
Older adults often experience hearing difficulties in multitalker situations. Attentional control of auditory perception is crucial in situations where a plethora of auditory inputs compete for further processing. We combined an intensity-modulated dichotic listening paradigm with attentional manipulations to study adult age differences in the interplay between perceptual saliency and attentional control of auditory processing. When confronted with two competing sources of verbal auditory input, older adults modulated their attention less flexibly and were more driven by perceptual saliency than younger adults. These findings suggest that aging severely impairs the attentional regulation of auditory perception.
Many studies have suggested that emotional stimuli orient and engage attention. There is also evidence that animate stimuli, such as humans and animals, cause an attentional bias. However, categorical and emotional factors are usually confounded, and it is unclear to what extent human context influences attentional allocation. To address this issue, we tracked participants' eye movements while they viewed pictures with animals or inanimate images as focal objects (i.e., category). These pictures had either negative or neutral emotional valence, and either human body parts or nonhuman parts appeared near the focal objects (i.e., context). The pictures' valence, arousal, position, size, and most of the low-level visual features were matched across categories. The results showed that nonhuman animals were more likely to be attended to, and were attended to for longer, than inanimate objects. The same pattern held for human contexts (vs. nonhuman contexts). The effects of emotional valence, category, and context interacted. Specifically, in images with negative valence, focal animals and objects with human context received comparable numbers of gaze fixations and comparable gaze durations. These results highlight the attentional bias toward animate parts of a picture and clarify that the effects of category, valence, and picture context interact to influence attentional allocation.