Real-world scene perception is typically studied in the laboratory using static picture viewing with restrained head position. Consequently, the transfer of results obtained in this paradigm to real-world scenarios has been questioned. The advancement of mobile eye-trackers and the progress in image processing, however, permit a more natural experimental setup that, at the same time, maintains the high experimental control of the standard laboratory setting. We investigated eye movements while participants were standing in front of a projector screen and explored images under four specific task instructions. Eye movements were recorded with a mobile eye-tracking device and raw gaze data were transformed from head-centered into image-centered coordinates. We observed differences between tasks in temporal and spatial eye-movement parameters and found that the bias to fixate images near the center differed between tasks. Our results demonstrate that current mobile eye-tracking technology and a highly controlled design support the study of fine-scaled task dependencies in an experimental setting that permits more natural viewing behavior than the static picture viewing paradigm.
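The head-centered to image-centered transformation mentioned above is commonly implemented as a planar homography between the scene-camera frame and the projected image. Below is a minimal sketch of that mapping step, assuming the 3x3 homography matrix `H` has already been estimated (e.g., from markers at the screen corners); the function name and setup are illustrative, not the authors' actual pipeline.

```python
import numpy as np

def gaze_to_image_coords(gaze_xy, H):
    """Map head-centered (scene-camera) gaze points to image-centered
    coordinates via a pre-estimated 3x3 planar homography H."""
    pts = np.asarray(gaze_xy, dtype=float)      # shape (n, 2)
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones])              # homogeneous coords, (n, 3)
    mapped = homog @ H.T                        # apply the homography
    return mapped[:, :2] / mapped[:, 2:3]       # perspective divide

# With the identity homography, points are returned unchanged:
H = np.eye(3)
print(gaze_to_image_coords([[100.0, 200.0]], H))  # [[100. 200.]]
```

In practice `H` would be re-estimated whenever the head moves relative to the screen, so the mapping is applied frame by frame to the raw gaze stream.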