
Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking

  • Real-world scene perception is typically studied in the laboratory using static picture viewing with restrained head position. Consequently, the transfer of results obtained in this paradigm to real-world scenarios has been questioned. The advancement of mobile eye-trackers and the progress in image processing, however, permit a more natural experimental setup that, at the same time, maintains the high experimental control of the standard laboratory setting. We investigated eye movements while participants were standing in front of a projector screen and explored images under four specific task instructions. Eye movements were recorded with a mobile eye-tracking device, and raw gaze data were transformed from head-centered into image-centered coordinates. We observed differences between tasks in temporal and spatial eye-movement parameters and found that the bias to fixate images near the center differed between tasks. Our results demonstrate that current mobile eye-tracking technology and a highly controlled design support the study of fine-scaled task dependencies in an experimental setting that permits more natural viewing behavior than the static picture viewing paradigm.
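The abstract's central processing step, mapping raw gaze samples from head-centered (scene-camera) coordinates into image-centered coordinates, can be illustrated with a minimal sketch. This is not the paper's actual pipeline: the sketch assumes the four corners of the projected image have already been located in each scene-camera frame (e.g., via marker detection), and the function names and all numeric values below are hypothetical.

    # Sketch (assumption, not the authors' method): map head-centered gaze
    # samples to image-centered coordinates via a planar homography estimated
    # from the projected image's four corners as seen by the scene camera.
    import numpy as np

    def homography_from_corners(src_pts, dst_pts):
        """Estimate the 3x3 homography mapping src_pts -> dst_pts (DLT, 4+ pairs)."""
        A = []
        for (x, y), (u, v) in zip(src_pts, dst_pts):
            A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        # The homography is the null vector of A (last row of V^T from the SVD).
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        H = Vt[-1].reshape(3, 3)
        return H / H[2, 2]

    def to_image_coords(gaze_xy, H):
        """Apply homography H to an (N, 2) array of head-centered gaze samples."""
        pts = np.hstack([gaze_xy, np.ones((len(gaze_xy), 1))])
        mapped = pts @ H.T
        return mapped[:, :2] / mapped[:, 2:3]  # perspective divide

    # Hypothetical corner positions in the scene-camera frame, mapped onto a
    # 1280 x 960 image coordinate frame.
    corners_cam = np.array([[212.0, 148.0], [1058.0, 163.0],
                            [1042.0, 801.0], [226.0, 785.0]])
    corners_img = np.array([[0.0, 0.0], [1280.0, 0.0],
                            [1280.0, 960.0], [0.0, 960.0]])
    H = homography_from_corners(corners_cam, corners_img)
    gaze_cam = np.array([[640.0, 480.0], [300.0, 200.0]])  # raw gaze samples
    print(to_image_coords(gaze_cam, H))

Because both the screen and the image plane are flat, a single homography per frame suffices; head movement only changes where the screen corners appear in the scene camera, which the per-frame estimate absorbs.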

Metadata
Author details:Daniel Backhaus, Ralf Engbert, Lars Oliver Martin Rothkegel, Hans Arne Trukenbrod
DOI:https://doi.org/10.1167/jov.20.5.3
ISSN:1534-7362
Pubmed ID:https://pubmed.ncbi.nlm.nih.gov/32392286
Title of parent work (English):Journal of vision
Publisher:Association for Research in Vision and Ophthalmology
Place of publishing:Rockville
Publication type:Article
Language:English
Date of first publication:2020/05/11
Publication year:2020
Release date:2022/11/30
Tag:central fixation bias; influence; mobile eye-tracking; real-world scenarios; scene viewing; task
Volume:20
Issue:5
Article number:3
Number of pages:21
First page:1
Last Page:21
Funding institution:Deutsche Forschungsgemeinschaft / German Research Foundation (DFG) [TR 1385/2-1]
Organizational units:Humanwissenschaftliche Fakultät / Strukturbereich Kognitionswissenschaften / Department Psychologie
DDC classification:1 Philosophy and psychology / 15 Psychology / 150 Psychology
6 Technology, medicine, applied sciences / 61 Medicine and health / 610 Medicine and health
Peer review:Refereed
Publishing method:Open Access / Gold Open Access
Listed in DOAJ
License:CC-BY - Attribution 4.0 International
External remark:Secondary publication in the series Zweitveröffentlichungen der Universität Potsdam : Humanwissenschaftliche Reihe ; 871