
Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking

Real-world scene perception is typically studied in the laboratory using static picture viewing with a restrained head position. Consequently, the transfer of results obtained in this paradigm to real-world scenarios has been questioned. The advancement of mobile eye-trackers and progress in image processing, however, permit a more natural experimental setup that, at the same time, maintains the high experimental control of the standard laboratory setting. We investigated eye movements while participants stood in front of a projector screen and explored images under four specific task instructions. Eye movements were recorded with a mobile eye-tracking device, and raw gaze data were transformed from head-centered into image-centered coordinates. We observed differences between tasks in temporal and spatial eye-movement parameters and found that the bias to fixate images near the center differed between tasks. Our results demonstrate that current mobile eye-tracking technology and a highly controlled design support the study of fine-scaled task dependencies in an experimental setting that permits more natural viewing behavior than the static picture viewing paradigm.
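The coordinate transformation mentioned in the abstract — mapping raw, head-centered gaze samples into image-centered coordinates — can be sketched as a planar homography estimated from the projected image's corners as seen by the scene camera. The following is a minimal illustration, not the authors' implementation; the corner positions and image resolution are hypothetical.

```python
# Hypothetical sketch: head-centered gaze -> image-centered coordinates
# via a homography estimated from four corner correspondences (DLT).
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src from >= 4 point pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last row of V^T from the SVD).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_gaze(H, gaze_xy):
    """Project one head-centered gaze sample into image coordinates."""
    p = H @ np.array([gaze_xy[0], gaze_xy[1], 1.0])
    return p[:2] / p[2]

# Hypothetical corners of the projected image in the scene-camera frame,
# and the corresponding image-centered coordinate system (1024 x 768 px).
corners_cam = [(100, 80), (540, 90), (530, 420), (110, 410)]
corners_img = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
H = estimate_homography(corners_cam, corners_img)
print(map_gaze(H, (320, 250)))  # one gaze sample, now in image coordinates
```

In practice this mapping would be re-estimated per video frame, since the head (and thus the scene camera) moves freely relative to the projector screen.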

Metadata
Authors: Daniel Backhaus, Ralf Engbert, Lars Oliver Martin Rothkegel, Hans Arne Trukenbrod
DOI: https://doi.org/10.1167/jov.20.5.3
ISSN: 1534-7362
PubMed ID: https://pubmed.ncbi.nlm.nih.gov/32392286
Journal: Journal of Vision
Publisher: Association for Research in Vision and Ophthalmology
Place of publication: Rockville
Publication type: Scientific article
Language: English
Date of first publication: May 11, 2020
Year of publication: 2020
Release date: November 30, 2022
Keywords/tags: central fixation bias; influence; mobile eye-tracking; real-world scenarios; scene viewing; task
Volume: 20
Issue: 5
Article number: 3
Number of pages: 21
First page: 1
Last page: 21
Funding institution: Deutsche Forschungsgemeinschaft (German Research Foundation, DFG) [TR; 1385/2-1]
Organizational units: Humanwissenschaftliche Fakultät / Cognitive Sciences / Department of Psychology
DDC classification: 1 Philosophy and Psychology / 15 Psychology / 150 Psychology
6 Technology, Medicine, Applied Sciences / 61 Medicine and Health / 610 Medicine and Health
Peer review: Refereed
Publication route: Open Access / Gold Open Access
Listed in DOAJ
License: CC-BY 4.0 International (Attribution)
External note: Secondary publication in the series "Zweitveröffentlichungen der Universität Potsdam : Humanwissenschaftliche Reihe" ; 871