
Microsaccadic responses indicate fast categorization of sounds: A novel approach to study auditory cognition

The mental chronometry of the human brain's processing of sounds to be categorized as targets has intensively been studied in cognitive neuroscience. According to current theories, a series of successive stages consisting of the registration, identification, and categorization of the sound has to be completed before participants are able to report the sound as a target by button press after ~300-500 ms. Here we use miniature eye movements as a tool to study the categorization of a sound as a target or nontarget, indicating that an initial categorization is present already after 80-100 ms. During visual fixation, the rate of microsaccades, the fastest components of miniature eye movements, is transiently modulated after auditory stimulation. In two experiments, we measured microsaccade rates in human participants in an auditory three-tone oddball paradigm (including rare nontarget sounds) and observed a difference in the microsaccade rates between targets and nontargets as early as 142 ms after sound onset. This finding was replicated in a third experiment with directed saccades measured in a paradigm in which tones had to be matched to score-like visual symbols. Considering the delays introduced by (motor) signal transmission and data analysis constraints, the brain must have differentiated target from nontarget sounds as fast as 80-100 ms after sound onset in both paradigms. We suggest that predictive information processing for expected input makes higher cognitive attributes, such as a sound's identity and category, available already during early sensory processing. The measurement of eye movements is thus a promising approach to investigate hearing.
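For readers curious how microsaccade rates of this kind are typically derived from raw gaze recordings, the sketch below illustrates a velocity-threshold detection approach in the spirit of Engbert & Kliegl (2003), a method widely used in this literature. It is not the authors' published analysis code; the sampling rate, the threshold multiplier, the smoothing window, and all function names are illustrative assumptions.

```python
# Minimal sketch: velocity-threshold microsaccade detection and a
# peri-stimulus microsaccade-rate estimate. Parameters and function
# names are assumptions for illustration, not the paper's pipeline.
import numpy as np

def detect_microsaccades(x, y, fs=500.0, lam=6.0, min_dur=3):
    """Return (onset, offset) sample indices of candidate microsaccades.

    x, y    : 1-D arrays of horizontal/vertical gaze position (deg), one trial
    fs      : sampling rate in Hz (assumed)
    lam     : velocity-threshold multiplier (commonly 5-6)
    min_dur : minimum event duration in samples
    """
    # Smoothed velocity over a 5-sample window (central-difference style)
    vx = fs * (x[4:] - x[:-4] + 2.0 * (x[3:-1] - x[1:-3])) / 6.0
    vy = fs * (y[4:] - y[:-4] + 2.0 * (y[3:-1] - y[1:-3])) / 6.0

    # Robust (median-based) velocity spread and elliptic threshold
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)
    above = (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1.0

    # Group supra-threshold samples into events of sufficient duration
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_dur:
                events.append((start + 2, i + 2))  # +2: velocity-window offset
            start = None
    if start is not None and len(above) - start >= min_dur:
        events.append((start + 2, len(above) + 1))
    return events

def microsaccade_rate(onsets_per_trial, n_samples, fs=500.0, win=50):
    """Microsaccade rate (events per second) across trials, aligned to
    stimulus onset and smoothed with a moving window of `win` samples."""
    counts = np.zeros(n_samples)
    for onsets in onsets_per_trial:
        counts[onsets] += 1
    kernel = np.ones(win) / win
    return np.convolve(counts, kernel, mode="same") * fs / len(onsets_per_trial)
```

Aligning detected onsets to sound onset and comparing the resulting rate curves for target and nontarget trials is the kind of analysis from which a divergence around 142 ms after sound onset could be read off.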

Metadata
Authors: Andreas Widmann, Ralf Engbert, Erich Schröger
DOI: https://doi.org/10.1523/JNEUROSCI.1568-14.2014
ISSN: 0270-6474
Pubmed ID: https://pubmed.ncbi.nlm.nih.gov/25122911
Title of the parent work (English): The journal of neuroscience
Publisher: Society for Neuroscience
Place of publication: Washington
Publication type: Scientific article
Language: English
Year of first publication: 2014
Year of publication: 2014
Release date: 27.03.2017
Free keywords / tags: audition; categorization; mental chronometry; microsaccade
Volume: 34
Issue: 33
Number of pages: 7
First page: 11152
Last page: 11158
Funding institution: Deutsche Forschungsgemeinschaft (DFG) Reinhart-Koselleck grant [SCHR 375/20-1]; DFG [EN 471/3]
Organizational units: Humanwissenschaftliche Fakultät / Strukturbereich Kognitionswissenschaften / Department Psychologie
Peer review: Refereed
Name of the institution at the time of publication: Humanwissenschaftliche Fakultät / Institut für Psychologie