
Prosodic boundaries delay the processing of upcoming lexical information during silent sentence reading

  • Prosodic boundaries can be used to guide syntactic parsing in both spoken and written sentence comprehension, but it is unknown whether the processing of prosodic boundaries affects the processing of upcoming lexical information. In 3 eye-tracking experiments, participants silently read sentences that allow for 2 possible syntactic interpretations when no comma or other cue specifies which interpretation should be taken. In Experiments 1 and 2, participants heard a low-pass filtered auditory version of each sentence, which provided a prosodic boundary cue prior to the sentence. In Experiment 1, we found that the boundary cue helped syntactic disambiguation after the cue and led to longer fixation durations on regions right before the cue than on identical regions without prosodic boundary information. In Experiments 2 and 3, we used a gaze-contingent display-change paradigm to manipulate the parafoveal visibility of the first constituent character of the target word after the disambiguating position. Results of Experiment 2 showed that previewing the first character significantly reduced the reading time of the target word, but this preview benefit was greatly reduced when the prosodic boundary cue was introduced at this position. In Experiment 3, a visually presented comma, instead of the acoustic cue, was inserted at the disambiguating position in each sentence. Results showed that the effect of the comma on lexical processing was essentially the same as that of the prosodic boundary cue. These findings demonstrate that processing a prosodic boundary can impair the processing of parafoveal information during sentence reading.

Metadata

Author details: Yingyi Luo, Ming Yan, Xiaolin Zhou
DOI:https://doi.org/10.1037/a0029182
ISSN:0278-7393
ISSN:1939-1285
Title of parent work (English): Journal of Experimental Psychology: Learning, Memory, and Cognition
Publisher:American Psychological Association
Place of publishing:Washington
Publication type:Article
Language:English
Year of first publication:2013
Publication year:2013
Release date:2017/03/26
Tag:eye movements; parafoveal processing; prosodic boundary; sentence reading; wrap-up process
Volume:39
Issue:3
Number of pages:16
First page:915
Last Page:930
Funding institution:Natural Science Foundation of China [30970889]; Ministry of Science and Technology of China [2010CB833904]; China Postdoctoral Science Foundation [20080440008, 200902025]; National Social Science Foundation of China [10CYY037]
Organizational units:Humanwissenschaftliche Fakultät / Strukturbereich Kognitionswissenschaften / Department Psychologie
Peer review: Refereed
Institution name at the time of the publication:Humanwissenschaftliche Fakultät / Institut für Psychologie