
Quasi-compositional mapping from form to meaning

We argue that natural language can be usefully described as quasi-compositional and we suggest that deep learning-based neural language models bear long-term promise to capture how language conveys meaning. We also note that a successful account of human language processing should explain both the outcome of the comprehension process and the continuous internal processes underlying this performance. These points motivate our discussion of a neural network model of sentence comprehension, the Sentence Gestalt model, which we have used to account for the N400 component of the event-related brain potential (ERP), which tracks meaning processing as it happens in real time. The model, which shares features with recent deep learning-based language models, simulates N400 amplitude as the automatic update of a probabilistic representation of the situation or event described by the sentence, corresponding to a temporal difference learning signal at the level of meaning. We suggest that this process happens relatively automatically, and that sometimes a more-controlled attention-dependent process is necessary for successful comprehension, which may be reflected in the subsequent P600 ERP component. We relate this account to current deep learning models as well as classic linguistic theory, and use it to illustrate a domain general perspective on some specific linguistic operations postulated based on compositional analyses of natural language. This article is part of the theme issue 'Towards mechanistic models of meaning composition'.
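The core idea of the abstract — N400 amplitude as the size of the update to a probabilistic meaning representation as each word arrives — can be illustrated with a minimal toy sketch. This is not the authors' Sentence Gestalt implementation; the dimensions, the random untrained weights, and the L1 update measure are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative, not from the paper)
VOCAB, HIDDEN, MEANING = 20, 16, 8

# Random weights standing in for a trained Sentence-Gestalt-style network
W_in = rng.normal(0.0, 0.5, (HIDDEN, VOCAB))
W_rec = rng.normal(0.0, 0.5, (HIDDEN, HIDDEN))
W_out = rng.normal(0.0, 0.5, (MEANING, HIDDEN))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def n400_trace(word_ids):
    """Per-word 'N400' values: the magnitude of the change in the
    network's probabilistic meaning estimate after each word."""
    h = np.zeros(HIDDEN)
    meaning_prev = sigmoid(W_out @ h)  # estimate before any input
    trace = []
    for w in word_ids:
        x = np.zeros(VOCAB)
        x[w] = 1.0                          # one-hot input word
        h = sigmoid(W_in @ x + W_rec @ h)   # update the sentence gestalt
        meaning = sigmoid(W_out @ h)        # implied event representation
        # Update size ~ semantic surprise ~ simulated N400 amplitude
        trace.append(float(np.sum(np.abs(meaning - meaning_prev))))
        meaning_prev = meaning
    return trace

sentence = [3, 7, 1, 12]  # arbitrary word indices
print(n400_trace(sentence))
```

On this reading, a word that forces a large revision of the estimated event (a semantic "surprise") yields a large update and hence a large simulated N400, which is what makes the measure resemble a temporal difference learning signal at the level of meaning.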

Metadata
Authors: Milena Rabovsky, James L. McClelland
DOI: https://doi.org/10.1098/rstb.2019.0313
ISSN: 0962-8436
ISSN: 1471-2970
ISSN: 0080-4622
PubMed ID: https://pubmed.ncbi.nlm.nih.gov/31840583
Parent title (English): Philosophical transactions of the Royal Society of London: B, Biological sciences
Subtitle (English): a neural network-based approach to capturing neural responses during human language comprehension
Publisher: Royal Society
Place of publication: London
Publication type: Scientific article
Language: English
Date of first publication: 16.12.2019
Year of publication: 2020
Release date: 22.03.2023
Keywords/Tags: N400; P600; event-related brain potentials; language; meaning; neural networks
Volume: 375
Issue: 1791
Article number: 20190313
Number of pages: 9
Funding institution: Emmy Noether grant from the German Research Foundation (DFG) [RA 2715/2-1]
Organizational units: Humanwissenschaftliche Fakultät / Strukturbereich Kognitionswissenschaften / Department Psychologie
DDC classification: 1 Philosophy and Psychology / 15 Psychology / 150 Psychology
Peer review: Refereed