
Utilizing a pretrained language model (BERT) to classify preservice physics teachers' written reflections

Computer-based analysis of preservice teachers' written reflections could enable educational scholars to design personalized and scalable intervention measures to support reflective writing. Algorithms and technologies in the domain of research related to artificial intelligence have been found to be useful in many tasks related to reflective writing analytics, such as classification of text segments. However, mostly shallow learning algorithms have been employed so far. This study explores to what extent deep learning approaches can improve classification performance for segments of written reflections. To do so, a pretrained language model (BERT) was utilized to classify segments of preservice physics teachers' written reflections according to elements in a reflection-supporting model. Since BERT has been found to advance performance in many tasks, it was hypothesized to enhance classification performance for written reflections as well. We also compared the performance of BERT with other deep learning architectures and examined conditions for best performance. We found that BERT outperformed the other deep learning architectures, as well as previously reported performances of shallow learning algorithms, for classification of segments of reflective writing. BERT starts to outperform the other models when trained on about 20 to 30% of the training data. Furthermore, attribution analyses for inputs yielded insights into important features for BERT's classification decisions. Our study indicates that pretrained language models such as BERT can boost performance for language-related tasks in educational contexts such as classification.
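The classification setup the abstract describes can be illustrated with a short sketch. The snippet below is not the authors' implementation; it shows, under stated assumptions, how a pretrained BERT model can be fine-tuned for segment classification with the Hugging Face transformers library. The model name (bert-base-german-cased, assuming a German BERT variant since the reflections are German), the label set, and the example segments are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch, not the authors' code: fine-tuning a pretrained BERT
# for segment-level classification with Hugging Face transformers.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "bert-base-german-cased"  # assumption: a German BERT variant

# Hypothetical reflection elements; the paper's reflection-supporting
# model defines its own categories.
LABELS = ["circumstances", "description", "evaluation",
          "alternatives", "consequences"]

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL, num_labels=len(LABELS))

# Toy stand-ins for annotated reflection segments (invented examples).
texts = ["Die Lernenden arbeiteten in Kleingruppen am Experiment.",
         "Beim naechsten Mal wuerde ich klarere Arbeitsauftraege geben."]
targets = torch.tensor([1, 3])  # indices into LABELS

enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few passes over the (tiny) batch
    out = model(**enc, labels=targets)  # cross-entropy loss computed internally
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Classify a new segment.
model.eval()
with torch.no_grad():
    logits = model(**tokenizer("Der Versuch gelang nur teilweise.",
                               return_tensors="pt")).logits
print(LABELS[int(logits.argmax(dim=-1))])
```

In practice one would train on mini-batches of the annotated corpus with held-out evaluation data; the abstract's observation that BERT overtakes other models at roughly 20 to 30% of the training data could be probed by fine-tuning on growing subsets of the training set.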

Metadata
Author details: Peter Wulff, Lukas Mientus, Anna Nowak, Andreas Borowski
DOI: https://doi.org/10.1007/s40593-022-00290-6
ISSN: 1560-4292
ISSN: 1560-4306
Parent title (English): International journal of artificial intelligence in education
Publisher: Springer
Place of publication: New York
Publication type: Scientific article
Language: English
Date of first publication: 2 May 2022
Year of publication: 2022
Free keywords / tags: Deep learning; NLP; Reflective writing; Science education
Issue: 33
Number of pages: 28
First page: 439
Last page: 466
Organizational units: Mathematisch-Naturwissenschaftliche Fakultät / Institut für Physik und Astronomie
DDC classification: 5 Natural sciences and mathematics / 53 Physics / 530 Physics
Peer review: Refereed
Publication route: Open Access / Hybrid Open Access
License: CC-BY - Attribution 4.0 International