TY - JOUR
A1 - Wulff, Peter
A1 - Mientus, Lukas
A1 - Nowak, Anna
A1 - Borowski, Andreas
T1 - Utilizing a pretrained language model (BERT) to classify preservice physics teachers' written reflections
JF - International journal of artificial intelligence in education
N2 - Computer-based analysis of preservice teachers' written reflections could enable educational scholars to design personalized and scalable intervention measures to support reflective writing. Algorithms and technologies from artificial intelligence research have been found useful for many reflective writing analytics tasks, such as the classification of text segments. However, mostly shallow learning algorithms have been employed so far. This study explores to what extent deep learning approaches can improve classification performance for segments of written reflections. To do so, a pretrained language model (BERT) was utilized to classify segments of preservice physics teachers' written reflections according to elements in a reflection-supporting model. Since BERT has been found to advance performance in many tasks, it was hypothesized to enhance classification performance for written reflections as well. We also compared the performance of BERT with other deep learning architectures and examined conditions for best performance. We found that BERT outperformed both the other deep learning architectures and previously reported performances of shallow learning algorithms for the classification of segments of reflective writing. BERT begins to outperform the other models when trained on about 20 to 30% of the training data. Furthermore, attribution analyses of inputs yielded insights into features that are important for BERT's classification decisions. Our study indicates that pretrained language models such as BERT can boost performance for language-related tasks in educational contexts such as classification.
KW - Reflective writing
KW - NLP
KW - Deep learning
KW - Science education
Y1 - 2022
U6 - https://doi.org/10.1007/s40593-022-00290-6
SN - 1560-4292
SN - 1560-4306
VL - 33
SP - 439
EP - 466
PB - Springer
CY - New York
ER -

TY - JOUR
A1 - Shilon, I.
A1 - Kraus, M.
A1 - Büchele, M.
A1 - Egberts, Kathrin
A1 - Fischer, Tobias
A1 - Holch, Tim Lukas
A1 - Lohse, T.
A1 - Schwanke, U.
A1 - Steppa, Constantin Beverly
A1 - Funk, Stefan
T1 - Application of deep learning methods to analysis of imaging atmospheric Cherenkov telescopes data
JF - Astroparticle physics
N2 - Ground-based gamma-ray observations with Imaging Atmospheric Cherenkov Telescopes (IACTs) play a significant role in the discovery of very high energy (E > 100 GeV) gamma-ray emitters. The analysis of IACT data demands a highly efficient background rejection technique, as well as methods to accurately determine the position of the source in the sky and the energy of the recorded gamma rays. We present results for background rejection and signal direction reconstruction from first studies of a novel data analysis scheme for IACT measurements. The new analysis is based on a set of Convolutional Neural Networks (CNNs) applied to images from the four H.E.S.S. phase-I telescopes. As the H.E.S.S. camera pixels are arranged in a hexagonal array, we demonstrate two ways to use such image data to train CNNs: by resampling the images to a square grid and by applying modified convolution kernels that conserve the hexagonal grid properties. The networks were trained on sets of Monte Carlo simulated events and tested on both simulations and measured data from the H.E.S.S. array. A comparison of the CNN analysis with current state-of-the-art algorithms reveals a clear improvement in background rejection performance. When applied to H.E.S.S. observation data, the CNN direction reconstruction performs at a level similar to traditional methods. These results serve as a proof of concept for the application of CNNs to the analysis of events recorded by IACTs.
KW - Gamma-ray astronomy
KW - IACT
KW - Analysis technique
KW - Deep learning
KW - Convolutional neural networks
KW - Recurrent neural networks
Y1 - 2018
U6 - https://doi.org/10.1016/j.astropartphys.2018.10.003
SN - 0927-6505
SN - 1873-2852
VL - 105
SP - 44
EP - 53
PB - Elsevier
CY - Amsterdam
ER -