TY - CHAP
A1 - Grum, Marcus
ED - Rutkowski, Leszek
ED - Scherer, Rafał
ED - Korytkowski, Marcin
ED - Pedrycz, Witold
ED - Tadeusiewicz, Ryszard
ED - Zurada, Jacek M.
T1 - Learning representations by crystallized back-propagating errors
T2 - Artificial intelligence and soft computing
N2 - With larger artificial neural networks (ANN) and deeper neural architectures, common methods for training ANN, such as backpropagation, are key to learning success. Their role becomes particularly important when interpreting and controlling structures that evolve through machine learning. This work extends previous research on backpropagation-based methods by presenting a modified, full-gradient version of the backpropagation learning algorithm that preserves (or rather crystallizes) selected neural weights while leaving other weights adaptable (or rather fluid). In a design-science-oriented manner, a prototype of a feedforward ANN is demonstrated and refined using the new learning method. The results show that the so-called crystallizing backpropagation improves control over neural structures and their interpretability, while learning can proceed as usual. Since the algorithm establishes neural hierarchies, ANN compartments begin to function at distinct cognitive levels. This study shows the importance of dealing with ANN in hierarchies through backpropagation and introduces learning methods as novel ways of interacting with ANN. Practitioners will benefit from this interactive process because they can restrict neural learning to specific architectural components of an ANN and can focus further development on specific areas of higher cognitive levels without the risk of destroying valuable ANN structures.
KW - artificial neural networks
KW - backpropagation
KW - knowledge crystallization
KW - second-order conditioning
KW - cognitive levels
KW - NMDL
Y1 - 2023
SN - 978-3-031-42504-2
SN - 978-3-031-42505-9
U6 - https://doi.org/10.1007/978-3-031-42505-9_8
SP - 78
EP - 100
PB - Springer
CY - Cham
ER -
TY - JOUR
A1 - Mey, Jürgen
A1 - Scherler, Dirk
A1 - Zeilinger, Gerold
A1 - Strecker, Manfred
T1 - Estimating the fill thickness and bedrock topography in intermontane valleys using artificial neural networks
JF - Journal of geophysical research : Earth surface
N2 - Thick sedimentary fills in intermontane valleys are common in formerly glaciated mountain ranges but difficult to quantify. Yet knowledge of the fill thickness distribution could help to estimate sediment budgets of mountain belts and to decipher the role of stored material in modulating sediment flux from the orogen to the foreland. Here we present a new approach to estimate valley fill thickness and bedrock topography based on the geometric properties of a landscape using artificial neural networks. We test the potential of this approach following a four-tiered procedure. First, experiments with synthetic, idealized landscapes show that increasing variability in surface slopes requires successively more complex network configurations. Second, in experiments with artificially filled natural landscapes, we find that fill volumes can be estimated with an error below 20%. Third, in natural examples with valley fill surfaces that have steeply inclined slopes, such as the Unteraar and the Rhone Glaciers in the Swiss Alps, the average deviation of cross-sectional area between the measured and the modeled valley fill is 26% and 27%, respectively. Finally, application of the method to the Rhone Valley, an overdeepened glacial valley in the Swiss Alps, yields a total estimated sediment volume of 9711 km³ and an average deviation of cross-sectional area between measurements and model estimates of 21.5%.
Our new method allows for rapid assessment of sediment volumes in intermontane valleys while eliminating most of the subjectivity that is typically inherent in other methods that reconstruct bedrock from digital elevation models.
KW - sediment storage
KW - sediment thickness
KW - intermontane valleys
KW - geomorphometry
KW - artificial neural networks
Y1 - 2015
U6 - https://doi.org/10.1002/2014JF003270
SN - 2169-9003
SN - 2169-9011
VL - 120
IS - 7
SP - 1301
EP - 1320
PB - American Geophysical Union
CY - Washington
ER -