Deepening Understanding
(2012)
Deepening understanding
(2013)
Assignments, curriculum framework and background information as the base of developing lessons
(2012)
1. What are the general strengths of the assignments?
2. Structure of the assignment
3. Resources of the assignment
4. Fostering self-expression
5. How could you improve the assignment?
6. Lack of specific examples
7. Not relating the issue to the students
8. Language problems
9. Infeasibility of adaptation
10. In what ways was the additional information useful? How could this be improved?
11. Was the framework useful for you and in what way?
12. In what ways did the assignments reflect the steps identified in the framework?
Precipitation forecasting has an important place in everyday life – during the day we may have dozens of casual conversations about the likelihood that it will rain this evening or at the weekend. Should you take an umbrella for a walk? Or should you invite your friends for a barbecue? It will certainly depend on what your weather application shows.
While for years people were guided by precipitation forecasts issued for a particular region or city several times a day, the widespread availability of weather radars has allowed us to obtain forecasts at a much higher spatiotemporal resolution of minutes in time and hundreds of meters in space. Hence, radar-based precipitation nowcasting, that is, very-short-range forecasting (typically up to 1–3 h), has become an essential technique, not least in various professional application contexts such as early warning, sewage control, or agriculture.
There are two major components comprising a system for precipitation nowcasting: radar-based precipitation estimates, and models to extrapolate that precipitation into the imminent future. While acknowledging the fundamental importance of radar-based precipitation retrieval for precipitation nowcasts, this thesis focuses only on model development: the establishment of open and competitive benchmark models, the investigation of the potential of deep learning, and the development of procedures for the diagnosis and isolation of nowcast errors that can guide model development.
The present landscape of computational models for precipitation nowcasting still struggles with the availability of open software implementations that could serve as benchmarks for measuring progress. Focusing on this gap, we have developed and extensively benchmarked a stack of models based on different optical flow algorithms for the tracking step and a set of parsimonious extrapolation procedures based on image warping and advection. We demonstrate that these models provide skillful predictions comparable with or even superior to state-of-the-art operational software. We distribute the corresponding set of models as a software library, rainymotion, which is written in the Python programming language and openly available at GitHub (https://github.com/hydrogo/rainymotion). That way, the library acts as a tool for providing fast, open, and transparent solutions that could serve as a benchmark for further model development and hypothesis testing.
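To make the tracking-and-advection idea concrete, here is a minimal sketch of Lagrangian persistence using OpenCV's Farnebäck optical flow and backward warping; it illustrates the technique only and is not the rainymotion API (the function name and parameter values are assumptions):

```python
import cv2
import numpy as np

def optical_flow_nowcast(radar_t0, radar_t1, lead_steps=12):
    """Sketch of Lagrangian persistence: estimate motion between two consecutive
    radar frames with dense optical flow, then advect the latest frame forward
    while keeping intensities constant."""
    # Farneback dense optical flow expects 8-bit single-channel images
    prev = cv2.normalize(radar_t0, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    curr = cv2.normalize(radar_t1, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = radar_t1.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    field = radar_t1.astype(np.float32)
    nowcasts = []
    for step in range(1, lead_steps + 1):
        # backward warping: each target pixel samples its upstream location
        map_x = (grid_x - flow[..., 0] * step).astype(np.float32)
        map_y = (grid_y - flow[..., 1] * step).astype(np.float32)
        nowcasts.append(cv2.remap(field, map_x, map_y, cv2.INTER_LINEAR))
    return nowcasts
```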
One of the promising directions for model development is to challenge the potential of deep learning – a subfield of machine learning that refers to artificial neural networks with deep architectures, which may consist of many computational layers. Deep learning showed promising results in many fields of computer science, such as image and speech recognition, or natural language processing, where it started to dramatically outperform reference methods.
The high benefit of using "big data" for training is among the main reasons for that. Hence, the emerging interest in deep learning in the atmospheric sciences is also driven by, and goes hand in hand with, the increasing availability of data – both observational and model-based. The large archives of weather radar data provide a solid basis for investigating the potential of deep learning in precipitation nowcasting: one year of national 5-min composites for Germany comprises around 85 billion data points.
To this aim, we present RainNet, a deep convolutional neural network for radar-based precipitation nowcasting. RainNet was trained to predict continuous precipitation intensities at a lead time of 5 min, using several years of quality-controlled weather radar composites provided by the German Weather Service (DWD). That data set covers Germany with a spatial domain of 900 km x 900 km and has a resolution of 1 km in space and 5 min in time. Independent verification experiments were carried out on 11 summer precipitation events from 2016 to 2017. In these experiments, RainNet was applied recursively in order to achieve lead times of up to 1 h. In the verification experiments, trivial Eulerian persistence and a conventional model based on optical flow served as benchmarks. The latter is available in the previously developed rainymotion library.
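A minimal sketch of such recursive application, assuming a hypothetical Keras-like model with a predict method and an input of the last four 5-min frames (the actual RainNet input configuration may differ):

```python
import numpy as np

def recursive_nowcast(model, last_frames, n_steps=12):
    """Apply a 5-min model recursively: each prediction is appended to the
    input history and fed back, so 12 steps reach a lead time of 1 h."""
    frames = list(last_frames)                      # most recent frames, oldest first
    nowcasts = []
    for _ in range(n_steps):
        x = np.stack(frames[-4:])[np.newaxis, ...]  # assumed input: last 4 frames
        next_frame = model.predict(x)[0]            # drop the batch dimension
        nowcasts.append(next_frame)
        frames.append(next_frame)
    return nowcasts
```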
RainNet significantly outperformed the benchmark models at all lead times up to 60 min for the routine verification metrics mean absolute error (MAE) and critical success index (CSI) at intensity thresholds of 0.125, 1, and 5 mm/h. However, rainymotion turned out to be superior in predicting the exceedance of higher intensity thresholds (here 10 and 15 mm/h). The limited ability of RainNet to predict high rainfall intensities is an undesirable property which we attribute to a high level of spatial smoothing introduced by the model. At a lead time of 5 min, an analysis of power spectral density confirmed a significant loss of spectral power at length scales of 16 km and below.
Obviously, RainNet had learned an optimal level of smoothing to produce a nowcast at 5 min lead time. In that sense, the loss of spectral power at small scales is informative, too, as it reflects the limits of predictability as a function of spatial scale. Beyond the lead time of 5 min, however, the increasing level of smoothing is a mere artifact – an analogue to numerical diffusion – that is not a property of RainNet itself but of its recursive application. In the context of early warning, the smoothing is particularly unfavorable since pronounced features of intense precipitation tend to get lost over longer lead times. Hence, we propose several options to address this issue in prospective research on model development for precipitation nowcasting, including an adjustment of the loss function for model training, model training for longer lead times, and the prediction of threshold exceedance.
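For reference, the two routine verification metrics used above reduce to a few lines of NumPy; this is a generic sketch, not the verification code of the thesis:

```python
import numpy as np

def mae(obs, pred):
    """Mean absolute error between observed and predicted rain fields."""
    return np.mean(np.abs(obs - pred))

def csi(obs, pred, threshold):
    """Critical success index for exceedance of an intensity threshold (mm/h):
    hits / (hits + misses + false alarms)."""
    obs_event = obs >= threshold
    pred_event = pred >= threshold
    hits = np.sum(obs_event & pred_event)
    misses = np.sum(obs_event & ~pred_event)
    false_alarms = np.sum(~obs_event & pred_event)
    return hits / (hits + misses + false_alarms)
```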
The model development, together with the verification experiments for both conventional and deep learning model predictions, also revealed the need to better understand the sources of forecast errors. Understanding the dominant sources of error in specific situations should help in guiding further model improvement. The total error of a precipitation nowcast consists of an error in the predicted location of a precipitation feature and an error in the change of precipitation intensity over lead time. So far, verification measures have not allowed the location error to be isolated, making it difficult to specifically improve nowcast models with regard to location prediction.
To fill this gap, we introduced a framework to directly quantify the location error. To that end, we detect and track scale-invariant precipitation features (corners) in radar images. We then consider these observed tracks as the true reference in order to evaluate the performance (or, inversely, the error) of any model that aims to predict the future location of a precipitation feature. Hence, the location error of a forecast at any lead time ahead of the forecast time corresponds to the Euclidean distance between the observed and the predicted feature location at the corresponding lead time.
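One way to implement the feature detection and tracking step is sketched below with OpenCV's Shi-Tomasi corner detector and Lucas-Kanade tracker; the particular detector, tracker, and parameters are illustrative assumptions, not necessarily those used in the thesis:

```python
import cv2
import numpy as np

def track_corners(frame0, frame1):
    """Detect corners in the first radar frame and track them into the next
    frame; returns matched (observed) point pairs."""
    img0 = cv2.normalize(frame0, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    img1 = cv2.normalize(frame1, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    p0 = cv2.goodFeaturesToTrack(img0, maxCorners=200, qualityLevel=0.01, minDistance=10)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(img0, img1, p0, None)
    ok = status.ravel() == 1                       # keep successfully tracked points
    return p0[ok].reshape(-1, 2), p1[ok].reshape(-1, 2)

def location_error(observed_xy, predicted_xy, km_per_pixel=1.0):
    """Euclidean distance between observed and predicted feature locations."""
    return np.linalg.norm(observed_xy - predicted_xy, axis=1) * km_per_pixel
```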
Based on this framework, we carried out a benchmarking case study using one year's worth of weather radar composites of the DWD. We evaluated the performance of four extrapolation models, two of which are based on the linear extrapolation of corner motion, while the remaining two are based on the Dense Inverse Search (DIS) method: motion vectors obtained from DIS are used to predict feature locations by linear and Semi-Lagrangian extrapolation.
For all competing models, the mean location error exceeds a distance of 5 km after 60 min, and 10 km after 110 min. At least 25% of all forecasts exceed an error of 5 km after 50 min, and of 10 km after 90 min. Even for the best models in our experiment, at least 5 percent of the forecasts will have a location error of more than 10 km after 45 min. When we relate such errors to application scenarios that are typically suggested for precipitation nowcasting, e.g., early warning, it becomes obvious that location errors matter: the order of magnitude of these errors is about the same as the typical extent of a convective cell. Hence, the uncertainty of precipitation nowcasts at such length scales – just as a result of locational errors – can be substantial already at lead times of less than 1 h. Being able to quantify the location error should hence guide any model development that is targeted towards its minimization. To that aim, we also consider the high potential of using deep learning architectures specific to the assimilation of sequential (track) data.
Last but not least, the thesis demonstrates the benefits of a general movement towards open science for model development in the field of precipitation nowcasting. All the presented models and frameworks are distributed as open repositories, thus enhancing transparency and reproducibility of the methodological approach. Furthermore, they are readily available to be used for further research studies, as well as for practical applications.
Optical flow models as an open benchmark for radar-based precipitation nowcasting (rainymotion v0.1)
(2019)
Quantitative precipitation nowcasting (QPN) has become an essential technique in various application contexts, such as early warning or urban sewage control. A common heuristic prediction approach is to track the motion of precipitation features from a sequence of weather radar images and then to displace the precipitation field to the imminent future (minutes to hours) based on that motion, assuming that the intensity of the features remains constant (“Lagrangian persistence”). In that context, “optical flow” has become one of the most popular tracking techniques. Yet the present landscape of computational QPN models still struggles with producing open software implementations. Focusing on this gap, we have developed and extensively benchmarked a stack of models based on different optical flow algorithms for the tracking step and a set of parsimonious extrapolation procedures based on image warping and advection. We demonstrate that these models provide skillful predictions comparable with or even superior to state-of-the-art operational software. Our software library (“rainymotion”) for precipitation nowcasting is written in the Python programming language and openly available at GitHub (https://github.com/hydrogo/rainymotion, Ayzel et al., 2019). That way, the library may serve as a tool for providing fast, free, and transparent solutions that could serve as a benchmark for further model development and hypothesis testing – a benchmark that is far more advanced than the conventional benchmark of Eulerian persistence commonly used in QPN verification experiments.
During the last few decades, the rapid separation of the Small Aral Sea from the isolated basin has changed its hydrological and ecological conditions tremendously. In the present study, we developed and validated the hybrid model for the Syr Darya River basin based on a combination of state-of-the-art hydrological and machine learning models. Climate change impact on freshwater inflow into the Small Aral Sea for the projection period 2007–2099 has been quantified based on the developed hybrid model and bias corrected and downscaled meteorological projections simulated by four General Circulation Models (GCM) for each of three Representative Concentration Pathway scenarios (RCP). The developed hybrid model reliably simulates freshwater inflow for the historical period with a Nash–Sutcliffe efficiency of 0.72 and a Kling–Gupta efficiency of 0.77. Results of the climate change impact assessment showed that the freshwater inflow projections produced by different GCMs are misleading by providing contradictory results for the projection period. However, we identified that the relative runoff changes are expected to be more pronounced in the case of more aggressive RCP scenarios. The simulated projections of freshwater inflow provide a basis for further assessment of climate change impacts on hydrological and ecological conditions of the Small Aral Sea in the 21st Century.
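The two efficiency criteria quoted above have standard closed forms; the following NumPy sketch (with the 2009 formulation of the Kling-Gupta efficiency) is generic and not the study's own code:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus the error variance relative to the
    variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]      # linear correlation
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```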
Developing Critical Thinking
(2012)
Developing critical thinking
(2012)
Relating to students
(2013)
Planetary research is often user-based and requires considerable skill, time, and effort. Unfortunately, self-defined boundary conditions, definitions, and rules are often not documented or not easy to comprehend due to the complexity of the research. This makes a comparison to other studies, or an extension of the already existing research, complicated. Comparisons are often distorted, because results rely on different, not well defined, or even unknown boundary conditions. The purpose of this research is to develop a standardized analysis method for planetary surfaces, which is adaptable to several research topics. The method provides a consistent quality of results. This also includes achieving reliable and comparable results and reducing the time and effort of conducting such studies. A standardized analysis method is provided by automated analysis tools that focus on statistical parameters. Specific key parameters and boundary conditions are defined for the tool application. The analysis relies on a database in which all key parameters are stored. These databases can be easily updated and adapted to various research questions. This increases the flexibility, reproducibility, and comparability of the research. However, the quality of the database and the reliability of the definitions directly influence the results. To ensure a high quality of results, the rules and definitions need to be well defined and based on previously conducted case studies. The tools then produce parameters, which are obtained by defined geostatistical techniques (measurements, calculations, classifications). The idea of an automated statistical analysis is tested to prove the benefits but also the potential problems of this method. In this study, I adapt automated tools for floor-fractured craters (FFCs) on Mars. These impact craters show a variety of surface features, occur in different Martian environments, and have different fracturing origins. They provide a complex morphological and geological field of application. 433 FFCs are classified by the analysis tools according to their fracturing process. Spatial data, environmental context, and crater interior data are analyzed to distinguish between the processes involved in floor fracturing. Related geologic processes, such as glacial and fluvial activity, are too similar to be separately classified by the automated tools. Glacial and fluvial fracturing processes are therefore merged for the classification. The automated tools provide probability values for each origin model. To guarantee the quality and reliability of the results, the classification tools need to achieve an origin probability above 50 %. This analysis method shows that 15 % of the FFCs are fractured by intrusive volcanism, 20 % by tectonic activity, and 43 % by water- and ice-related processes. In total, 75 % of the FFCs are classified to an origin type. The remaining craters can be explained by a combination of origin models, by superposition or erosion of key parameters, or by an unknown fracturing model. Those features have to be analyzed manually in detail. Another possibility would be the improvement of key parameters and rules for the classification. This research shows that it is possible to conduct an automated statistical analysis of morphologic and geologic features based on analysis tools. Analysis tools provide additional information to the user and are therefore considered assistance systems.
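The 50 % decision rule described above can be illustrated with a small sketch; the origin labels in the example are hypothetical:

```python
def classify_crater(origin_probabilities, threshold=0.5):
    """Assign the most probable origin type only if its probability exceeds the
    threshold; otherwise flag the crater for manual analysis."""
    origin, prob = max(origin_probabilities.items(), key=lambda kv: kv[1])
    return origin if prob > threshold else "unclassified"

# Hypothetical example:
# classify_crater({"intrusive volcanism": 0.62, "tectonic": 0.25, "water/ice": 0.13})
# -> "intrusive volcanism"
```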
Studies on the unsustainable use of groundwater resources are still considered incipient, since groundwater is frequently a poorly understood and managed, devalued and inadequately protected natural resource. Groundwater Recharge (GWR) is one of the most challenging elements to estimate, since it can rarely be measured directly and cannot easily be derived from existing data. To overcome these limitations, many hydro(geo)logists have combined different approaches to estimate large-scale GWR, namely: remote sensing products, such as the IMERG product; the water budget equation, also in combination with hydrological models; and Geographic Information Systems (GIS), using estimation formulas. For intermediary-scale GWR estimation, there exist: non-invasive Cosmic-Ray Neutron Sensing (CRNS); wireless networks of local soil probes; and soil hydrological models, such as HYDRUS. Accordingly, this PhD thesis aims, on the one hand, to demonstrate a GIS-based model coupling for estimating the GWR distribution on a large scale in tropical wet basins. On the other hand, it aims to use the time series from CRNS and invasive soil moisture probes to inversely calibrate the soil hydraulic properties and, based on these, to estimate the intermediary-scale GWR using a soil hydrological model. For this purpose, two tropical wet basins located in a complex sedimentary aquifer in the coastal Northeast region of Brazil were selected: the João Pessoa Case Study Area and the Guaraíra Experimental Basin. Several satellite products in the first area were used as input to the GIS-based water budget equation model for estimating the water balance components and GWR in 2016 and 2017. In addition, the point-scale measurements and CRNS data were used in the second area to determine the soil hydraulic properties and to estimate the GWR in the 2017-2018 and 2018-2019 hydrological years. The resulting values of GWR on the large and intermediary scales were then compared and validated against the estimates obtained from groundwater table fluctuations. The GWR rates for the IMERG- and rain-gauge-based scenarios showed similar coefficients between 68% and 89%, similar mean errors between 30% and 34%, and slightly different biases between -13% and 11%. The GWR rates for the soil probe and CRNS soil moisture scenarios ranged from -5.87 to -61.81 cm yr⁻¹, which corresponds to between 5% and 38% of the precipitation. The mean GWR rates on the large scale, based on remote sensing data, and on the intermediary scale, based on CRNS data, yielded similar results for the Podzol soil type, namely 17.87% and 17% of the precipitation. It is concluded that the proposed methodologies allowed the GWR over the study areas to be estimated realistically, which can be a ground-breaking step towards improving water management and decision-making in the Northeast of Brazil.
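As a schematic illustration of the water budget equation that underlies the large-scale GIS-based approach (a sketch only; the variable names and sign conventions are assumptions):

```python
def gwr_water_budget(precip, evapotranspiration, runoff, delta_storage=0.0):
    """Schematic water-budget estimate of groundwater recharge for one period,
    all terms in the same unit (e.g. mm): GWR = P - ET - Q - dS."""
    return precip - evapotranspiration - runoff - delta_storage

# Hypothetical annual example (mm): gwr_water_budget(1700, 1100, 350, 20) -> 230
```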
Deepening understanding
(2013)
1. Key concepts
2. What students should have done
3. What students did
4. Deepening understanding
5. General description of deepening understanding
6. Why is deepening understanding an important stage?
7. How does deepening understanding occur in the lessons and some examples
8. Possible difficulties
9. Conclusion
Exploring election features from a geographical perspective is the focus of this study. Its primary objective is to develop a scientific approach based on geoinformation technology (GIT) that promotes a deeper understanding of how geographical settings affect the spatial and temporal variations of voting behaviour and election outcomes. For this purpose, the five parliamentary elections (1991-2005) following the political turnaround in 1990 in the South East European reform country Albania have been selected as a case study. Elections, like other social phenomena that do not develop uniformly over a territory, have an inherent spatial dimension. Despite the fact that elections have been researched by various scientific disciplines ranging from political science to geography, studies that incorporate their spatial dimension are still limited in number and approaches. Consequently, the methodologies needed to generate an integrated knowledge of the many facets that constitute election features are lacking. This study addresses characteristics and interactions of the essential elements involved in an election process. Thus, the baseline of the approach presented here is the exploration of relations between three entities: electorate (political and sociodemographic features), election process (electoral system and code) and place (environment where voters reside). To express this interaction, the concept of the electoral pattern is introduced. Electoral patterns are defined by the study as the final view of election results, chiefly in tabular and/or map form, generated by the complex interaction of social, economic, juridical, and spatial features of the electorate, which has occurred at a specific time and in a particular geographical location. GIT methods of geoanalysis and geovisualization are used to investigate the characteristics of electoral patterns in their spatial and temporal distribution. Aggregate-level data modelled in map form were used to analyse and visualize the spatial distribution of election pattern components and relations. The spatial dimension of the study is addressed in the following three main relations: first, the relation between place and electorate and its expression through the social, demographic and economic features of the electorate, resulting in the profile of the electorate's context; second, the electorate-election interaction, which forms the baseline to explore the perspective of local contextual effects in voting behaviour and election results; third, the relation between geographical location and election outcomes, reflecting the implication of determining constituency boundaries on election results. To address the above relations, three types of variables (geo, independent and dependent) have been elaborated and two models have been created. The Data Model, developed in a GIS environment, facilitates structuring of election data in order to perform spatial analysis. The peculiarity of electoral patterns – a multidimensional array that contains information on three variables, stored in data layers of dissimilar spatial units of reference and scales of value measurement – prohibits spatial analysis based on the original source data. To perform a joint spatial analysis it is therefore mandatory to restructure the spatial units of reference while preserving their semantic content.
In this operation, all relevant electoral as well as socio-demographic data referenced to different administrative spatial entities are re-referenced to uniform grid cells as virtual spatial units of reference. Depending on the scale of data acquisition and map presentation, a cell width of 0.5 km has been determined. The resulting fine grid forms the basis of subsequent data analyses and correlations. Conversion of the original vector data layers into target raster layers allows for unification of spatial units, at the same time retaining the existing level of detail of the data (variables, uniform distribution over space). This in turn facilitates the integration of the variables studied and the performance of GIS-based spatial analysis. In addition, conversion to raster format makes it possible to assign new values to the original data, which are based on a common scale eliminating existing differences in scale of measurement. Raster format operations of the type described are well-established data analysis techniques in GIT, yet they have rarely been employed to process and analyse electoral data. The Geovisualization Model, developed in a cartographic environment, complements the Data Model. As an analog graphic model it facilitates efficient communication and exploration of geographical information through cartographic visualization. Based on this model, 52 choropleth maps have been generated. They represent the outcome of the GIS-based electoral data analysis. The analog map form allows for in-depth visual analysis and interpretation of the distribution and correlation of the electoral data studied. For researchers, decision makers and a wider public the maps provide easy-to-access information on and promote easy-to-understand insight into the spatial dimension, regional variation and resulting structures of the electoral patterns defined.
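The re-referencing of administrative polygons to uniform grid cells described above can be sketched with common Python GIS tooling; the column name, nodata value, and 500 m cell size (in a metric CRS) are assumptions, and the original study worked in a GIS environment rather than with this code:

```python
import geopandas as gpd
import numpy as np
from rasterio import features
from rasterio.transform import from_origin

def polygons_to_grid(vector_path, value_column, cell_size=500.0, nodata=-9999.0):
    """Re-reference polygon-based (e.g. administrative) attribute values to a
    uniform raster grid of virtual spatial units of reference."""
    gdf = gpd.read_file(vector_path)                 # projected CRS in metres assumed
    west, south, east, north = gdf.total_bounds
    width = int(np.ceil((east - west) / cell_size))
    height = int(np.ceil((north - south) / cell_size))
    transform = from_origin(west, north, cell_size, cell_size)
    shapes = ((geom, val) for geom, val in zip(gdf.geometry, gdf[value_column]))
    return features.rasterize(shapes, out_shape=(height, width),
                              transform=transform, fill=nodata, dtype="float32")
```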
Submerged sequences of marine terraces potentially provide crucial information on past sea-level positions. However, the distribution and characteristics of drowned marine terrace sequences are poorly known at a global scale. Using bathymetric data and novel mapping and modeling techniques, we studied a submerged sequence of marine terraces in the Bay of Biscay with the objective to identify the distribution and morphologies of submerged marine terraces and the timing and conditions that allowed their formation and preservation. To accomplish these objectives, a high-resolution bathymetry (5 m) was analyzed using Geographic Information Systems and TerraceM. The successive submerged terraces were identified using a Surface Classification Model, which linearly combines the slope and the roughness of the surface to extract fossil sea-cliffs and fossil rocky shore platforms. For that purpose, contour and hillshaded maps were also analyzed. Then, shoreline angles, a geomorphic marker located at the intersection between the fossil sea-cliff and platform, were mapped by analyzing swath profiles perpendicular to the isobaths. Most of the submerged strandlines are irregularly preserved throughout the continental shelf. In summary, 12 submerged terraces were identified, with their shoreline angles located at approximately -13 m (T1), -30 to -32 m (T2), -34 to -41 m (T3), -44 to -47 m (T4), -49 to -53 m (T5), -55 to -58 m (T6), -59 to -62 m (T7), -65 to -67 m (T8), -68 to -70 m (T9), -74 to -77 m (T10), -83 to -86 m (T11), and -89 to -92 m (T12). Nevertheless, the ones showing the best lateral continuity and preservation in the central part of the shelf are T3, T4, T5, T7, T8, and T10. The age of the terraces has been estimated using a landscape evolution model. To simulate the formation and preservation of the submerged terraces, three different scenarios were compared: (i) 20-0 ka; (ii) 128-0 ka; and (iii) 128-20 ka. The best scenario for terrace generation was between 128 and 20 ka, in which T3, T5, and T7 could have been formed.
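A simplified NumPy rendering of the Surface Classification Model idea, linearly combining normalized slope and surface roughness so that steep, rough cells (fossil sea-cliffs) stand out; the equal weights and the roughness proxy are assumptions (the study itself used TerraceM and GIS tools):

```python
import numpy as np
from scipy.ndimage import generic_filter

def surface_classification_model(dem, cell_size=5.0, w_slope=0.5, w_rough=0.5):
    """Linearly combine normalized slope and roughness of a bathymetric surface."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))   # slope in degrees
    roughness = generic_filter(dem, np.std, size=3)          # local std as roughness proxy
    norm = lambda a: (a - a.min()) / (a.max() - a.min())
    return w_slope * norm(slope) + w_rough * norm(roughness)
```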
Relating to students
(2012)
1. The Assignment 'Devotion to Religion and Active Citizenship'
2. The Assignment 'How are religions spread across Europe'
3. The Assignment 'Is football as important as religion?'
4. The Assignment 'Why be religious?'
5. The Assignment 'Lucky charms'
6. The Assignment 'No Creo en el Jamas' (Life after death)
7. The Assignment 'Religion and its influence on politics and policies'
8. The Assignment 'Secularisation in Europe'
9. The Assignment 'The meaning of religious places'
10. The Assignment 'Unity in diversity'
11. Which conceptions did you find?
Streamflow dynamics in mountainous environments are controlled by runoff generation processes in the basin upstream. Runoff generation processes are thus a major control of the terrestrial part of the water cycle, influencing both water quality and water quantity as well as their dynamics. The understanding of these processes becomes especially important for the prediction of floods, erosion, and dangerous mass movements, in particular as hydrological systems often show threshold behavior. In the case of extensive environmental changes, be they in climate or in land use, the understanding of runoff generation processes will allow us to better anticipate the consequences and can thus lead to a more responsible management of resources as well as risks. In this study the runoff generation processes in a small undisturbed catchment in the Chilean Andes were investigated. The research area is characterized by steep hillslopes, volcanic ash soils, undisturbed old-growth forest and high rainfall amounts. The investigation of runoff generation processes in this data-scarce area is of special interest as a) little is known about the hydrological functioning of the young volcanic ash soils, which are characterized by extremely high porosities and hydraulic conductivities, b) no process studies have been carried out in this area at either slope or catchment scale, and c) understanding the hydrological processes in undisturbed catchments will provide a basis to improve our understanding of disturbed systems, the shift in processes that followed the disturbance and maybe also future process evolution necessary for the achievement of a new steady state. The catchment studied here thus has the potential to serve as a reference catchment for future investigations. As no long-term data on rainfall and runoff exist, it was necessary to replace long time series of data with a multitude of experimental methods, using the so-called "multi-method approach". These methods cover as many aspects of runoff generation as possible and include not only the measurement of time series such as discharge, rainfall, soil water dynamics and groundwater dynamics, but also various short-term measurements and experiments such as the determination of throughfall amounts and variability, water chemistry, soil physical parameters, soil mineralogy, geo-electrical soundings and tracer techniques. Assembling the results like pieces of a puzzle produces an admittedly incomplete but nevertheless useful picture of the dynamic ensemble of runoff generation processes in this catchment. The employed methods were then evaluated for their usefulness vs. expenditures (labour and financial costs). Finally, the hypotheses - the perceptual model of runoff generation derived from the experimental findings - were tested with the physically based model Catflow. Additionally, the process-based model Wasim-ETH was used to investigate the influence of land use on runoff generation at the catchment scale. An initial assessment of the hydrologic response of the catchment was achieved with a linear statistical model for the prediction of event runoff coefficients. The parameters identified as best predictors give a first indication of important processes. Various results acquired with the "multi-method approach" show that response to rainfall is generally fast. Preferential vertical flow is of major importance and is reinforced by hydrophobicity during the summer months. Rapid lateral water transport is necessary to produce the fast response signal; however, while lateral subsurface flow was observed at several soil moisture profiles, the location and type of structures causing fast lateral flow at the hillslope scale are still not clear and need to be investigated in more detail. Surface runoff has not been observed and is unlikely due to the high hydraulic conductivities of the volcanic ash soils. Additionally, a large subsurface storage retains most of the incident rainfall amount during events (>90%, often even >95%) and produces streamflow even after several weeks of drought. Several findings suggest a shift in processes from summer to winter, causing changes in flow patterns, in the response of stream chemistry to rainfall events, and in groundwater-surface water interactions. The results of the modelling study confirm the importance of rapid and preferential flow processes. However, due to the limited knowledge on subsurface structures, the model still does not fully capture the runoff response. Investigating the influence of land use on runoff generation showed that while peak runoff generally increased with deforested area, the location of these areas also had an effect. Overall, the "multi-method approach" of replacing long time series with a multitude of experimental methods was successful in the identification of dominant hydrological processes and thus proved its applicability for data-scarce catchments under the constraint of limited resources.
Spatial patterns as well as temporal dynamics of soil moisture have a major influence on runoff generation. The investigation of these dynamics and patterns can thus yield valuable information on hydrological processes, especially in data-scarce or previously ungauged catchments. The combination of spatially scarce but temporally high-resolution soil moisture profiles with episodic and thus temporally scarce moisture profiles at additional locations provides information on spatial as well as temporal patterns of soil moisture at the hillslope transect scale. This approach is better suited to difficult terrain (dense forest, steep slopes) than geophysical techniques and at the same time less cost-intensive than a high-resolution grid of continuously measuring sensors. Rainfall simulation experiments with dye tracers while continuously monitoring soil moisture response allow for the visualization of flow processes in the unsaturated zone at these locations. Data were analyzed at different spatio-temporal scales using various graphical methods, such as space-time colour maps (for the event and plot scale) and binary indicator maps (for the long-term and hillslope scale). Annual dynamics of soil moisture and decimeter-scale variability were also investigated. The proposed approach proved to be successful in the investigation of flow processes in the unsaturated zone and showed the importance of preferential flow in the Malalcahuello Catchment, a data-scarce catchment in the Andes of Southern Chile. Fast response times of stream flow indicate that preferential flow observed at the plot scale might also be of importance at the hillslope or catchment scale. Flow patterns were highly variable in space but persistent in time. The most likely explanation for preferential flow in this catchment is a combination of hydrophobicity, small-scale heterogeneity in rainfall due to redistribution in the canopy and strong gradients in unsaturated conductivities leading to self-reinforcing flow paths.
Earth observation data have become an outstanding basis for analyzing environmental aspects. The increasing availability of remote sensing data is accompanied by an increasing user demand. Within the scope of the Copernicus initiative, the automatic processing of remote sensing data is important for supplying value-added information products. The use of additional data such as land-water masks in the derivation of value-added information products can stabilize and improve their quality. The authors of this contribution discuss different automated processing algorithms which are based on land-water masks for value-added data interpretation. These developments were supported or accompanied by Prof. Hartmut Asche.
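As a minimal illustration of how a land-water mask can support value-added processing, the following sketch (array names are assumptions) masks out water pixels before land-surface statistics are computed:

```python
import numpy as np

def apply_land_water_mask(scene, water_mask):
    """Set water pixels to NaN so that subsequent land-surface statistics
    (e.g. vegetation indices) are computed over land only."""
    masked = scene.astype(np.float32).copy()
    masked[water_mask] = np.nan      # water_mask: boolean array, True over water
    return masked

# Example: mean of a hypothetical NDVI raster over land only
# land_mean = np.nanmean(apply_land_water_mask(ndvi, water_mask))
```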
The EVE curriculum framework
(2012)