In this study, we validate and compare the elevation accuracy and geomorphic metrics of satellite-derived digital elevation models (DEMs) on the southern Central Andean Plateau. The plateau has an average elevation of 3.7 km and is characterized by diverse topography and relief, a lack of vegetation, and clear skies that create ideal conditions for remote sensing. At 30 m resolution, we analyzed the SRTM-C, ASTER GDEM2, a stacked ASTER L1A stereopair DEM, ALOS World 3D, and TanDEM-X. The higher-resolution datasets include the 12 m TanDEM-X, 10 m single-CoSSC TerraSAR-X/TanDEM-X DEMs, and the 5 m ALOS World 3D. These DEMs are state of the art for optical (ASTER and ALOS) and radar (SRTM-C and TanDEM-X) spaceborne sensors. We assessed vertical accuracy by comparing standard deviations of DEM elevations against 307,509 differential GPS measurements spanning 4000 m of elevation. For the 30 m DEMs, the ASTER datasets had the highest vertical standard deviation at > 6.5 m, whereas the SRTM-C, ALOS World 3D, and TanDEM-X were all < 3.5 m. Higher-resolution DEMs generally had lower uncertainty, with both the 12 m TanDEM-X and the 5 m ALOS World 3D having < 2 m vertical standard deviation. Analysis of vertical uncertainty with respect to terrain elevation, slope, and aspect revealed low uncertainty across these attributes for the SRTM-C (30 m), TanDEM-X (12–30 m), and ALOS World 3D (5–30 m). The single-CoSSC TerraSAR-X/TanDEM-X 10 m DEMs and the 30 m ASTER GDEM2 displayed slight aspect biases, which were removed in their stacked counterparts (TanDEM-X and ASTER Stack). Based on low vertical standard deviations and visual inspection alongside optical satellite data, we selected the 30 m SRTM-C, 12–30 m TanDEM-X, 10 m single-CoSSC TerraSAR-X/TanDEM-X, and 5 m ALOS World 3D for geomorphic metric comparison in a 66 km² catchment with a distinct river knickpoint. Consistent m/n values were found using chi-plot channel profile analysis, regardless of DEM type and spatial resolution.
Slope, curvature, and drainage area were calculated, and plotting schemes were used to assess basin-wide differences in the hillslope-to-valley transition related to the knickpoint. While slope and hillslope length measurements vary little between datasets, curvature measurements grow in magnitude as resolution becomes finer. This is especially true for the optical 5 m ALOS World 3D DEM, which exhibited high-frequency noise at 2–8 pixel wavelengths in a Fourier frequency analysis. Improvements in accurate spaceborne radar DEMs (e.g., TanDEM-X) are promising for geomorphometry, but airborne or terrestrial data are still necessary for meter-scale analysis.
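The vertical-accuracy assessment described above can be sketched in a few lines: a minimal example assuming co-located DEM and differential-GPS elevations at control points. The function names and the bin width are illustrative, not the study's actual processing chain.

```python
import statistics

def vertical_uncertainty(dem_elev, gps_elev):
    """Standard deviation of DEM-minus-GPS elevation differences (m)."""
    diffs = [d - g for d, g in zip(dem_elev, gps_elev)]
    return statistics.stdev(diffs)

def binned_uncertainty(dem_elev, gps_elev, attribute, bin_width):
    """Vertical standard deviation per terrain-attribute bin
    (e.g. slope in degrees), to reveal slope- or aspect-dependent biases."""
    bins = {}
    for d, g, a in zip(dem_elev, gps_elev, attribute):
        bins.setdefault(int(a // bin_width), []).append(d - g)
    return {b * bin_width: statistics.stdev(v)
            for b, v in sorted(bins.items()) if len(v) > 1}
```

Binning the differences by slope or aspect, as in the study's attribute analysis, makes systematic biases (such as the aspect striping noted for single-CoSSC DEMs) visible as bin-to-bin variation.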
Human development has far-reaching impacts on the surface of the globe. The transformation of natural land cover occurs in different forms, and urban growth is one of the most prominent transformative processes. We analyze global land cover data and extract cities as defined by maximally connected urban clusters. The analysis of the city size distribution for all cities on the globe confirms Zipf’s law. Moreover, by investigating the percolation properties of the clustering of urban areas we assess the closeness to criticality for various countries. At the critical thresholds, the urban land cover of the countries undergoes a transition from separated clusters to a gigantic component on the country scale. We study the Zipf exponents as a function of the closeness to percolation and find a systematic dependence, which could be the reason for deviating exponents reported in the literature. Moreover, we investigate the average size of the clusters as a function of the proximity to percolation and find country-specific behavior. By relating the standard deviation and the average of cluster sizes—analogous to Taylor’s law—we suggest an alternative way to identify the percolation transition. We calculate spatial correlations of the urban land cover and find long-range correlations. Finally, by relating the areas of cities with population figures we address the global aspect of the allometry of cities, finding an exponent δ ≈ 0.85, i.e., large cities have lower densities.
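The rank-size analysis behind Zipf's law can be illustrated with a short sketch. A least-squares fit in log-log space is a common way to estimate the exponent; it is not necessarily the exact estimator used in the study.

```python
import math

def zipf_exponent(sizes):
    """Estimate the Zipf exponent from a list of cluster sizes.

    Sizes are sorted into descending rank order; the exponent is the
    negative slope of the least-squares fit of log(size) vs log(rank).
    """
    ranked = sorted(sizes, reverse=True)
    xs = [math.log(r) for r in range(1, len(ranked) + 1)]
    ys = [math.log(s) for s in ranked]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

For an ideal rank-size distribution, size ∝ 1/rank, the estimator returns an exponent of 1; deviations from 1 are exactly the kind of systematic shifts the study relates to the distance from the percolation threshold.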
High Mountain Asia (HMA) - encompassing the Tibetan Plateau and surrounding mountain ranges - is the primary water source for much of Asia, serving more than a billion downstream users. Many catchments receive the majority of their yearly water budget in the form of snow, which is poorly monitored by sparse in situ weather networks. Both the timing and volume of snowmelt play critical roles in downstream water provision, as many applications - such as agriculture, drinking-water generation, and hydropower - rely on consistent and predictable snowmelt runoff. Here, we examine passive microwave data across HMA with five sensors (SSMI, SSMIS, AMSR-E, AMSR2, and GPM) from 1987 to 2016 to track the timing of the snowmelt season - defined here as the time between maximum passive microwave signal separation and snow clearance. We validated our method against climate model surface temperatures, optical remote-sensing snow-cover data, and a manual control dataset (n = 2100, 3 variables at 25 locations over 28 years); our algorithm is generally accurate within 3-5 days. Using the algorithm-generated snowmelt dates, we examine the spatiotemporal patterns of the snowmelt season across HMA. The climatically short (29-year) time series, along with complex interannual snowfall variations, makes determining trends in snowmelt dates at a single point difficult. We instead identify trends in snowmelt timing by using hierarchical clustering of the passive microwave data to determine trends in self-similar regions. We make the following four key observations. (1) The end of the snowmelt season is trending almost universally earlier in HMA (negative trends). Changes in the end of the snowmelt season are generally between 2 and 8 days decade⁻¹ over the 29-year study period (5-25 days total). The length of the snowmelt season is thus shrinking in many, though not all, regions of HMA.
Some areas exhibit later peak signal separation (positive trends), but with generally smaller magnitudes than trends in snowmelt end. (2) Areas with long snowmelt periods, such as the Tibetan Plateau, show the strongest compression of the snowmelt season (negative trends). These trends are apparent regardless of the time period over which the regression is performed. (3) While trends averaged over 3 decades indicate generally earlier snowmelt seasons, data from the last 14 years (2002-2016) exhibit positive trends in many regions, such as parts of the Pamir and Kunlun Shan. Due to the short nature of the time series, it is not clear whether this change is a reversal of a long-term trend or simply interannual variability. (4) Some regions with stable or growing glaciers - such as the Karakoram and Kunlun Shan - see slightly later snowmelt seasons and longer snowmelt periods. It is likely that changes in the snowmelt regime of HMA account for some of the observed heterogeneity in glacier response to climate change. While the decadal increases in regional temperature have in general led to earlier and shortened melt seasons, changes in HMA's cryosphere have been spatially and temporally heterogeneous.
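The melt-season definition above (onset at maximum signal separation, end at snow clearance) can be sketched on a single pixel's daily time series. The series and the clearance threshold below are invented for illustration, not the study's calibrated values.

```python
def snowmelt_window(separation, clearance_threshold):
    """Return (onset_index, end_index) of the melt season.

    Onset is the day of maximum passive-microwave signal separation;
    the end is the first later day on which the separation falls below
    the snow-clearance threshold (None if snow never clears).
    """
    onset = max(range(len(separation)), key=lambda i: separation[i])
    for i in range(onset + 1, len(separation)):
        if separation[i] < clearance_threshold:
            return onset, i
    return onset, None
```

Applied per pixel and per year, the difference `end - onset` gives the melt-season length whose regional trends the study analyzes.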
Regional snow-avalanche detection using object-based image analysis of near-infrared aerial imagery
(2017)
Snow avalanches are destructive mass movements in mountain regions that continue to claim lives and cause infrastructural damage and traffic detours. Given that avalanches often occur in remote and poorly accessible steep terrain, their detection and mapping are laborious and time-consuming. Nonetheless, systematic avalanche detection over large areas could help to generate more complete and up-to-date inventories (cadastres) necessary for validating avalanche forecasting and hazard mapping. In this study, we focused on automatically detecting avalanches and classifying them into release zones, tracks, and run-out zones based on 0.25 m near-infrared (NIR) ADS80-SH92 aerial imagery using an object-based image analysis (OBIA) approach. Our algorithm takes into account the brightness, the normalised difference vegetation index (NDVI), the normalised difference water index (NDWI), and its standard deviation (SDNDWI) to distinguish avalanches from other land-surface elements. Using normalised parameters allows applying this method across large areas. We trained the method by analysing the properties of snow avalanches in three 4 km² areas near Davos, Switzerland. We compared the results with manually mapped avalanche polygons and obtained a user's accuracy of > 0.9 and a Cohen's kappa of 0.79–0.85. Testing the method for a larger area of 226.3 km², we estimated producer's and user's accuracies of 0.61 and 0.78, respectively, with a Cohen's kappa of 0.67. Detected avalanches that overlapped with reference data by > 80 % occurred randomly throughout the testing area, showing that our method avoids overfitting. Our method has potential for large-scale avalanche mapping, although further investigations into other regions are desirable to verify the robustness of our selected thresholds and the transferability of the method.
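As a rough illustration of the spectral-index screening described above, the sketch below computes NDVI and NDWI for a single pixel and applies toy brightness and vegetation thresholds. The threshold values are placeholders, not the trained values of the paper's OBIA rule set, which additionally uses SDNDWI and object-level context.

```python
def ndvi(nir, red):
    """Normalised difference vegetation index."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalised difference water index (green/NIR formulation)."""
    return (green - nir) / (green + nir)

def looks_like_avalanche_debris(nir, red, green,
                                brightness_min=0.6, ndvi_max=0.1):
    """Toy per-pixel screen: avalanche snow is bright and non-vegetated.

    Thresholds are illustrative only; a real OBIA workflow classifies
    segmented objects, not single pixels.
    """
    brightness = (nir + red + green) / 3
    return brightness >= brightness_min and ndvi(nir, red) <= ndvi_max
```

Because the indices are normalised ratios, the same screen transfers across scenes with differing illumination, which is the property the abstract highlights for large-area application.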
Model-based attribution of high-resolution streamflow trends in two alpine basins of Western Austria
(2017)
Several trend studies have shown that hydrological conditions are changing considerably in the Alpine region. However, the reasons for these changes are only partially understood and trend analyses alone are not able to shed much light. Hydrological modelling is one possible way to identify the trend drivers, i.e., to attribute the detected streamflow trends, given that the model captures all important processes causing the trends. We modelled the hydrological conditions for two alpine catchments in western Austria (a large, mostly lower-altitude catchment with wide valley plains and a nested high-altitude, glaciated headwater catchment) with the distributed, physically-oriented WaSiM-ETH model, which includes a dynamical glacier module. The model was calibrated in a transient mode, i.e., not only on several standard goodness measures and glacier extents, but also in such a way that the simulated streamflow trends fit with the observed ones during the investigation period 1980 to 2007. With this approach, it was possible to separate streamflow components, identify the trends of flow components, and study their relation to trends in atmospheric variables. In addition to trends in annual averages, highly resolved trends for each Julian day were derived, since they proved powerful in an earlier, data-based attribution study. We were able to show that annual and highly resolved trends can be modelled sufficiently well. The results provide a holistic, year-round picture of the drivers of alpine streamflow changes: Higher-altitude catchments are strongly affected by earlier firn melt and snowmelt in spring and increased ice melt throughout the ablation season. Changes in lower-altitude areas are mostly caused by earlier and lower snowmelt volumes. All highly resolved trends in streamflow and its components show an explicit similarity to the local temperature trends. 
Finally, results indicate that evapotranspiration has been increasing in the lower altitudes during the study period.
The characteristics of a landscape pose essential factors for hydrological processes. Therefore, an adequate representation of the landscape of a catchment in hydrological models is vital. However, many such models exist, differing, among other things, in their spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model-specific, commercial, or dependent on commercial back-end software, and allow only limited workflow automation, or none at all.
Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed to large-scale application via a hierarchical multi-scale approach. The package addresses existing limitations as it is free and open source, easily extendible to other hydrological models, and the workflow can be fully automated. Moreover, it is user-friendly as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation.
In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, the parameters determining the sizes of subbasins and hillslopes proved more important than the others, namely the number of representative hillslopes, the number of attributes employed in the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.
In this study, an in situ application for identifying neodymium (Nd)-enriched surface materials from multitemporal hyperspectral images (HySpex sensor) is presented. Because of the narrow shape and shallow depth of the neodymium absorption feature, a method was developed to enhance and extract the necessary neodymium information from image spectra, even under non-optimal illumination conditions. For this purpose, the following two approaches were developed: (1) reducing noise and analyzing changing illumination conditions by averaging multitemporal image scenes and (2) enhancing the depth of the desired absorption band by deconvolving every image spectrum with a Gaussian curve while the rest of the spectrum remains unchanged (Richardson-Lucy deconvolution). To evaluate these findings, nine field samples from the Fen complex in Norway were analyzed using handheld X-ray fluorescence devices and detailed laboratory-based geochemical rare earth element determinations. The result is a qualitative outcrop map that highlights zones enriched in neodymium. To reduce the influence of non-optimal illumination, particularly at the studied site, a minimum of seven single acquisitions is required. Sharpening the neodymium absorption band allows for robust mapping, even at the outer zones of enrichment. From the geochemical investigations, we found that iron oxides decrease the applicability of the method. However, iron-related absorption bands can be used as secondary indicators for sulfidic ore zones that are mainly enriched in rare earth elements. In summary, we found that hyperspectral imaging is a noninvasive, fast, and cost-saving method for determining neodymium at outcrop surfaces.
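The band-sharpening step can be illustrated with a generic Richardson-Lucy deconvolution of a one-dimensional spectrum. This is a textbook implementation under assumed parameters (Gaussian kernel width, iteration count), not the authors' exact processing chain; its effect is the one described above, deepening a smeared absorption band.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """Normalised 1-D Gaussian point-spread function."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def richardson_lucy(observed, kernel, iterations=50):
    """Richardson-Lucy deconvolution of a 1-D spectrum.

    Iteratively multiplies the estimate by the back-projected ratio of
    the observed spectrum to the re-blurred estimate, sharpening
    absorption features smeared by the instrument response.
    """
    estimate = np.full_like(observed, float(observed.mean()))
    mirrored = kernel[::-1]  # symmetric for a Gaussian, kept explicit
    for _ in range(iterations):
        reblurred = np.convolve(estimate, kernel, mode='same')
        ratio = observed / np.maximum(reblurred, 1e-12)
        estimate = estimate * np.convolve(ratio, mirrored, mode='same')
    return estimate
```

Applied to a continuum-normalised spectrum with a shallow dip, the iterations progressively restore the dip's depth, which is what makes the weak neodymium feature mappable.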
In 2009, a group of prominent Earth scientists introduced the "planetary boundaries" (PB) framework: they suggested nine global control variables, and defined corresponding "thresholds which, if crossed, could generate unacceptable environmental change". The concept builds on systems theory, and views Earth as a complex adaptive system in which anthropogenic disturbances may trigger non-linear, abrupt, and irreversible changes at the global scale, and "push the Earth system outside the stable environmental state of the Holocene". While the idea has been remarkably successful in both science and policy circles, it has also raised fundamental concerns, as the majority of suggested processes and their corresponding planetary boundaries do not operate at the global scale, and thus apparently lack the potential to trigger abrupt planetary changes.
This paper picks up the debate with specific regard to the planetary boundary on "global freshwater use". While the bio-physical impacts of excessive water consumption are typically confined to the river basin scale, the PB proponents argue that water-induced environmental disasters could build up to planetary-scale feedbacks and system failures. So far, however, no evidence has been presented to corroborate that hypothesis. Furthermore, no coherent approach has been presented to what extent a planetary threshold value could reflect the risk of regional environmental disaster. To be sure, the PB framework was revised in 2015, extending the planetary freshwater boundary with a set of basin-level boundaries inferred from environmental water flow assumptions. Yet, no new evidence was presented, either with respect to the ability of those basin-level boundaries to reflect the risk of regional regime shifts or with respect to a potential mechanism linking river basins to the planetary scale.
So while the idea of a planetary boundary on freshwater use appears intriguing, the line of arguments presented so far remains speculative and implicatory. As long as Earth system science does not present compelling evidence, the exercise of assigning actual numbers to such a boundary is arbitrary, premature, and misleading. Taken as a basis for water-related policy and management decisions, though, the idea transforms from misleading to dangerous, as it implies that we can globally offset water-related environmental impacts. A planetary boundary on freshwater use should thus be disapproved and actively refuted by the hydrological and water resources community.
Meteorological extreme events have great potential for damaging railway infrastructure and posing risks to the safety of train passengers. In the future, climate change will presumably have serious implications for meteorological hazards in the Alpine region. Hence, insights into the future frequencies of meteorological extremes relevant to railway operation in Austria are required in the context of a comprehensive and sustainable natural hazard management plan of the railway operator. In this study, possible impacts of climate change on the frequencies of so-called critical meteorological conditions (CMCs) between the periods 1961-1990 and 2011-2040 are analyzed. Thresholds for such CMCs have been defined by the railway operator and used in its weather monitoring and early warning system. First, the seasonal climate change signals for air temperature and precipitation in Austria are described on the basis of an ensemble of high-resolution Regional Climate Model (RCM) simulations for Europe. Subsequently, the RCM ensemble was used to investigate changes in the frequency of CMCs. Finally, the sensitivity of the results is analyzed with varying threshold values for the CMCs. Results give robust indications of an all-season air temperature rise, but show no clear tendency in average precipitation. The frequency analyses reveal an increase in intense rainfall events and heat waves, whereas heavy snowfall and cold days are likely to decrease. Furthermore, results indicate that the frequencies of CMCs are rather sensitive to changes in the thresholds. This emphasizes the importance of carefully defining, validating, and, if needed, adapting the thresholds used in the weather monitoring and warning system of the railway operator. For this, continuous and standardized documentation of damaging events and near misses is a prerequisite.
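Counting threshold exceedances in two climate periods, the core of the CMC frequency analysis, can be sketched generically. The thresholds and series below are invented for illustration; the study's CMC thresholds are the railway operator's operational values.

```python
def cmc_frequency(series, threshold, above=True):
    """Count days on which a critical meteorological condition (CMC)
    threshold is exceeded (or, with above=False, undercut - e.g. cold
    days)."""
    if above:
        return sum(1 for v in series if v >= threshold)
    return sum(1 for v in series if v <= threshold)

def frequency_change(period_a, period_b, threshold, above=True):
    """Relative change in CMC frequency between two periods, e.g.
    1961-1990 versus 2011-2040."""
    fa = cmc_frequency(period_a, threshold, above)
    fb = cmc_frequency(period_b, threshold, above)
    return (fb - fa) / fa if fa else float('inf')
```

Re-running `frequency_change` while varying `threshold` reproduces, in miniature, the sensitivity analysis the abstract describes: small threshold shifts can noticeably change the counted frequencies.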
EnGeoMAP 2.0
(2017)
Algorithms for rapid analysis of hyperspectral data are becoming increasingly important with planned next-generation spaceborne hyperspectral missions such as the Environmental Mapping and Analysis Program (EnMAP) and the Japanese Hyperspectral Imager Suite (HISUI), together with an ever-growing pool of airborne hyperspectral data. The EnGeoMAP 2.0 algorithm presented here is an automated system for material characterization from imaging spectroscopy data, which builds on the theoretical framework of the Tetracorder and MICA (Material Identification and Characterization Algorithm) of the United States Geological Survey and of EnGeoMAP 1.0 from 2013. EnGeoMAP 2.0 includes automated absorption feature extraction, spatio-spectral gradient calculation, and mineral anomaly detection. The usage of EnGeoMAP 2.0 is demonstrated at the mineral deposit sites of Rodalquilar (SE Spain) and Haib River (S Namibia) using HyMAP and simulated EnMAP data. Results from Hyperion data are presented as supplementary information.
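Automated absorption feature extraction typically begins with continuum removal, dividing the spectrum by its upper convex hull so that absorption features appear as depths below 1. The sketch below shows that standard first step; it is a generic illustration, not EnGeoMAP's implementation.

```python
def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (the continuum).

    After division, absorption features show up as values below 1;
    band depth at a wavelength is 1 minus the continuum-removed value.
    """
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    pts = list(zip(wavelengths, reflectance))
    # Build the upper convex hull with a left-to-right monotone chain.
    hull = []
    for p in pts:
        while len(hull) >= 2 and cross(hull[-2], hull[-1], p) >= 0:
            hull.pop()
        hull.append(p)
    # Interpolate the hull at every wavelength and divide.
    out, seg = [], 0
    for w, r in pts:
        while hull[seg + 1][0] < w:
            seg += 1
        (x0, y0), (x1, y1) = hull[seg], hull[seg + 1]
        continuum = y0 + (y1 - y0) * (w - x0) / (x1 - x0)
        out.append(r / continuum)
    return out
```

Feature parameters such as band depth, position, and width are then read off the continuum-removed spectrum, which is the representation that Tetracorder-style matching algorithms compare against reference libraries.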