Institut für Erd- und Umweltwissenschaften
In this study, we analyze interactions in lake and lake catchment systems of a continuous permafrost area. We assessed colored dissolved organic matter (CDOM) absorption at 440 nm (a440(CDOM)) and absorption slope (S300–500) in lakes using field sampling and optical remote sensing data for an area of 350 km² in Central Yamal, Siberia. Applying a CDOM algorithm (ratio of green and red band reflectance) to two high-spatial-resolution multispectral GeoEye-1 and WorldView-2 satellite images, we were able to extrapolate the a440(CDOM) data from 18 lakes sampled in the field to 356 lakes in the study area (model R² = 0.79). Values of a440(CDOM) in the 356 lakes varied from 0.48 to 8.35 m⁻¹ with a median of 1.43 m⁻¹. This a440(CDOM) dataset was used to relate lake CDOM to 17 lake and lake catchment parameters derived from optical and radar remote sensing data and from digital elevation model analysis, in order to establish the parameters controlling CDOM in lakes on the Yamal Peninsula. Regression-tree and boosted-regression-tree analyses showed that the activity of cryogenic processes (thermocirques) on the lake shores and the lake water level were the two most important controls, explaining 48.4% and 28.4% of lake CDOM, respectively (R² = 0.61). Activation of thermocirques led to a large input of terrestrial organic matter and sediments from catchments and thawed permafrost to lakes (n = 15, mean a440(CDOM) = 5.3 m⁻¹). Large lakes on the floodplain with a connection to the Mordy-Yakha River received more CDOM (n = 7, mean a440(CDOM) = 3.8 m⁻¹) compared to lakes located on higher terraces.
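The band-ratio extrapolation described above amounts to a simple regression from field-sampled lakes to image-derived band ratios. The sketch below illustrates the idea; the coefficients, sample values, and the linear model form are assumptions for illustration, not the study's fitted model:

```python
import numpy as np

def band_ratio(green, red):
    """Green/red reflectance ratio used as the CDOM predictor."""
    return np.asarray(green, float) / np.asarray(red, float)

def fit_cdom_model(ratio, a440_field):
    """Least-squares fit a440 = b0 + b1 * ratio on field-sampled lakes."""
    b1, b0 = np.polyfit(ratio, a440_field, 1)
    return b0, b1

# Hypothetical field samples (green/red band ratio, a440 in 1/m)
ratio = np.array([0.8, 1.0, 1.2, 1.5, 2.0])
a440  = np.array([6.0, 4.8, 3.9, 2.7, 1.2])
b0, b1 = fit_cdom_model(ratio, a440)

# Extrapolate to an unsampled lake's image-derived band ratio
est = b0 + b1 * np.array([1.1])
```

Once fitted on the sampled lakes, the same two coefficients can be applied to every lake polygon extracted from the satellite imagery.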
Changes in species' distributions are classically projected based on their climate envelopes. For Siberian forests, which have a tremendous significance for vegetation-climate feedbacks, this implies future shifts of each of the forest-forming larch (Larix) species to the north-east. However, in addition to abiotic factors, reliable projections must assess the role of historical biogeography and biotic interactions. Here, we use sedimentary ancient DNA and individual-based modelling to investigate the distribution of larch species and mitochondrial haplotypes through space and time across the treeline ecotone on the southern Taymyr Peninsula, which also represents a boundary area between two larch species. We find spatial and temporal patterns suggesting that forest density is the most influential driver determining the precise distribution of species and mitochondrial haplotypes, which in turn suggests a strong influence of competition on the species' range shifts. These findings imply possible climate change outcomes that are directly opposed to projections based purely on climate envelopes. Investigations of such fine-scale processes of biodiversity change through time are possible using paleoenvironmental DNA, which is available much more readily than visible fossils and can provide information at a level of resolution that is not reached in classical palaeoecology.
Dating growth strata and basin fill by combining 26Al/10Be burial dating and magnetostratigraphy
(2018)
Cosmogenic burial dating enables dating of coarse-grained, Pliocene-Pleistocene sedimentary units that are typically difficult to date with traditional methods, such as magnetostratigraphy. In the actively deforming western Tarim Basin in NW China, Pliocene-Pleistocene conglomerates were dated at eight sites, integrating 26Al/10Be burial dating with previously published magnetostratigraphic sections. These samples were collected from growth strata on the flanks of growing folds and from sedimentary units beneath active faults to place timing constraints on the initiation of deformation of structures within the basin and on shortening rates on active faults. These new basin-fill and growth-strata ages document the late Neogene and Quaternary growth of the Pamir and Tian Shan orogens between >5 and 1 Ma and delineate the eastward propagation of deformation at rates up to 115 km/m.y. and basinward growth of both mountain belts at rates up to 12 km/m.y.
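The principle behind cosmogenic burial dating is that 26Al decays faster than 10Be once sediment is shielded from cosmic rays, so the measured 26Al/10Be ratio records the burial duration. A minimal sketch of the simplest form of that age equation, using commonly cited half-lives and ignoring post-burial production (both simplifying assumptions for illustration):

```python
import math

# Commonly cited half-lives in Myr (treated here as given constants)
T_HALF_AL26 = 0.705
T_HALF_BE10 = 1.387
LAM_AL26 = math.log(2) / T_HALF_AL26
LAM_BE10 = math.log(2) / T_HALF_BE10

def burial_age(ratio_measured, ratio_initial=6.75):
    """Burial age in Myr from the decay of the 26Al/10Be ratio.

    Simplified sketch: assumes a surface production ratio of ~6.75
    and ignores post-burial cosmogenic production.
    """
    return math.log(ratio_initial / ratio_measured) / (LAM_AL26 - LAM_BE10)
```

For example, a measured ratio of half the surface value corresponds to a burial age of roughly 1.4 Myr under these assumptions.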
RainNet v1.0
(2020)
In this study, we present RainNet, a deep convolutional neural network for radar-based precipitation nowcasting. Its design was inspired by the U-Net and SegNet families of deep learning models, which were originally designed for binary segmentation tasks. RainNet was trained to predict continuous precipitation intensities at a lead time of 5 min, using several years of quality-controlled weather radar composites provided by the German Weather Service (DWD). That data set covers Germany with a spatial domain of 900 km × 900 km and has a resolution of 1 km in space and 5 min in time. Independent verification experiments were carried out on 11 summer precipitation events from 2016 to 2017. In order to achieve a lead time of 1 h, a recursive approach was implemented by using RainNet predictions at 5 min lead times as model inputs for longer lead times. In the verification experiments, trivial Eulerian persistence and a conventional model based on optical flow served as benchmarks. The latter is available in the rainymotion library and had previously been shown to outperform DWD's operational nowcasting model for the same set of verification events.
RainNet significantly outperforms the benchmark models at all lead times up to 60 min for the routine verification metrics mean absolute error (MAE) and critical success index (CSI) at intensity thresholds of 0.125, 1, and 5 mm h⁻¹. However, rainymotion turned out to be superior in predicting the exceedance of higher intensity thresholds (here 10 and 15 mm h⁻¹). The limited ability of RainNet to predict heavy rainfall intensities is an undesirable property which we attribute to a high level of spatial smoothing introduced by the model. At a lead time of 5 min, an analysis of power spectral density confirmed a significant loss of spectral power at length scales of 16 km and below. Obviously, RainNet had learned an optimal level of smoothing to produce a nowcast at 5 min lead time. In that sense, the loss of spectral power at small scales is informative, too, as it reflects the limits of predictability as a function of spatial scale. Beyond the lead time of 5 min, however, the increasing level of smoothing is a mere artifact, an analogue to numerical diffusion, that is not a property of RainNet itself but of its recursive application. In the context of early warning, the smoothing is particularly unfavorable, since pronounced features of intense precipitation tend to get lost over longer lead times. Hence, we propose several options to address this issue in prospective research, including an adjustment of the loss function for model training, model training for longer lead times, and the prediction of threshold exceedance in terms of a binary segmentation task. Furthermore, we suggest additional input data that could help to better identify situations with imminent precipitation dynamics. The model code, pretrained weights, and training data are provided in open repositories as an input for such future studies.
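The recursive scheme described above can be sketched in a few lines: a single-step (5 min) model is rolled forward by feeding each prediction back in as the newest input frame. The four-frame input window and the toy persistence stand-in below are illustrative assumptions, not RainNet's actual architecture:

```python
import numpy as np

def recursive_nowcast(model, frames, n_steps):
    """Roll a single-step (5 min) nowcast model out to longer lead
    times by appending each prediction as the newest input frame."""
    frames = list(frames)
    preds = []
    for _ in range(n_steps):
        nxt = model(frames[-4:])   # assumed: model maps 4 recent frames -> next
        preds.append(nxt)
        frames.append(nxt)
    return preds

# Toy stand-in model: Eulerian persistence (RainNet itself is a deep CNN)
persistence = lambda recent: recent[-1]

frames = [np.full((2, 2), float(i)) for i in range(4)]       # fake radar frames
nowcast = recursive_nowcast(persistence, frames, n_steps=12)  # 12 x 5 min = 60 min
```

The recursion also makes the smoothing artifact discussed above intuitive: any blur the model applies once is applied twelve times by the 60 min lead time.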
Hydrometric networks play a vital role in providing information for decision-making in water resource management. They should be set up optimally to provide as much information as possible that is as accurate as possible and, at the same time, be cost-effective. Although the design of hydrometric networks is a well-identified problem in hydrometeorology and has received considerable attention, there is still scope for further advancement. In this study, we use complex network analysis, defined as the analysis of a collection of nodes interconnected by links, to propose a new measure that identifies critical nodes of station networks. The approach can support the design and redesign of hydrometric station networks. The science of complex networks is a relatively young field and has gained significant momentum over the last few years in different areas such as brain networks, social networks, technological networks, or climate networks. The identification of influential nodes in complex networks is an important field of research. We propose a new node-ranking measure, the weighted degree–betweenness (WDB) measure, to evaluate the importance of nodes in a network. It is compared to previously proposed measures on synthetic sample networks and then applied to a real-world rain gauge network comprising 1229 stations across Germany to demonstrate its applicability. The proposed measure is evaluated using the decline rate of the network efficiency and the kriging error. The results suggest that WDB effectively quantifies the importance of rain gauges, although the benefits of the method need to be investigated in more detail.
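As a hedged illustration of degree- and betweenness-based node ranking, the sketch below scores each node of a toy station network by its degree times its shortest-path betweenness. This combination is an assumption chosen for illustration only, not the exact WDB definition proposed in the study:

```python
from collections import deque
from itertools import combinations

def shortest_paths(adj, s, t):
    """Enumerate all shortest s-t paths by BFS (fine for small graphs)."""
    paths, best = [], None
    queue = deque([[s]])
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break                      # BFS order: all shortest paths found
        node = path[-1]
        if node == t:
            best = len(path)
            paths.append(path)
            continue
        for nxt in adj[node]:
            if nxt not in path:
                queue.append(path + [nxt])
    return paths

def betweenness(adj):
    """Fraction of shortest paths passing through each interior node."""
    b = {n: 0.0 for n in adj}
    for s, t in combinations(adj, 2):
        sps = shortest_paths(adj, s, t)
        for path in sps:
            for n in path[1:-1]:
                b[n] += 1.0 / len(sps)
    return b

def degree_weighted_betweenness(adj):
    """Illustrative ranking: degree times betweenness.
    NOTE: an assumed combination, not the study's WDB formula."""
    b = betweenness(adj)
    return {n: len(adj[n]) * b[n] for n in adj}

# Toy chain of 5 stations 0-1-2-3-4; the middle station should rank highest
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
scores = degree_weighted_betweenness(adj)
```

Removing the top-ranked node and re-measuring network efficiency, as the study does, then quantifies how critical that station is.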
Many institutions struggle to tap into the potential of their large archives of radar reflectivity: these data are often affected by miscalibration, yet the bias is typically unknown and temporally volatile. Still, relative calibration techniques can be used to correct the measurements a posteriori. For that purpose, the usage of spaceborne reflectivity observations from the Tropical Rainfall Measuring Mission (TRMM) and Global Precipitation Measurement (GPM) platforms has become increasingly popular: the calibration bias of a ground radar (GR) is estimated from its average reflectivity difference to the spaceborne radar (SR). Recently, Crisologo et al. (2018) introduced a formal procedure to enhance the reliability of such estimates: each match between SR and GR observations is assigned a quality index, and the calibration bias is inferred as a quality-weighted average of the differences between SR and GR. The relevance of quality was exemplified for the Subic S-band radar in the Philippines, which is greatly affected by partial beam blockage.
The present study extends the concept of quality-weighted averaging by accounting for path-integrated attenuation (PIA) in addition to beam blockage. This extension becomes vital for radars that operate at the C or X band. Correspondingly, the study setup includes a C-band radar that substantially overlaps with the S-band radar. Based on the extended quality-weighting approach, we retrieve, for each of the two ground radars, a time series of calibration bias estimates from suitable SR overpasses. As a result of applying these estimates to correct the ground radar observations, the consistency between the ground radars in the region of overlap increased substantially. Furthermore, we investigated whether the bias estimates can be interpolated in time, so that ground radar observations can be corrected even in the absence of prompt SR overpasses. We found that a moving average approach was most suitable for that purpose, although limited by the absence of explicit records of radar maintenance operations.
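For a given SR overpass, the quality-weighted bias estimate described above reduces to a weighted mean of matched reflectivity differences. A minimal sketch (variable names and sample values are illustrative; the near-zero weight stands in for a bin affected by beam blockage or strong path-integrated attenuation):

```python
import numpy as np

def quality_weighted_bias(gr_dbz, sr_dbz, quality):
    """Calibration bias as the quality-weighted mean of matched
    ground-radar minus spaceborne-radar reflectivities (dB)."""
    gr, sr, q = (np.asarray(v, float) for v in (gr_dbz, sr_dbz, quality))
    return float(np.sum(q * (gr - sr)) / np.sum(q))

# Illustrative matched samples; the last bin gets a near-zero weight
gr = [28.0, 31.0, 25.0, 40.0]
sr = [30.0, 33.0, 27.0, 30.0]
q  = [1.0, 1.0, 1.0, 0.05]
bias = quality_weighted_bias(gr, sr, q)
```

Without the quality weights, the single low-quality bin would pull the estimate strongly toward the wrong sign; with them, the result stays close to the -2 dB offset of the well-observed bins.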
Introducing PebbleCounts
(2019)
Grain-size distributions are a key geomorphic metric of gravel-bed rivers. Traditional measurement methods include manual counting or photo sieving, but these are achievable only at the 1–10 m² scale. With the advent of drones and increasingly high-resolution cameras, we can now generate orthoimagery over hectares at millimeter to centimeter resolution. These scales, along with the complexity of high-mountain rivers, necessitate different approaches for photo sieving. As opposed to other image segmentation methods that use a watershed approach, our open-source algorithm, PebbleCounts, relies on k-means clustering in the spatial and spectral domain and rapid manual selection of well-delineated grains. This improves grain-size estimates for complex riverbed imagery, without post-processing. We also develop a fully automated method, PebbleCountsAuto, that relies on edge detection and filtering of suspect grains, without the k-means clustering or manual selection steps. The algorithms are tested in controlled indoor conditions on three arrays of pebbles and then applied to 12 × 1 m² orthomosaic clips of high-energy mountain rivers collected with a camera-on-mast setup (akin to a low-flying drone). A 20-pixel b-axis length lower truncation is necessary for attaining accurate grain-size distributions. For the k-means PebbleCounts approach, average percentile bias and precision are 0.03 and 0.09 ψ, respectively, for ∼1.16 mm pixel⁻¹ images, and 0.07 and 0.05 ψ for one 0.32 mm pixel⁻¹ image. The automatic approach has higher bias and precision of 0.13 and 0.15 ψ, respectively, for ∼1.16 mm pixel⁻¹ images, but similar values of −0.06 and 0.05 ψ for one 0.32 mm pixel⁻¹ image. For the automatic approach, at best 70% of the grains are correctly identified, and typically around 50%. PebbleCounts operates most effectively at the 1 m² patch scale, where it can be applied in ∼5–10 min on many patches to acquire accurate grain-size data over 10–100 m² areas. These data can be used to validate PebbleCountsAuto, which may be applied at the scale of entire survey sites (10²–10⁴ m²). We synthesize results and recommend best practices for image collection, orthomosaic generation, and grain-size measurement using both algorithms.
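The core idea of clustering pixels jointly in the spatial and spectral domains can be sketched by stacking pixel coordinates and color channels into one feature vector per pixel. The minimal Lloyd's-algorithm implementation and toy image below are illustrative only; PebbleCounts adds windowing, masking, and interactive grain selection on top of this idea:

```python
import numpy as np

def kmeans(X, k, n_iter=20):
    """Minimal Lloyd's k-means (deterministic spread-out initialization,
    kept simple for this sketch; real implementations use k-means++)."""
    idx = np.linspace(0, len(X) - 1, k).astype(int)
    centers = X[idx].astype(float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers

def pixel_features(rgb, scale_xy=1.0):
    """Stack (x, y, R, G, B) per pixel so that clustering acts jointly
    in the spatial and spectral domains."""
    h, w, _ = rgb.shape
    yy, xx = np.mgrid[0:h, 0:w]
    return np.column_stack([scale_xy * xx.ravel(),
                            scale_xy * yy.ravel(),
                            rgb.reshape(-1, 3)]).astype(float)

# Toy 4 x 8 "image": dark left half, bright right half
img = np.zeros((4, 8, 3))
img[:, 4:, :] = 200.0
labels, _ = kmeans(pixel_features(img), k=2)
```

The `scale_xy` weight controls the balance between spatial compactness and color similarity; tuning it changes how readily touching grains of similar color are split apart.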
Hydrometeorological hazards caused losses of approximately 110 billion U.S. dollars worldwide in 2016. Current damage estimations do not consider uncertainties in a comprehensive way, and they are not consistent between spatial scales. Aggregated land-use data are used at larger spatial scales, although detailed exposure data at the object level, such as openstreetmap.org, are becoming increasingly available across the globe. We present a probabilistic approach for object-based damage estimation which represents uncertainties and is fully scalable in space. The approach is applied to, and validated on, company damage data from the 2013 flood in Germany. Damage estimates are more accurate than those of damage models using land-use data, and the estimation works reliably at all spatial scales, so it can also be used for pre-event analysis and risk assessments. This method takes hydrometeorological damage estimation and risk assessment to the next level, making damage estimates and their uncertainties fully scalable in space, from the object to the country level, and enabling the exploitation of new exposure data.
Sea surface temperature (SST) patterns can, as surface climate forcing, affect weather and climate at large distances. One example is the El Niño–Southern Oscillation (ENSO), which causes climate anomalies around the globe via teleconnections. Although several studies have identified and characterized these teleconnections, our understanding of climate processes remains incomplete, since interactions and feedbacks typically occur at multiple temporal and spatial scales. This study characterizes the interactions between the cells of a global SST data set at different temporal and spatial scales using climate networks. These networks are constructed using wavelet multi-scale correlation, which investigates the correlation between SST time series at a range of scales, allowing deeper insights into the correlation patterns than traditional methods such as empirical orthogonal functions or classical correlation analysis. This allows us to identify and visualise regions of similarly evolving SSTs at a given timescale and distinguish them from regions with long-range teleconnections to other ocean regions. Our findings re-confirm accepted knowledge about well-known, highly linked SST patterns such as ENSO and the Pacific Decadal Oscillation, but also offer new insights into the characteristics and origins of long-range teleconnections, such as the connection between ENSO and the Indian Ocean Dipole.
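The idea of correlating two series scale by scale can be sketched with crude block averaging in place of a proper wavelet transform (the study uses wavelet multi-scale correlation; the smoothing scheme and the synthetic series below are simplifying assumptions for illustration):

```python
import numpy as np

def scale_series(x, scale):
    """Non-overlapping block averages: a crude stand-in for the
    smoothing a wavelet decomposition would provide at each scale."""
    n = (len(x) // scale) * scale
    return np.asarray(x[:n], float).reshape(-1, scale).mean(axis=1)

def multiscale_correlation(x, y, scales=(1, 2, 4, 8)):
    """Pearson correlation of two series after averaging to each scale."""
    return {s: float(np.corrcoef(scale_series(x, s), scale_series(y, s))[0, 1])
            for s in scales}

# Two SST-like series sharing a slow oscillation but independent fast noise
rng = np.random.default_rng(1)
t = np.arange(256)
trend = np.sin(2 * np.pi * t / 128.0)
x = trend + 0.8 * rng.standard_normal(256)
y = trend + 0.8 * rng.standard_normal(256)
corr = multiscale_correlation(x, y)
```

Because the shared signal lives at the slow scale while the noise is fast, the correlation grows with scale, which is exactly the kind of scale-dependent linkage the climate networks in the study are built from.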
The ICDP "PaleoVan" drilling campaign at Lake Van, Turkey, provided a long (> 100 m) record of lacustrine subsurface sedimentary microbial cell abundance. After the ICDP campaign at Potrok Aike, Argentina, this is only the second time that deep lacustrine cell counts have been documented. Two sites were cored and revealed a strikingly similar cell distribution despite differences in organic matter content and microbial activity. Although shifted towards higher values, cell counts from Lake Potrok Aike, Argentina, reveal very similar distribution patterns with depth. The lacustrine cell count data are significantly different from published marine records; the most probable cause is a difference in sedimentary organic matter composition, with marine sediments containing a higher fraction of labile organic matter. Previous studies showed that microbial activity and abundance increase centimetres to metres around geologic interfaces. The finely laminated Lake Van sediments allowed us to study this phenomenon at the microscale. We sampled at the scale of individual laminae and, in some depth intervals, found large differences in microbial abundance between the different laminae. This small-scale heterogeneity is normally overlooked because much larger sampling intervals integrate over several centimetres. However, not all laminated intervals exhibit such large differences in microbial abundance, and some non-laminated horizons show large variability on the millimetre scale as well. The reasons for these contrasting observations remain elusive, but the observations indicate that the heterogeneity of microbial abundance in subsurface sediments has not been taken into account sufficiently. These findings have implications not just for microbiological studies but for geochemistry as well, as the large differences in microbial abundance clearly show that there are distinct microhabitats that deviate considerably from the surrounding layers.