Copernicus
Past climatic change can be reconstructed from sedimentary archives by a number of proxies. However, few methods exist to directly estimate hydrological changes and even fewer result in quantitative data, impeding our understanding of the timing, magnitude and mechanisms of hydrological changes.
Here we present a novel approach based on delta H-2 values of sedimentary lipid biomarkers in combination with plant physiological modeling to extract quantitative information on past changes in relative humidity. Our initial application to an annually laminated lacustrine sediment sequence from western Europe deposited during the Younger Dryas cold period revealed relative humidity changes of up to 15% over sub-centennial timescales, leading to major ecosystem changes, in agreement with palynological data from the region. We show that by combining organic geochemical methods and mechanistic plant physiological models on well-characterized lacustrine archives it is possible to extract quantitative ecohydrological parameters from sedimentary lipid biomarker delta H-2 data.
Knowledge of the contemporary in situ stress state is a key issue for safe and sustainable subsurface engineering. However, information on the orientation and magnitudes of the stress state is limited and often not available for the areas of interest. Therefore, 3-D geomechanical-numerical modelling is used to estimate the in situ stress state and the distance of faults from failure for application in subsurface engineering. The main challenge in this approach is to bridge the gap in scale between the widely scattered data used for calibration of the model and the high resolution in the target area required for the application. We present a multi-stage 3-D geomechanical-numerical approach which provides a state-of-the-art model of the stress field for a reservoir-scale area from widely scattered data records. To this end, we first use a large-scale regional model which is calibrated by available stress data and provides the full 3-D stress tensor at discrete points in the entire model volume. The modelled stress state is used subsequently for the calibration of a smaller-scale model located within the large-scale model in an area without any observed stress data records. We exemplify this approach with two stages for the area around Munich in the German Molasse Basin. As an example of application, we estimate the scalar values for slip tendency and fracture potential from the model results as measures of the criticality of fault reactivation in the reservoir-scale model. The modelling results show that variations due to uncertainties in the input data are mainly introduced by the uncertain material properties and by the missing SHmax magnitude estimates needed for a more reliable model calibration. This leads to the conclusion that, at this stage, the model's reliability depends only on the amount and quality of available stress information rather than on the modelling technique itself or on local details of the model geometry. Any improvements in modelling and increases in model reliability can only be achieved using more high-quality data for calibration.
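The slip tendency mentioned above can be illustrated with a short sketch. The stress tensor and fault orientation below are hypothetical, and the formula is the standard definition (resolved shear stress divided by normal stress); the study's actual fracture-potential calculation is not reproduced here.

```python
import numpy as np

def fault_stresses(stress_tensor, normal):
    """Resolve a 3-D stress tensor onto a fault plane given its unit normal."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    traction = stress_tensor @ n
    sigma_n = float(traction @ n)                         # normal stress
    tau = float(np.linalg.norm(traction - sigma_n * n))   # resolved shear stress
    return sigma_n, tau

def slip_tendency(stress_tensor, normal):
    """Slip tendency ST = tau / sigma_n (Morris et al., 1996)."""
    sigma_n, tau = fault_stresses(stress_tensor, normal)
    return tau / sigma_n

# Hypothetical principal stresses in MPa (compression positive),
# aligned with the coordinate axes for simplicity.
stress = np.diag([60.0, 45.0, 30.0])
# Fault plane dipping 60 degrees, with its normal in the x-z plane.
dip = np.radians(60.0)
normal = [np.sin(dip), 0.0, np.cos(dip)]

st = slip_tendency(stress, normal)
print(f"slip tendency: {st:.2f}")
```

A value close to the frictional strength of the fault (often taken near 0.6) would indicate a fault close to reactivation.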
The polar and subtropical jet streams are strong upper-level winds with a crucial influence on weather throughout the Northern Hemisphere midlatitudes. In particular, the polar jet is located between cold arctic air to the north and warmer subtropical air to the south. Strongly meandering states therefore often lead to extreme surface weather.
Some algorithms exist that can detect the jets' 2-D (latitude and longitude) core positions around the hemisphere, but all of them rely on a minimum wind-speed threshold to identify the subtropical and polar jet streams. This is particularly problematic for the polar jet stream, whose wind velocities can change rapidly from very weak to very high values and vice versa.
We develop a network-based scheme using Dijkstra's shortest-path algorithm to detect the polar and subtropical jet stream core. This algorithm not only considers the commonly used wind strength for core detection but also takes wind direction and climatological latitudinal position into account. Furthermore, it distinguishes between polar and subtropical jet, and between separate and merged jet states.
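As an illustration of the core idea (a sketch, not the authors' implementation): treat grid points as graph nodes, make edges cheap to traverse where the wind is strong, and let Dijkstra's shortest path trace the jet core eastward across the grid. The toy wind field and the weighting function below are hypothetical.

```python
import heapq
import numpy as np

def jet_core_path(wind):
    """Trace a jet core across all longitudes with Dijkstra's algorithm.

    wind: 2-D array (n_lat, n_lon) of wind speeds.  Edges run from each
    grid point to the same or an adjacent latitude at the next longitude;
    strong winds give cheap edges, so the shortest path follows the jet.
    """
    n_lat, n_lon = wind.shape
    cost = 1.0 / (wind + 1e-6)          # cheap where the wind is strong
    dist = np.full((n_lat, n_lon), np.inf)
    prev = {}
    heap = []
    for i in range(n_lat):              # the path may start at any latitude
        dist[i, 0] = cost[i, 0]
        heapq.heappush(heap, (dist[i, 0], i, 0))
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i, j] or j == n_lon - 1:
            continue
        for di in (-1, 0, 1):           # step east, staying near in latitude
            ni = i + di
            if 0 <= ni < n_lat and d + cost[ni, j + 1] < dist[ni, j + 1]:
                dist[ni, j + 1] = d + cost[ni, j + 1]
                prev[(ni, j + 1)] = (i, j)
                heapq.heappush(heap, (dist[ni, j + 1], ni, j + 1))
    end = int(np.argmin(dist[:, -1]))   # backtrack from the cheapest endpoint
    path, node = [], (end, n_lon - 1)
    while node in prev:
        path.append(node)
        node = prev[node]
    path.append(node)
    return path[::-1]                   # list of (lat_index, lon_index)

# Toy wind field: a jet meandering between latitude rows 2 and 4.
wind = np.ones((7, 8))
jet_lats = [2, 2, 3, 4, 4, 3, 2, 2]
for j, i in enumerate(jet_lats):
    wind[i, j] = 50.0
path = jet_core_path(wind)
print([i for i, _ in path])  # prints: [2, 2, 3, 4, 4, 3, 2, 2]
```

The full scheme additionally weights wind direction and climatological latitude, which would enter the edge-cost function in the same way.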
The parameter values of the detection scheme are optimized using simulated annealing and a skill function that accounts for the zonal-mean jet stream position (Rikus, 2015). After the successful optimization process, we apply our scheme to reanalysis data covering 1979-2015 and calculate seasonal-mean probabilistic maps and trends in wind strength and position of jet streams.
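A generic simulated-annealing loop of the kind used for such parameter optimization can be sketched as follows; the stand-in skill function below is hypothetical, whereas the real one scores agreement with the zonal-mean jet position.

```python
import math
import random

def simulated_annealing(skill, x0, step, n_iter=2000, t0=1.0, seed=0):
    """Maximize a skill function by simulated annealing.

    skill: function of a parameter vector, higher is better.
    step:  Gaussian proposal width for each parameter.
    """
    rng = random.Random(seed)
    x = list(x0)
    s = skill(x)
    best_x, best_s = x[:], s
    for k in range(n_iter):
        t = t0 * (1.0 - k / n_iter) + 1e-9        # linear cooling schedule
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        sc = skill(cand)
        # Always accept improvements; accept worse moves with
        # probability exp((sc - s) / t), which shrinks as t cools.
        if sc > s or rng.random() < math.exp((sc - s) / t):
            x, s = cand, sc
            if s > best_s:
                best_x, best_s = x[:], s
    return best_x, best_s

# Hypothetical stand-in skill, peaked at (2, -1).
skill = lambda p: -((p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2)
params, score = simulated_annealing(skill, x0=[0.0, 0.0], step=0.3)
print([round(p, 1) for p in params])
```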
We present longitudinally defined probability distributions of the positions of both jets for all Northern Hemisphere seasons. These show that winter is characterized by two well-separated jets over Europe and Asia (ca. 20 degrees W to 140 degrees E). In contrast, summer normally has a single merged jet over the western hemisphere but can have both merged and separated jet states in the eastern hemisphere.
With this algorithm it is possible to investigate the position of the jets' cores around the hemisphere, making it well suited to analysing jet stream patterns in observations and models and enabling more advanced model validation.
Extreme weather events are likely to occur more often under climate change, and the resulting effects on ecosystems could lead to a further acceleration of climate change. However, not all extreme weather events lead to extreme ecosystem responses. Here, we focus on hazardous ecosystem behaviour and identify coinciding weather conditions. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and climate conditions. Following risk assessment terminology, vulnerability and risk for the previously defined hazard are estimated on the basis of observed hazardous ecosystem behaviour.
We apply this approach to extreme responses of terrestrial ecosystems to drought, defining the hazard as a negative net biome productivity over a 12-month period. We show an application for two selected sites using data for 1981-2010 and then apply the method to the pan-European scale for the same period, based on numerical modelling results (LPJmL for ecosystem behaviour; ERA-Interim data for climate).
Our site-specific results demonstrate the applicability of the proposed method, using the standardized precipitation-evapotranspiration index (SPEI) to describe the climate condition. The site in Spain provides an example of vulnerability to drought because the expected value of the SPEI is 0.4 lower for hazardous than for non-hazardous ecosystem behaviour. By contrast, the site in northern Germany is not vulnerable to drought because the SPEI expectation values imply wetter conditions in the hazard case than in the non-hazard case.
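The comparison of SPEI expectation values under hazardous and non-hazardous conditions can be sketched on synthetic data (hypothetical series, not the LPJmL or ERA-Interim results):

```python
import numpy as np

def drought_vulnerability(spei, nbp):
    """Conditional-expectation sketch of the risk assessment above.

    A time step is 'hazardous' when net biome productivity (nbp) summed
    over the preceding 12 months is negative.  The site is flagged as
    vulnerable to drought when the expected 12-month SPEI under hazard
    is lower (drier) than under non-hazard.
    """
    nbp12 = np.convolve(nbp, np.ones(12), mode="valid")           # 12-month sums
    spei12 = np.convolve(spei, np.ones(12) / 12.0, mode="valid")  # 12-month means
    hazard = nbp12 < 0.0
    e_hazard = spei12[hazard].mean()
    e_normal = spei12[~hazard].mean()
    return e_hazard, e_normal, e_hazard < e_normal

# Synthetic monthly series: productivity drops in months with
# strongly negative SPEI, so hazard coincides with dry conditions.
rng = np.random.default_rng(1)
spei = rng.normal(0.0, 1.0, 240)
nbp = 3.0 + 10.0 * np.clip(spei, None, 0.0) + rng.normal(0.0, 1.0, 240)
e_h, e_n, vulnerable = drought_vulnerability(spei, nbp)
print(f"E[SPEI | hazard] = {e_h:.2f}, E[SPEI | no hazard] = {e_n:.2f}")
```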
At the pan-European scale, ecosystem vulnerability to drought is identified in the Mediterranean and temperate regions, whereas Scandinavian ecosystems are vulnerable under conditions without water shortages. These first model-based applications indicate the conceptual advantages of the proposed method by focusing on the identification of critical weather conditions for which we observe hazardous ecosystem behaviour in the analysed data set. Application of the method to empirical time series and to future climate would be important next steps to test the approach.
In recent decades, the Greenland Ice Sheet has been losing mass and has thereby contributed to global sea-level rise. The rate of ice loss is highly relevant for coastal protection worldwide and is likely to increase under future warming. Beyond a critical temperature threshold, a meltdown of the Greenland Ice Sheet is induced by the self-reinforcing feedback between its lowering surface elevation and its increasing surface mass loss: the more ice that is lost, the lower the ice surface and the warmer the surface air temperature, which fosters further melting and ice loss. The computation of this rate so far relies on complex numerical models, which are the appropriate tools for capturing the complexity of the problem. By contrast, we aim here at gaining a conceptual understanding by deriving a purposefully simple equation for the self-reinforcing feedback, which is then used to estimate the melt time for different levels of warming using three observable characteristics of the ice sheet itself and its surroundings. The analysis is purely conceptual in nature and lacks important processes, such as ice dynamics, that would be needed for applications to sea-level rise on centennial timescales. If, however, the volume loss is dominated by the feedback, the resulting logarithmic equation unifies existing numerical simulations and shows that the melt time depends strongly on the level of warming, with a critical slow-down near the threshold: the median time to lose 10% of the present-day ice volume varies between about 3500 years for a temperature level of 0.5 degrees C above the threshold and 500 years for 5 degrees C. Unless future observations show a significantly higher melting sensitivity than currently observed, a complete meltdown is unlikely within the next 2000 years without significant ice-dynamical contributions.
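The feedback can be caricatured with a one-line ODE in which surface lowering warms the surface through the lapse rate. All parameter values below are hypothetical and chosen only to show the logarithmic dependence and the critical slow-down, not to reproduce the study's calibrated melt times.

```python
import math

C = 0.2        # melt sensitivity: metres of surface lowering per yr per deg C (hypothetical)
GAMMA = 0.007  # near-surface lapse rate, deg C per metre (hypothetical)
H0 = 3000.0    # initial surface elevation, metres (hypothetical)

def melt_time(dT, frac=0.10):
    """Years to lose a fraction `frac` of the initial elevation.

    From dh/dt = -C * (dT + GAMMA * (H0 - h)) one obtains the logarithmic
    closed form t = ln(1 + GAMMA * dh / dT) / (C * GAMMA): the melt time
    diverges as dT -> 0, i.e. a critical slow-down near the threshold.
    """
    dh = frac * H0
    return math.log(1.0 + GAMMA * dh / dT) / (C * GAMMA)

def melt_time_numeric(dT, frac=0.10, dt=0.01):
    """Forward-Euler integration of the same ODE, as a cross-check."""
    h, t = H0, 0.0
    while h > (1.0 - frac) * H0:
        h -= C * (dT + GAMMA * (H0 - h)) * dt
        t += dt
    return t

for dT in (0.5, 5.0):
    print(dT, round(melt_time(dT)), round(melt_time_numeric(dT)))
```

With these toy values the melt time for 0.5 degrees C above the threshold is several times longer than for 5 degrees C, reproducing the qualitative slow-down described above.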
Air pollution is the number one environmental cause of premature deaths in Europe. Despite extensive regulations, air pollution remains a challenge, especially in urban areas. To study summertime air quality in the Berlin-Brandenburg region of Germany, the Weather Research and Forecasting Model with Chemistry (WRF-Chem) is set up and evaluated against meteorological and air quality observations from monitoring stations as well as from a field campaign conducted in 2014. The objective is to assess which resolution and level of detail in the input data is needed for simulating urban background air pollutant concentrations and their spatial distribution in the Berlin-Brandenburg area. The model setup includes three nested domains with horizontal resolutions of 15, 3 and 1 km and anthropogenic emissions from the TNO-MACC III inventory. We use RADM2 chemistry and the MADE/SORGAM aerosol scheme. Three sensitivity simulations are conducted: updating input parameters to the single-layer urban canopy model based on structural data for Berlin, specifying land use classes on a sub-grid scale (mosaic option) and downscaling the original emissions to a resolution of ca. 1 km x 1 km for Berlin based on proxy data including traffic density and population density. The results show that the model simulates meteorology well, though urban 2 m temperature and urban wind speeds are biased high, and nighttime mixing layer height is biased low in the base run with the settings described above. We show that the simulation of urban meteorology can be improved when specifying the input parameters to the urban model, and to a lesser extent when using the mosaic option. On average, ozone is simulated reasonably well, but maximum daily 8 h mean concentrations are underestimated, which is consistent with the results from previous modelling studies using the RADM2 chemical mechanism. Particulate matter is underestimated, which is partly due to an underestimation of secondary organic aerosols.
NOx (NO + NO2) concentrations are simulated reasonably well on average, but nighttime concentrations are overestimated due to the model's underestimation of the mixing layer height, and urban daytime concentrations are underestimated. The daytime underestimation is improved when using the downscaled, and thus locally higher, emissions, suggesting that part of this bias is due to deficiencies in the emission input data and their resolution. The results further demonstrate that a horizontal resolution of 3 km improves the results and spatial representativeness of the model compared to a horizontal resolution of 15 km. With the input data (land use classes, emissions) at the level of detail of the base run of this study, we find that a horizontal resolution of 1 km does not improve the results compared to a resolution of 3 km. However, our results suggest that a 1 km horizontal model resolution could enable a detailed simulation of local pollution patterns in the Berlin-Brandenburg region if the urban land use classes, together with the respective input parameters to the urban canopy model, are specified with a higher level of detail and if urban emissions of higher spatial resolution are used.
Inventories of individually delineated landslides are key to understanding landslide physics and mitigating their impact. They permit assessment of area–frequency distributions and landslide volumes, and testing of statistical correlations between landslides and physical parameters such as topographic gradient or seismic strong motion. Amalgamation, i.e. the mapping of several adjacent landslides as a single polygon, can lead to potentially severe distortion of the statistics of these inventories. This problem can be especially severe in data sets produced by automated mapping. We present five inventories of earthquake-induced landslides mapped with different materials and techniques and affected by varying degrees of amalgamation. Errors on the total landslide volume and the power-law exponent of the area–frequency distribution resulting from amalgamation can reach 200% and 50%, respectively. We present an algorithm, based on image and digital elevation model (DEM) analysis, for automatic identification of amalgamated polygons. On a set of about 2000 polygons larger than 1000 m², tracing landslides triggered by the 1994 Northridge earthquake, the algorithm performs well, with only 2.7–3.6% incorrectly amalgamated landslides missed and 3.9–4.8% correct polygons incorrectly identified as amalgams. This algorithm can be used broadly to check landslide inventories and allow faster correction by automating the identification of amalgamation.
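The sensitivity of the power-law exponent to amalgamation can be illustrated with a standard maximum-likelihood estimator on synthetic areas; the estimator (the continuous MLE) and the crude pairwise merging below are illustrative choices, not the study's method.

```python
import math
import random

def powerlaw_exponent(areas, a_min):
    """Maximum-likelihood estimate of alpha for a power-law
    area-frequency distribution p(A) ~ A**(-alpha), A >= a_min
    (the continuous MLE, alpha = 1 + n / sum(ln(A_i / a_min)))."""
    tail = [a for a in areas if a >= a_min]
    return 1.0 + len(tail) / sum(math.log(a / a_min) for a in tail)

# Synthetic landslide areas drawn from a power law with alpha = 2.4
# via inverse-transform sampling (hypothetical data).
rng = random.Random(42)
alpha_true, a_min = 2.4, 1000.0
areas = [a_min * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
         for _ in range(5000)]
alpha_hat = powerlaw_exponent(areas, a_min)

# Crude amalgamation: merge consecutive pairs into single polygons.
# The merged inventory has fewer, larger polygons, which biases the
# estimated exponent low.
merged = [areas[i] + areas[i + 1] for i in range(0, len(areas) - 1, 2)]
alpha_merged = powerlaw_exponent(merged, a_min)
print(round(alpha_hat, 2), round(alpha_merged, 2))
```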
The oldest ice core records are obtained from the East Antarctic Plateau. Water isotopes are key proxies for reconstructing past climatic conditions over the ice sheet and at the evaporation source. The accuracy of climate reconstructions depends on knowledge of all processes affecting water vapour, precipitation and snow isotopic compositions. Fractionation processes are well understood and can be integrated in trajectory-based Rayleigh distillation and isotope-enabled climate models. However, a quantitative understanding of processes potentially altering snow isotopic composition after deposition is still missing. At low-accumulation sites, such as those found in East Antarctica, these poorly constrained processes are likely to play a significant role and limit the interpretability of an ice core's isotopic composition.
By combining observations of isotopic composition in vapour, precipitation, surface snow and buried snow from Dome C, a deep ice core site on the East Antarctic Plateau, we found indications of a seasonal impact of metamorphism on the surface snow isotopic signal compared to the initial precipitation. In summer particularly, exchanges of water molecules between vapour and snow are driven by the diurnal sublimation-condensation cycles. Overall, we observe modification of the surface snow isotopic composition between precipitation events. Using high-resolution water isotopic composition profiles from snow pits at five Antarctic sites with different accumulation rates, we identified common patterns which cannot be attributed to the seasonal variability of precipitation. These differences in the precipitation, surface snow and buried snow isotopic composition provide evidence of post-deposition processes affecting ice core records in low-accumulation areas.
Brief communication (2015)
Accelerating climate change and increased economic and environmental interests in permafrost-affected regions have resulted in an acute need for more directed permafrost research. In June 2014, 88 early career researchers convened to identify future priorities for permafrost research. This multidisciplinary forum concluded that five research topics deserve greatest attention: permafrost landscape dynamics, permafrost thermal modeling, integration of traditional knowledge, spatial distribution of ground ice, and engineering issues. These topics underline the need for integrated research across a spectrum of permafrost-related domains and constitute a contribution to the Third International Conference on Arctic Research Planning (ICARP III).
Brief communication (2016)
In March 2015, a new international blueprint for disaster risk reduction (DRR) was adopted in Sendai, Japan, at the end of the Third UN World Conference on Disaster Risk Reduction (WCDRR, 14-18 March 2015). We review the agreed commitments and targets, discuss the negotiations leading to the Sendai Framework for DRR (SF-DRR), and briefly consider its implications for the subsequent UN-led negotiations on the sustainable development goals and climate change.