Improving nitrogen (N) status in European water bodies is a pressing issue. N levels depend not only on current but also on past N inputs to the landscape, which have accumulated over time in legacy stores (e.g., soil, groundwater).
Catchment-scale N models, which are commonly used to investigate in-stream N levels, rarely examine the magnitude and dynamics of legacy components.
This study aims to gain a better understanding of the long-term fate of N inputs and its uncertainties, using a legacy-driven N model (ELEMeNT) in Germany's largest national river basin (Weser; 38,450 km²) over the period 1960-2015.
We estimate the nine model parameters with a progressive constraining strategy in order to assess the value of different observational data sets.
We demonstrate that, beyond in-stream N loading, soil N content and in-stream N concentration help to reduce the equifinality in model parameterizations.
We find that more than 50% of the N surplus denitrifies (1480-2210 kg ha⁻¹) and that stream export amounts to around 18% (410-640 kg ha⁻¹), leaving behind around 230-780 kg ha⁻¹ of N in the (soil) source zone and 10-105 kg ha⁻¹ in the subsurface.
A sensitivity analysis reveals the importance of different factors affecting the residual uncertainties in simulated N legacies, namely the hydrologic travel time, denitrification rates, a coefficient characterizing the protection of organic N in the source zone, and the N surplus input.
Our study calls for proper consideration of uncertainties in N legacy characterization, and discusses possible avenues to further reduce the equifinality in water quality modeling.
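To make the progressive constraining idea concrete, the sketch below filters a Monte Carlo ensemble of parameter sets through successive observational constraints and reports how the behavioral ensemble shrinks at each step. This is a minimal illustration, not the ELEMeNT code: the toy simulator, parameter names, observation targets, and tolerances are all hypothetical.

```python
# Minimal sketch of progressive constraining (not the actual ELEMeNT model).
# A toy simulator stands in for the catchment model; parameter ranges,
# observation values, and tolerances are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Sample candidate parameter sets from uniform priors (toy ranges).
params = {
    "denit_rate": rng.uniform(0.01, 0.5, n_samples),    # yr^-1
    "travel_time": rng.uniform(1.0, 40.0, n_samples),   # yr
    "protection_coef": rng.uniform(0.0, 1.0, n_samples),
}

def toy_model(denit_rate, travel_time, protection_coef):
    """Stand-in simulator returning three diagnostics per parameter set."""
    n_load = 30.0 * np.exp(-denit_rate * travel_time)       # kg ha-1 yr-1
    soil_n = 500.0 * protection_coef + 5.0 * travel_time    # kg ha-1
    n_conc = n_load / (0.3 + 0.01 * travel_time)            # mg L-1 (toy)
    return n_load, soil_n, n_conc

n_load, soil_n, n_conc = toy_model(**params)

# Progressive constraints: each observational data set (with a tolerance)
# further trims the behavioral parameter ensemble.
constraints = [
    ("in-stream N load", np.abs(n_load - 12.0) < 3.0),
    ("soil N content", np.abs(soil_n - 350.0) < 75.0),
    ("in-stream N concentration", np.abs(n_conc - 25.0) < 5.0),
]

behavioral = np.ones(n_samples, dtype=bool)
for name, ok in constraints:
    behavioral &= ok
    spread = (np.ptp(params["denit_rate"][behavioral])
              if behavioral.any() else 0.0)
    print(f"after {name:28s}: {behavioral.sum():5d} behavioral sets, "
          f"denit_rate spread = {spread:.3f}")
```

A shrinking spread of a parameter across the behavioral ensemble after each constraint is one simple way to quantify the reduction in equifinality that the abstract reports.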
In-depth understanding of the potential implications of climate change is required to guide decision- and policy-makers when developing adaptation strategies and designing infrastructure suitable for future conditions. Impact models that translate potential future climate conditions into variables of interest are needed to create the causal connection between a changing climate and its impacts on different sectors. Recent surveys suggest that the primary strategy for validating such models (and hence for justifying their use) relies heavily on assessing the accuracy of model simulations by comparing them against historical observations. We argue that such a comparison is necessary and valuable, but not sufficient to achieve a comprehensive evaluation of climate change impact models. We believe that a complementary, largely observation-independent step of model evaluation is needed to ensure more transparency of model behavior and greater robustness of scenario-based analyses. This step should address the following four questions: (1) Do modeled dominant process controls match our system perception? (2) Is my model's sensitivity to changing forcing as expected? (3) Do modeled decision levers show adequate influence? (4) Can we attribute uncertainty sources throughout the projection horizon? We believe that global sensitivity analysis, with its ability to investigate a model's response to joint variations of multiple inputs in a structured way, offers a coherent approach to address all four questions comprehensively. Such additional model evaluation would strengthen stakeholder confidence in model projections and, therefore, in the adaptation strategies derived with the help of impact models.
This article is categorized under: Climate Models and Modeling > Knowledge Generation with Models; Assessing Impacts of Climate Change > Evaluating Future Impacts of Climate Change
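As a concrete illustration of the variance-based global sensitivity analysis advocated in the abstract above, the sketch below computes first-order and total-order Sobol indices for a toy impact model with two climate forcings and one decision lever, loosely mirroring questions (2) and (3). It assumes the SALib package (classic saltelli/sobol interface); the impact model, its input names, and their ranges are hypothetical.

```python
# Minimal sketch of variance-based global sensitivity analysis (Sobol indices)
# with SALib. The impact model below is a hypothetical stand-in, not any
# published climate change impact model.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["delta_T", "delta_P", "levee_height"],  # forcings + decision lever
    "bounds": [[0.0, 4.0],    # warming (K)
               [-0.2, 0.2],   # relative precipitation change
               [0.0, 2.0]],   # adaptation lever (m)
}

def impact_model(dT, dP, levee):
    """Toy flood-damage response: grows with forcing, damped by the lever."""
    hazard = (1.0 + dP) * np.exp(0.3 * dT)
    return np.maximum(hazard - 0.5 * levee, 0.0)

# Saltelli sampling explores joint variations of all inputs in a structured way.
X = saltelli.sample(problem, 1024)
Y = np.array([impact_model(*row) for row in X])

Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    # S1: first-order effect; ST: total effect including interactions.
    print(f"{name:13s}  S1 = {s1:5.2f}   ST = {st:5.2f}")
```

In this framing, a negligible total-order index for the decision lever would flag question (3): the modeled lever exerts no meaningful influence on the outcome, which should prompt scrutiny of the model before it is used to compare adaptation strategies.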