High precipitation quantiles tend to rise with temperature, following the so-called Clausius-Clapeyron (CC) scaling. It is often reported that the CC-scaling relation breaks down and even reverts for very high temperatures. In our study, we investigate this reversal using observational climate data from 142 stations across Germany. One of the suggested meteorological explanations for the breakdown is limited moisture supply. Here we argue that, instead, it could simply originate from undersampling. As rainfall frequency generally decreases with higher temperatures, rainfall intensities as dictated by CC scaling are less likely to be recorded than for moderate temperatures. Empirical quantiles are conventionally estimated from order statistics via various forms of plotting position formulas. They have in common that their largest representable return period is given by the sample size. In small samples, high quantiles are underestimated accordingly. The small-sample effect is weaker, or disappears completely, when using parametric quantile estimates from a generalized Pareto distribution (GPD) fitted with L moments. For those, we obtain quantiles of rainfall intensities that continue to rise with temperature.
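The contrast drawn above between plotting-position and parametric quantile estimates can be sketched as follows. The probability-weighted-moment estimators, the Hosking GPD parameterization, and all numbers below are textbook illustrations, not necessarily the study's exact choices:

```python
import numpy as np

def lmoments(x):
    """First two sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum(np.arange(n) / (n - 1) * x) / n   # unbiased PWM beta_1
    return b0, 2.0 * b1 - b0                      # l1, l2

def gpd_fit_lmoments(x):
    """Fit a GPD, F(x) = 1 - (1 - k*x/alpha)**(1/k), by L-moments."""
    l1, l2 = lmoments(x)
    k = l1 / l2 - 2.0        # shape (k > 0: bounded upper tail)
    alpha = (1.0 + k) * l1   # scale
    return k, alpha

def gpd_quantile(p, k, alpha):
    """Quantile x_p with F(x_p) = p for the fitted GPD."""
    return alpha / k * (1.0 - (1.0 - p) ** k)

# synthetic exceedances from a known GPD (inverse-transform sampling)
rng = np.random.default_rng(0)
k_true, a_true = 0.1, 2.0
sample = a_true / k_true * (1.0 - (1.0 - rng.uniform(size=50)) ** k_true)

k, alpha = gpd_fit_lmoments(sample)
p = 0.999   # return period 1000 >> sample size 50
q_par = gpd_quantile(p, k, alpha)   # parametric: may exceed the sample maximum
q_emp = np.quantile(sample, p)      # empirical: capped near the sample maximum
```

With 50 values, the empirical 99.9% quantile cannot exceed the largest observation, which is the undersampling effect described above; the L-moment GPD fit extrapolates beyond it.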
A conundrum of trends
(2022)
This comment is meant to reiterate two warnings: one concerns the uncritical use of ready-made (openly available) program packages, and one the estimation of trends in serially correlated time series. Both warnings apply to the recent publication by Lischeid et al. on lake-level trends in Germany.
Indices of oscillatory behavior are conveniently obtained by projecting the fields in question into a phase space of a few (mostly just two) dimensions; empirical orthogonal functions (EOFs) or other, more dynamical, modes are typically used for the projection. If sufficiently coherent and in quadrature, the projected variables simply describe a rotating vector in the phase space, which then serves as the basis for predictions. Using the boreal summer intraseasonal oscillation (BSISO) as a test case, an alternative procedure is introduced: it augments the original fields with their Hilbert transform (HT) to form a complex series and projects it onto its (single) dominant EOF. The real and imaginary parts of the corresponding complex pattern and index are compared with those of the original (real) EOF. The new index explains slightly less variance of the physical fields than the original, but it is much more coherent, partly from its use of future information by the HT. Because the latter precludes real-time monitoring, the index can only be used in cases with predicted physical fields, for which it promises to be superior. By developing a causal approximation of the HT, a real-time variant of the index is obtained whose coherency is comparable to the noncausal version, but with smaller explained variance of the physical fields. In test cases the new index compares well to other indices of BSISO. The potential for using both indices as an alternative is discussed.
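A minimal sketch of the Hilbert-augmented EOF idea on a synthetic travelling wave; the toy field, its period, and the FFT-based transform are illustrative assumptions, not the BSISO data or the paper's exact processing:

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal x + i*H(x) along axis 0
    (same construction as scipy.signal.hilbert)."""
    n = x.shape[0]
    Xf = np.fft.fft(x, axis=0)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(Xf * h[:, None], axis=0)

# toy "field": an eastward-travelling wave (period 45) plus noise, time x space
rng = np.random.default_rng(1)
t = np.arange(400)
s = np.arange(20)
field = np.cos(2 * np.pi * (t[:, None] / 45.0 - s[None, :] / 20.0))
field = field + 0.3 * rng.standard_normal(field.shape)

# complex-augmented anomalies and their (single) dominant EOF via SVD
Z = analytic_signal(field - field.mean(axis=0))
U, S, Vh = np.linalg.svd(Z, full_matrices=False)
index = U[:, 0] * S[0]             # complex index; Re/Im are in quadrature
pattern = Vh[0]                    # complex spatial pattern
explained = S[0] ** 2 / np.sum(S ** 2)

# phase increments of the rotating vector, ~ 2*pi/45 per step for this wave
dphi = np.angle(index[1:] * np.conj(index[:-1]))
```

For a coherent travelling wave the complex field is nearly rank one, so a single complex EOF captures both quadrature components that a real EOF analysis would split across two modes.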
A digital filter is introduced which treats the problem of predictability versus time averaging in a continuous, seamless manner. This seamless filter (SF) is characterized by a unique smoothing rule that determines the strength of smoothing as a function of lead time. The rule needs to be specified beforehand, either by expert knowledge or by user demand. As a result, skill curves are obtained that allow a predictability assessment across a whole range of time-scales, from daily to seasonal, in a uniform manner. The SF is applied to downscaled SEAS5 ensemble forecasts for two focus regions in or near the tropical belt, the river basins of the Karun in Iran and the São Francisco in Brazil. Both are characterized by strong seasonality and semi-aridity, so that predictability across various time-scales is in high demand. Among other things, it is found that from the start of the water year (autumn), areal precipitation is predictable with good skill for the Karun basin two and a half months ahead; for the São Francisco it is only one month, and longer-term prediction skill is just above the critical level.
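The paper's own smoothing rule is not reproduced here; as a toy illustration of lead-time-dependent smoothing, the sketch below widens a moving-average window linearly with lead time (the linear rule, the window form, and all numbers are assumptions):

```python
import numpy as np

def seamless_smooth(forecast, window_rule):
    """Smooth a daily forecast with a lead-time-dependent window.

    window_rule(lead) -> half-width (in days) of the averaging window,
    so short leads keep daily detail while long leads approach
    monthly/seasonal means."""
    n = len(forecast)
    out = np.empty(n)
    for lead in range(n):
        w = int(window_rule(lead))
        lo, hi = max(0, lead - w), min(n, lead + w + 1)
        out[lead] = forecast[lo:hi].mean()
    return out

# toy 180-day forecast: seasonal cycle plus daily noise
rng = np.random.default_rng(2)
fc = np.sin(np.arange(180) / 20.0) + 0.5 * rng.standard_normal(180)

# assumed rule: half-width grows by one day per six days of lead time
smoothed = seamless_smooth(fc, lambda lead: 1 + lead // 6)
```

Skill computed against observations smoothed by the same rule would then yield one continuous skill curve from daily to seasonal scales, which is the seamless idea described above.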
The literature contains a sizable number of publications where weather types are used to decompose climate shifts or trends into contributions of frequency and mean of those types. They are all based on the product rule, that is, a transformation of a product of sums into a sum of products, the latter providing the decomposition. While there is nothing wrong with the transformation itself, its interpretation as a climate shift or trend decomposition is bound to fail. The case of a climate shift may be viewed as an incomplete description of a more complex behaviour, but trend decomposition indeed produces bogus trends, as demonstrated by a synthetic counterexample with well-defined trends in type frequency and mean. Consequently, decompositions based on that transformation, be it for climate shifts or trends, must not be used.
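The product rule in question can be illustrated numerically; the midpoint form below is one common exact variant, and the numbers are invented. The point of the abstract is that the identity always holds algebraically, which is precisely why its two terms invite, but do not justify, a causal interpretation:

```python
import numpy as np

# two periods: frequencies f and within-type means mu for 3 weather types
f1 = np.array([0.5, 0.3, 0.2]); mu1 = np.array([2.0, 5.0, 9.0])
f2 = np.array([0.4, 0.3, 0.3]); mu2 = np.array([2.5, 5.5, 8.0])

shift = f2 @ mu2 - f1 @ mu1               # total climate shift

# product-rule decomposition (midpoint form, algebraically exact):
freq_term = (f2 - f1) @ (mu1 + mu2) / 2   # attributed to frequency changes
mean_term = (f1 + f2) @ (mu2 - mu1) / 2   # attributed to within-type changes

# freq_term + mean_term == shift (0.625 + 0.125 = 0.75), by construction
```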
Comment on "Bias correction, quantile mapping, and downscaling: revisiting the inflation issue"
(2014)
In a recent paper, Maraun describes the adverse effects of quantile mapping on downscaling. He argues that when large-scale GCM variables are rescaled directly to small-scale fields or even station data, genuine small-scale covariability is lost and replaced by uniform variability inherited from the larger scales. This leads to a misrepresentation mainly of areal means and long-term trends. This comment acknowledges the former point, although the argument is relatively old, but disagrees with the latter, showing that grid-size long-term trends can be different from local trends. Finally, some clarification is added regarding the inflation issue, which is partly incorrectly addressed, stressing that neither randomization nor inflation is free of unverified assumptions.
Two lines of research are combined in this study: first, the development of tools for the temporal disaggregation of precipitation, and second, some newer results on the exponential scaling of heavy short-term precipitation with temperature, roughly following the Clausius-Clapeyron (CC) relation. Because they carry no explicit temperature dependence, the traditional disaggregation schemes are shown to lack the crucial CC-type scaling. The authors introduce a proof-of-concept adjustment of an existing disaggregation tool, the multiplicative cascade model of Olsson, and show that, in principle, it is possible to include temperature dependence in the disaggregation step, resulting in a fairly realistic temperature dependence of the CC type. They conclude by outlining the main calibration steps necessary to develop a full-fledged CC disaggregation scheme and discuss possible applications.
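The branching idea behind a multiplicative cascade can be sketched as follows, with a hypothetical temperature-dependent weight distribution as a stand-in for the proposed adjustment; the Beta weights, the T-dependence of their parameter, and all numbers are illustrative assumptions, not Olsson's calibrated model:

```python
import numpy as np

def cascade_disaggregate(daily_total, levels, temp, rng, gamma=0.0):
    """Multiplicative random cascade: split a daily total into 2**levels
    sub-intervals.  At each branching the mass is divided between the two
    halves with weights (W, 1-W).  gamma > 0 makes the splits more uneven
    at higher temperature, concentrating rain into shorter bursts
    (hypothetical stand-in for the CC-type adjustment)."""
    amounts = np.array([daily_total])
    for _ in range(levels):
        # Beta(a, a) weights; smaller 'a' -> more intermittent splitting
        a = max(0.1, 1.0 - gamma * (temp - 15.0) / 10.0)
        W = rng.beta(a, a, size=len(amounts))
        amounts = np.column_stack([W * amounts, (1 - W) * amounts]).ravel()
    return amounts

rng = np.random.default_rng(3)
# disaggregate a 20 mm day into 2**7 = 128 intervals (~11 min each)
cool = cascade_disaggregate(20.0, 7, temp=10.0, rng=rng, gamma=0.5)
warm = cascade_disaggregate(20.0, 7, temp=25.0, rng=rng, gamma=0.5)
```

Mass is conserved at every branching, so the daily total is exactly preserved; the temperature only redistributes it, which mirrors the idea of adding a CC-type dependence to the disaggregation step without changing daily precipitation.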
Estimates of present and future extreme sub-hourly rainfall are derived from a daily spatial followed by a sub-daily temporal downscaling, the latter of which incorporates a novel, and crucial, temperature sensitivity. Specifically, daily global climate fields are spatially downscaled to local temperature T and precipitation P, which are then disaggregated to a temporal resolution of 10 min using a multiplicative random cascade model. The scheme is calibrated and validated with a group of 21 station records of 10-min resolution in Germany. The cascade model is used in the classical (denoted as MC) and in the new T-sensitive (MC+) version, which respects local Clausius-Clapeyron (CC) effects such as CC scaling. Extreme P is positively biased in both MC versions. Observed T sensitivity is absent in MC but well reproduced by MC+. Long-term positive trends in extreme sub-hourly P are generally more pronounced and more significant in MC+ than in MC. In units of 10-min rainfall, observed centennial trends in annual exceedance counts (EC) of P > 5 mm are +29% and in 3-yr return levels (RL) +27%. For the RCP4.5-simulated future, higher extremes are projected in both versions MC and MC+: per century, EC increases by 30% for MC and by 83% for MC+; the RL rises by 14% for MC and by 33% for MC+. Because the projected daily P trends are negligible, the sub-daily signal is mainly driven by local temperature.
Extreme rain events of short duration, in the range of hours and below, are attracting increasing attention because of the flash-flood damage they cause and because of their possible intensification under anthropogenic climate change. Based on partly very long (> 50 years) time series of high temporal resolution (≤ 15 minutes), this study investigates possible trends in heavy-rainfall intensities for stations in the Swiss and Austrian Alpine regions and for the Emscher-Lippe region in North Rhine-Westphalia. It becomes clear that extreme precipitation intensities are increasing, which can be well explained by the warming of the regional climate: analyses of long-term trends in exceedance sums and return levels show considerable uncertainty but indicate an increase on the order of 30% per century. In addition, based on a "middle-of-the-road" climate simulation for the 21st century, this contribution describes a projection of extreme precipitation intensities at very high temporal resolution for selected stations of the Emscher-Lippe region. A coupled spatial and temporal downscaling is applied, whose decisive innovation is accounting for the dependence of local rain intensity on air temperature. The procedure involves two steps: first, large-scale climate fields at daily resolution are linked statistically by regression to the stations' temperature and precipitation values (spatial downscaling). In the second step, these station values are disaggregated to a temporal resolution of 10 minutes using a so-called multiplicative stochastic cascade model (MC) (temporal downscaling). The novel, temperature-sensitive variant additionally uses air temperature as an explanatory variable for the precipitation intensities. In this way, the higher atmospheric moisture content expected under warming, which follows from the Clausius-Clapeyron relation (CC), is incorporated into the temporal downscaling.
For the statistical evaluation of the extreme short-term precipitation, the upper quantiles (99.9%), exceedance sums (P > 5 mm), and 3-year return levels at durations of ≤ 15 minutes were considered. This choice allows the simultaneous analysis of extreme-value statistics and of their long-term trends; slight deviations from this choice affect the main results only marginally. Only by including temperature is the observed temperature dependence of the extreme quantiles (CC scaling) reproduced well. When observational data are compared with present-day simulations of the model cascade, the temperature-sensitive procedure yields consistent results. Compared with the developments of recent decades, similar or even stronger increases in extreme precipitation intensities are projected for the future. This is remarkable insofar as these appear to be driven mainly by local temperature, since the projected trends of daily precipitation totals are negligible for this region.