Reanalysis data and general circulation model outputs typically provide information at a coarse spatial resolution, which cannot be used directly for local impact studies. Downscaling methods have been developed to overcome this problem and to obtain local-scale information from large-scale atmospheric variables. Inferring local-scale extremes, however, remains a challenge. Here a probabilistic downscaling approach is presented in which the cumulative distribution functions (CDFs) of large- and local-scale extremes are linked by means of a transfer function. In this way, the CDF of the local-scale extremes is obtained for a projection period, and statistical characteristics, such as return levels, are inferred. The input series are assumed to follow an extreme value distribution, the generalized Pareto distribution (GPD). The GPD parameters are linked to further explanatory variables, hence defining a nonstationary model. The methodology (XCDF-t) results in a parametric CDF, which is itself a GPD. Realizations generated from this CDF provide confidence bands. The approach is applied to downscale National Centers for Environmental Prediction reanalysis precipitation in winter, yielding daily local precipitation at five stations in southern France. The calibration period 1951–1985 is used to infer precipitation over the validation period 1986–1999. The applicability of the approach is verified using observations, quantile-quantile plots, and the continuous ranked probability score. The stationary XCDF-t approach shows good results and outperforms the nonparametric CDF-t approach and quantile mapping for some stations. The inclusion of covariate information improves results only in some cases; covariates therefore have to be chosen with care.
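The quantile-mapping baseline mentioned in this abstract can be sketched in a few lines. Below is a minimal empirical version on synthetic data (the paper's XCDF-t method is parametric and GPD-based; all variable names and numbers here are illustrative): each projected large-scale value is passed through the large-scale calibration CDF, and the local-scale calibration CDF is inverted at that probability.

```python
import numpy as np

def quantile_map(large_cal, local_cal, large_proj):
    """Empirical quantile mapping: F_local^-1(F_large(x)) using
    calibration-period empirical CDFs."""
    # Empirical CDF value of each projected point within the large-scale calibration sample
    u = np.searchsorted(np.sort(large_cal), large_proj) / len(large_cal)
    # Invert the local-scale empirical CDF at those probabilities
    return np.quantile(local_cal, np.clip(u, 0.0, 1.0))

rng = np.random.default_rng(0)
large = rng.gamma(2.0, 3.0, size=5000)        # coarse-scale winter precipitation (toy)
local = 2.0 * rng.gamma(2.0, 3.0, size=5000)  # wetter local station (toy)
proj = rng.gamma(2.0, 3.0, size=1000)         # projection-period large-scale values
downscaled = quantile_map(large, local, proj)
```

The mapping is monotone by construction, so the downscaled series preserves the rank order of the large-scale input while adopting the local-scale distribution.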
Analyzing the behavior of heavy precipitation, high temperatures, and extremes of other environmental variables has become an important research topic for both hydrologists and climatologists. Extreme value theory provides a well-developed mathematical foundation for statistically modeling excesses above a high threshold. Practitioners often assume that those excesses approximately follow a generalized Pareto distribution. To infer the two parameters of this distribution, a variety of estimators have been proposed and studied. Among them, maximum likelihood estimation offers an elegant way to include covariates, but it requires imposing an explicit form on the parameters' dependence. When analyzing large data sets, this procedure can be too slow and sometimes produces aberrant values due to optimization problems. To overcome these drawbacks, a method based on probability weighted moments and kernel regression is proposed, tested, and applied to a Swiss daily precipitation data set. The method is implemented as a freely available R package.
Key Points
A novel nonparametric approach for climate extremes is proposed
The method is fast and flexible
Simulations and a real application show the potential of the method
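The probability-weighted-moment estimator at the core of this method has a closed form (after Hosking & Wallis, 1987). A minimal sketch on synthetic excesses, without the paper's kernel-regression layer for covariates:

```python
import numpy as np

def gpd_pwm(excesses):
    """Closed-form probability-weighted-moment estimator of the GPD
    parameters (sigma, xi), after Hosking & Wallis (1987)."""
    y = np.sort(np.asarray(excesses, dtype=float))
    n = len(y)
    a0 = y.mean()                                          # estimates E[Y]
    a1 = np.mean(y * (n - np.arange(1, n + 1)) / (n - 1))  # estimates E[Y(1 - F(Y))]
    sigma = 2.0 * a0 * a1 / (a0 - 2.0 * a1)
    xi = 2.0 - a0 / (a0 - 2.0 * a1)
    return sigma, xi

# Check on synthetic excesses: GPD(sigma=1, xi=0.2) sampled by inverse transform
rng = np.random.default_rng(1)
u = rng.uniform(size=50_000)
y = (1.0 / 0.2) * ((1 - u) ** (-0.2) - 1)  # quantile function of GPD(1, 0.2)
sigma_hat, xi_hat = gpd_pwm(y)
```

Unlike maximum likelihood, this estimator involves no iterative optimization, which is what makes the method fast on large data sets (it is consistent for shape parameters xi < 0.5).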
Recent studies on extreme events have focused on potential changes of their intensity during the 20th century, but the evolution of their frequency has often been overlooked, although its socio-economic impact is equally important. This paper focuses on extreme high and low temperature events and the changes in their amplitude and frequency over the last 60 years in the North Atlantic (NA) region. We analyze the temporal evolution of the amplitude and frequency of extreme events through the parameters of an extreme value distribution applied to NCEP reanalysis data for the winter and summer seasons. We examine the relation of the statistics of extremes to greenhouse gas forcing and an atmospheric circulation index, and we obtain the spatial distribution of the trends of those extreme parameters. We find that the frequency of warm extremes increases over most of the NA, while their magnitude does not vary as systematically. Apart from the Labrador Sea and parts of Scandinavia, the features of winter cold extremes exhibit decreasing or no trends.
We present a conditional density model of river runoff given covariate information, which includes precipitation at four surrounding stations. The proposed model is nonparametric in the central part of the distribution and relies on parametric assumptions from extreme value theory for the upper tail. From the trained conditional density model, we can compute quantiles of various levels: the median can serve to simulate river runoff, the 5% and 95% quantiles can form a 90% confidence interval, and extreme quantiles can estimate the probability of large runoff. The conditional density model is based on a mixture of hybrid Paretos. The hybrid Pareto is built by stitching a truncated Gaussian to a generalized Pareto distribution. The mixture is made conditional by considering its parameters as functions of covariates; a neural network is used to implement those functions. A penalty term on the tail indices is added to the conditional log-likelihood to guide the maximum likelihood estimator toward preferred solutions. This alleviates the difficulties encountered with the maximum likelihood estimator of the tail index on small training sets. We evaluate the proposed model on rainfall-runoff data from the Orgeval basin in France. The effect of the tail penalty is further illustrated on synthetic data.
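The idea of combining a nonparametric body with a GPD tail can be conveyed by a deliberately simplified, unconditional splice (no mixture, no neural network, no covariates; the splice point and synthetic data are illustrative, not the paper's hybrid Pareto construction):

```python
import numpy as np

def fit_gpd_pwm(excesses):
    """Probability-weighted-moment fit of the generalized Pareto (sigma, xi)."""
    y = np.sort(excesses)
    n = len(y)
    a0 = y.mean()
    a1 = np.mean(y * (n - np.arange(1, n + 1)) / (n - 1))
    return 2 * a0 * a1 / (a0 - 2 * a1), 2 - a0 / (a0 - 2 * a1)

def spliced_quantile(sample, p, p_u=0.95):
    """Quantile from a spliced model: empirical CDF below the p_u quantile,
    GPD fitted to the threshold excesses above it."""
    u = np.quantile(sample, p_u)
    if p <= p_u:
        return np.quantile(sample, p)
    sigma, xi = fit_gpd_pwm(sample[sample > u] - u)
    # Tail quantile: solve (1 - p_u) * (1 + xi*(x - u)/sigma)^(-1/xi) = 1 - p
    return u + sigma / xi * (((1 - p) / (1 - p_u)) ** (-xi) - 1)

rng = np.random.default_rng(2)
runoff = rng.lognormal(mean=1.0, sigma=0.8, size=20000)  # synthetic "runoff"
q50, q99, q999 = (spliced_quantile(runoff, p) for p in (0.50, 0.99, 0.999))
```

Quantiles beyond the empirical range of the data (e.g. the 99.9% level with a short record) are the case where the parametric tail pays off: the empirical quantile saturates at the sample maximum, while the GPD tail extrapolates.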
Nonirrigated agriculture on the Iberian Peninsula is regularly affected by dry periods that can cause substantial losses. This paper compares the classical Standardized Precipitation Index (SPI) with a fragility index developed by the multivariate extreme value theory (MEVT) community, which is used to describe monthly precipitation deficits below 30.5 mm (about 1 mm/d) in the Spanish Duero basin. The multivariate extreme value model captures relevant information about the dependence structure among extreme precipitation deficits. Maps of those extremal dependence summaries and of the loadings of principal components of the SPI provide quantitative information for water management. In addition, jointly analyzing data from several stations improves the inference of uncertainty. Spatial patterns of extremal dependence emerged with respect to orographic features. The most severe dry spells occur in the southeast of the Duero basin. In the central plain of the Duero basin, a predominantly agricultural area, a strong fragility index for the severity of dry spells is found particularly in the eastern regions. The results of the MEVT and SPI analyses point in the same direction; beyond this, the MEVT assessment gives a quantitative measure of the dependence between stations and regions. Estimates of return periods for extreme dry spell severity are discussed. Deficits below 42.7 mm are also analyzed.
Key Points
Dependence of extreme droughts is affected by distance and topology, but not solely.
Maps of dependence of extreme events are valuable additional information.
Consideration of drought variability in space improves water management.
Accurate and rapid determination of near-surface wind fields in a complex area (orography, inhomogeneous surface properties) is a challenge for applications such as evaluating wind energy production, predicting pollution transport and hazardous conditions for aeronautics and ship navigation, or estimating damage to farm plantations, among others. This paper presents a statistical downscaling approach based on generalized additive models that provides accurate, rapid, and relatively transparent simulations of the local-scale near-surface wind field, calibrated on both large-scale upper-air and surface atmospheric fields. Our statistical method is used to downscale near-surface wind components to surface weather stations in southern France from ERA-40 reanalyses between 1991 and 2001. The region of interest is characterized by major mountain ranges that play a major role in redirecting large-scale circulations, which makes the prediction of local wind difficult. This study compares the performance of our statistical approach with different sets of explanatory variables for explaining near-surface wind field variability. The performances are interpreted by evaluating the contribution of the explanatory variables in the equations of motion. This approach generates accurate depictions of the local surface wind field and goes one step further in statistical wind downscaling: in contrast to past studies, it is adapted to explain wind components, not only wind speed and energy, and it is suited to complex terrain and robust to time averaging in this region.
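Generalized additive models of the kind used here are normally fitted with penalized splines; the additive structure itself can be illustrated with a toy backfitting loop using a crude binned-mean smoother (synthetic data, not the paper's implementation or predictors):

```python
import numpy as np

def binned_smooth(x, r, n_bins=20):
    """Crude univariate smoother: average partial residuals r within quantile bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    means = np.array([r[idx == b].mean() if np.any(idx == b) else 0.0
                      for b in range(n_bins)])
    return means[idx]

def backfit_gam(X, y, n_iter=20):
    """Minimal additive model y ~ alpha + sum_j f_j(x_j), fitted by backfitting."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))
    for _ in range(n_iter):
        for j in range(p):
            r = y - alpha - f.sum(axis=1) + f[:, j]  # partial residual for term j
            f[:, j] = binned_smooth(X[:, j], r)
            f[:, j] -= f[:, j].mean()                # center each f_j for identifiability
    return alpha, f

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(2000, 2))               # two stand-in predictors
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, 2000)
alpha, f = backfit_gam(X, y)
fitted = alpha + f.sum(axis=1)
```

The appeal for wind downscaling is the same as in the toy case: each large-scale predictor gets its own smooth, nonlinear term, and the fitted terms can be inspected one at a time, which is what makes the approach "relatively transparent."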
Different explanations have been proposed as to why the range of climate sensitivity predicted by GCMs has not narrowed substantially in recent decades, and whether it can be narrowed at all. One such study, Roe and Baker's "Why is climate sensitivity so unpredictable?", addressed these questions using rather simple theoretical considerations and concluded that reducing uncertainties on climate feedbacks and underlying climate processes will not yield a large reduction in the envelope of climate sensitivity. In this letter, we revisit the premises of this conclusion. We show that it results from a mathematical artifact caused by a peculiar definition of uncertainty used by these authors. Applying standard concepts and definitions of descriptive statistics to the exact same framework of analysis as Roe and Baker, we show that within this simple framework, reducing inter-model spread on feedbacks does in fact induce an almost proportional reduction of uncertainty on climate sensitivity. Therefore, under Roe and Baker's assumptions, climate sensitivity is actually not so unpredictable.
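The argument can be illustrated numerically. With the standard feedback relation S = S0 / (1 - f), sampling the feedback factor f and measuring the spread of S with an ordinary standard deviation shows the near-proportional reduction the letter describes (the parameter values below are illustrative, not taken from either paper):

```python
import numpy as np

rng = np.random.default_rng(42)
S0, mu_f = 1.2, 0.65  # illustrative reference sensitivity (K) and mean feedback factor

def sensitivity_spread(sigma_f, n=200_000):
    """Standard deviation of S = S0 / (1 - f) for normally distributed feedbacks f."""
    f = rng.normal(mu_f, sigma_f, size=n)
    f = f[f < 0.95]  # exclude the (unphysical) runaway regime f -> 1
    return np.std(S0 / (1 - f))

wide = sensitivity_spread(0.13)     # "large" inter-model feedback spread
narrow = sensitivity_spread(0.065)  # feedback spread halved
```

Halving the feedback spread substantially reduces the standard deviation of S; the skewed upper tail of S (inherited from the 1/(1 - f) nonlinearity) shrinks along with it, which is the point at issue with Roe and Baker's envelope-based definition of uncertainty.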
The climate system is continuously affected by forcings that add to its inherent variability. Recently, the dominant influence shifted from mostly natural factors to the rapidly increasing anthropogenic greenhouse gas and aerosol forcing. Climate change simulations for the 21st and 22nd centuries then employ possible story lines of human socio-economic development with associated radiative forcing that exclusively explore the potential human influence on climate. None of the scenarios, however, include natural factors that dominated climate variations prior to the large anthropogenic emissions. This leads to a discontinuity at the transition between the historical and the future projection period. Similarly, studies of transient climate variations before the last 1–2 millennia generally use only the well-known, slowly varying forcings such as orbital or greenhouse forcing derived from ice cores. While past solar irradiance variations can be reasonably estimated from cosmogenic isotope data, no well-dated, high-resolution information exists before about A.D. 500 that would allow for an implementation of forcing from explosive volcanism. Here, we present a statistical approach to generate statistically (and geophysically) realistic scenarios of volcanic forcing that are based on the properties of the longest available volcanic forcing series derived from ice cores. The resulting scenarios do not carry direct temporally predictive or hindcast capabilities, but they allow for an appropriate evaluation of natural uncertainty on various timescales. These series can be applied to ensure a seamless integration of an important natural forcing factor for climate change simulations of periods where such forcing is not available.
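The abstract does not detail the generator, but the flavor of such stochastic scenarios can be conveyed by a deliberately simple stand-in: Poisson event timing, peak magnitudes resampled from a catalogue, and exponential decay of each pulse. All choices below are assumptions for illustration, not the paper's method, and the catalogue values are hypothetical:

```python
import numpy as np

def synthetic_forcing(magnitudes, rate_per_year, n_years, decay_years=1.0, seed=0):
    """Toy volcanic-forcing scenario: Poisson event times, magnitudes resampled
    from a catalogue, exponential decay of each pulse (illustration only)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years, dtype=float)
    forcing = np.zeros(n_years)
    n_events = rng.poisson(rate_per_year * n_years)
    times = rng.uniform(0.0, n_years, size=n_events)
    mags = rng.choice(magnitudes, size=n_events)
    for t0, m in zip(times, mags):
        after = t >= t0
        forcing[after] += m * np.exp(-(t[after] - t0) / decay_years)
    return forcing

# Hypothetical catalogue of peak forcings (W/m^2; negative values = cooling)
series = synthetic_forcing(np.array([-0.5, -1.0, -3.0, -6.0]),
                           rate_per_year=0.05, n_years=1000)
```

Different seeds yield different but statistically equivalent series, which is exactly the property the abstract emphasizes: no predictive skill for individual eruptions, but a realistic envelope of natural variability.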
To simulate multivariate daily time series (minimum and maximum temperatures, global radiation, wind speed, and precipitation intensity), we propose a weather-state approach with a multivariate closed skew-normal generator, WACS-Gen, that accurately reproduces the statistical properties of these five variables. Our weather generator construction takes advantage of two elements. First, we extend the classical wet/dry-day dichotomy used in most past weather generators to the definition of multiple weather states using clustering techniques; the transitions among weather states are modeled by a first-order Markov chain. Second, the vector of our five daily variables of interest is sampled, conditionally on these weather states, from a closed skew-normal distribution, a class of distributions that can handle nonsymmetric behaviors. Our method is applied to 20 years of daily weather measurements from Colmar, France. This example illustrates the advantages of our approach, especially in improving the simulation of radiation and wind distributions.
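The first of the two elements, a first-order Markov chain over weather states, is easy to sketch. The closed skew-normal sampling step is omitted here, and the two-state example is synthetic:

```python
import numpy as np

def fit_transitions(states, n_states):
    """Estimate a first-order Markov transition matrix from a state sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)  # normalize each row

def simulate_chain(P, n_days, start=0, seed=0):
    """Simulate a daily weather-state sequence from transition matrix P."""
    rng = np.random.default_rng(seed)
    s = [start]
    for _ in range(n_days - 1):
        s.append(rng.choice(len(P), p=P[s[-1]]))
    return np.array(s)

# Toy example: two persistent weather states (e.g., "dry" and "wet")
P_true = np.array([[0.8, 0.2],
                   [0.3, 0.7]])
states = simulate_chain(P_true, 5000)
P_hat = fit_transitions(states, 2)
```

In the full generator, the daily vector of the five variables would then be drawn, conditionally on the simulated state of each day, from the state-specific closed skew-normal distribution.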