Downscaling precipitation is a difficult challenge for the climate community. We propose and study a new stochastic weather typing approach to perform such a task. In addition to providing accurate small and medium precipitation, our procedure possesses built-in features that allow us to model extreme precipitation distributions adequately. First, we propose a new distribution for local precipitation via a probability mixture model of Gamma and Generalized Pareto (GP) distributions. The latter stems from Extreme Value Theory (EVT). The performance of this mixture is tested on real and simulated data, and also compared to classical rainfall densities. Then our downscaling method, extending the recently developed nonhomogeneous stochastic weather typing approach, is presented. It can be summarized as a three-step program. First, regional precipitation patterns are constructed through a hierarchical ascending clustering method. Second, daily transitions among these precipitation patterns are represented by a nonhomogeneous Markov model driven by large-scale atmospheric variables from NCEP reanalyses. Third, conditionally on these regional patterns, precipitation occurrence and intensity distributions are modeled as statistical mixtures. Precipitation amplitudes are assumed to follow our mixture of Gamma and GP densities. The proposed downscaling approach is applied to 37 weather stations in Illinois and compared to various possible parameterizations and to direct modeling. Model selection procedures show that choosing one GP distribution shape parameter per pattern for all stations provides the best rainfall representation among all tested models. This work highlights the importance of EVT distributions in improving the modeling and downscaling of local extreme precipitation.
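To fix ideas, here is a minimal sketch of a Gamma/GP probability mixture for daily rainfall amounts, assuming a constant mixing weight; all function names, parameter names, and values are illustrative, not the paper's parameterization (which is richer).

```python
import numpy as np
from scipy.stats import gamma, genpareto

def gamma_gp_density(x, w, shape_g, scale_g, xi, u, scale_gp):
    """Two-component rainfall density: a Gamma bulk for small and medium
    amounts, plus a GP tail for excesses above a threshold u.
    A constant weight w is a simplification of the paper's mixture."""
    bulk = gamma.pdf(x, shape_g, scale=scale_g)
    tail = np.where(x > u, genpareto.pdf(x - u, xi, scale=scale_gp), 0.0)
    return (1.0 - w) * bulk + w * tail

# illustrative parameter values, not fitted ones
x = np.linspace(0.1, 60.0, 500)
f = gamma_gp_density(x, w=0.1, shape_g=0.9, scale_g=4.0,
                     xi=0.2, u=15.0, scale_gp=7.0)
```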
We describe a new approach that allows for systematic causal attribution of weather and climate-related events, in near-real time. The method is designed so as to facilitate its implementation at meteorological centers by relying on data and methods that are routinely available when numerically forecasting the weather. We thus show that causal attribution can be obtained as a by-product of data assimilation procedures run on a daily basis to update numerical weather prediction (NWP) models with new atmospheric observations; hence, the proposed methodology can take advantage of the powerful computational and observational capacity of weather forecasting centers. We explain the theoretical rationale of this approach and sketch the most prominent features of a “data assimilation–based detection and attribution” (DADA) procedure. The proposal is illustrated in the context of the classical three-variable Lorenz model with additional forcing. The paper concludes by raising several theoretical and practical questions that need to be addressed to make the proposal operational within NWP centers.
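A hedged sketch of the core computation, assuming a simple Gaussian filtering setting: the evidence of the observations under each world is accumulated from the innovation likelihoods of the assimilation run, and the two evidences are compared. Names and the simplified setting are illustrative, not the DADA implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_evidence(innovations, innovation_covs):
    """Log marginal likelihood of the observations from a filtering run,
    summed over Gaussian innovation likelihoods p(y_k | y_1, ..., y_{k-1});
    innovations are y_k - H x_k^f, covariances are H P_k^f H^T + R."""
    return sum(multivariate_normal.logpdf(d, mean=np.zeros(len(d)), cov=S)
               for d, S in zip(innovations, innovation_covs))

# Run the same assimilation twice: with the forcing of interest (factual
# world) and without it (counterfactual). A positive log Bayes factor
# favors the factual model for the observed event:
#   log_bf = log_evidence(innov_f, covs_f) - log_evidence(innov_cf, covs_cf)
```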
The French Mediterranean is subject to intense precipitation events occurring mostly in autumn. These can potentially cause flash floods, the main natural danger in the area. The distribution of these events follows specific spatial patterns, i.e., some sites are more likely to be affected than others. The peaks‐over‐threshold approach consists of modeling extremes, such as heavy precipitation, with the generalized Pareto (GP) distribution. The shape parameter of the GP controls the probability of extreme events and can be related to the hazard level of a given site. When interpolating across a region, the shape parameter should reproduce the observed spatial patterns of the probability of heavy precipitation. However, shape parameter estimators have high uncertainty, which might hide the underlying spatial variability. As a compromise, we let the shape parameter vary in a moderate fashion. More precisely, we assume that the region of interest can be partitioned into subregions with constant hazard level. We formalize the model as a conditional mixture of GP distributions. We develop a two‐step inference strategy based on probability weighted moments (a minimal sketch of this step is given after the key points below) and put forward a cross‐validation procedure to select the number of subregions. A synthetic data study reveals that the inference strategy is consistent and not very sensitive to the selected number of subregions. An application to daily precipitation data from the French Mediterranean shows that the conditional mixture of GPs outperforms two interpolation approaches (with constant or smoothly varying shape parameter).
Key Points
Regional peaks‐over‐threshold formalized as a conditional mixture model
Inference strategy based on probability weighted moments and nonparametric estimators
Selection of the number of subregions with a cross‐validation procedure
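As an illustration of the probability weighted moments step mentioned above, here is a minimal sketch for a single GP component fitted to threshold excesses, using the standard Hosking and Wallis (1987) estimators; the paper's two-step regional inference over the mixture is more involved.

```python
import numpy as np

def gpd_pwm(excesses):
    """PWM estimates of the GP scale sigma and shape xi from threshold
    excesses, for the parameterization F(x) = 1 - (1 + xi*x/sigma)^(-1/xi)."""
    x = np.sort(np.asarray(excesses))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.35) / n          # plotting positions
    a0 = x.mean()                                  # estimate of E[X]
    a1 = np.mean(x * (1.0 - p))                    # estimate of E[X (1 - F(X))]
    xi = (a0 - 4.0 * a1) / (a0 - 2.0 * a1)
    sigma = 2.0 * a0 * a1 / (a0 - 2.0 * a1)
    return sigma, xi
```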
Analogs are nearest neighbors of the state of a system. By using analogs and their successors in time, one is able to produce empirical forecasts. Several analog forecasting methods have been used in atmospheric applications and tested on well-known dynamical systems. Such methods are often used without reference to theoretical connections with dynamical systems. Yet, analog forecasting can be related to the dynamical equations of the system of interest. This study investigates the properties of different analog forecasting strategies by taking local approximations of the system’s dynamics. We find that analog forecasting performance is strongly linked to the local Jacobian matrix of the flow map, and that analog forecasting combined with linear regression allows us to capture projections of this Jacobian matrix. Additionally, the proposed methodology allows us to efficiently estimate analog forecasting errors, an important component in many applications. Carrying out this analysis also makes it possible to compare different analog forecasting operators, helping us to choose the operator best suited to a given situation. These results are derived analytically and tested numerically on two simple chaotic dynamical systems. The impact of observational noise and of the number of analogs is evaluated theoretically and numerically.
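A minimal numpy sketch of one operator discussed in this context, the locally linear analog forecast: the affine map fitted on the k nearest analogs and their successors approximates the local Jacobian of the flow map. Function and variable names are illustrative.

```python
import numpy as np

def analog_forecast(x, catalog, successors, k=20):
    """Locally linear analog forecast of the state following x.
    catalog: (N, p) past states; successors: (N, p) their next states.
    The fitted linear part approximates the flow map's local Jacobian."""
    d = np.linalg.norm(catalog - x, axis=1)
    idx = np.argsort(d)[:k]                          # k nearest analogs
    A = np.hstack([catalog[idx], np.ones((k, 1))])   # affine regression
    coef, *_ = np.linalg.lstsq(A, successors[idx], rcond=None)
    return np.append(x, 1.0) @ coef                  # forecast at x
```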
The emergence of clear semantics for causal claims and of a sound logic for causal reasoning is relatively recent, with the consolidation over the past decades of a coherent theoretical corpus of definitions, concepts, and methods of general applicability that is anchored in counterfactuals. This corpus has proved to be of high practical interest in numerous applied fields (e.g., epidemiology, economics, and social science). In spite of their rather consensual nature and proven efficacy, these definitions and methods are to a large extent not used in detection and attribution (D&A). This article gives a brief overview of the main concepts underpinning the causal theory and proposes some methodological extensions, rooted in that theory, for the causal attribution of weather and climate-related events. Implications for the formulation of causal claims and their uncertainty are finally discussed.
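For reference, the central counterfactual quantities of this corpus, stated here under the standard monotonicity and exogeneity assumptions, with p1 the factual and p0 the counterfactual probability of the event; PN coincides with the fraction of attributable risk (FAR) familiar in D&A.

```latex
% Probabilities of necessary (PN), sufficient (PS), and
% necessary-and-sufficient (PNS) causation:
\mathrm{PN}  = \max\!\left(0,\; 1 - \frac{p_0}{p_1}\right), \qquad
\mathrm{PS}  = \max\!\left(0,\; \frac{p_1 - p_0}{1 - p_0}\right), \qquad
\mathrm{PNS} = \max\!\left(0,\; p_1 - p_0\right)
```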
We review work on extreme events, their causes and consequences, by a group of European and American researchers involved in a three-year project on these topics. The review covers theoretical aspects of time series analysis and of extreme value theory, as well as of the deterministic modeling of extreme events, via continuous and discrete dynamic models. The applications include climatic, seismic and socio-economic events, along with their prediction. Two important results refer to (i) the complementarity of spectral analysis of a time series in terms of the continuous and the discrete part of its power spectrum; and (ii) the need for coupled modeling of natural and socio-economic systems. Both these results have implications for the study and prediction of natural hazards and their human impacts.
Some properties of chaotic dynamical systems can be probed through features of recurrences, also called analogs. In practice, analogs are nearest neighbors of the state of a system, taken from a large database called the catalog. Analogs have been used in many atmospheric applications including forecasts, downscaling, predictability estimation, and attribution of extreme events. The distances of the analogs to the target state usually condition the performance of analog applications. These distances can be viewed as random variables, and their probability distributions can be related to the catalog size and properties of the system at stake. A few studies have focused on the first moments of return-time statistics for the closest analog, fixing an objective of maximum distance from this analog to the target state. However, for practical use and to reduce estimation variance, applications usually require not just one but many analogs. In this paper, we evaluate, from a theoretical standpoint and with numerical experiments, the probability distributions of the K shortest analog-to-target distances. We show that dimensionality plays a role in the size of the catalog needed to find good analogs, and also in the relative means and variances of the K closest analogs. Our results are based on recently developed tools from dynamical systems theory. These findings are illustrated with numerical simulations of well-known chaotic dynamical systems and on 10-m wind reanalysis data in northwest France. Practical applications of our derivations are shown for forecasts of an idealized chaotic dynamical system and for objective-based dimension reduction using the 10-m wind reanalysis data.
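A toy numerical illustration of the dimensionality effect, assuming uniform i.i.d. points in the unit hypercube rather than states of a dynamical system: the mean distance to the K-th nearest analog scales roughly like (K/n)^(1/dim), so doubling the catalog shrinks distances by only a factor 2^(-1/dim).

```python
import numpy as np

def mean_kth_distance(n_catalog, k, dim, trials=200, seed=None):
    """Monte Carlo estimate of the mean distance from a random target
    point to its k-th nearest neighbor among n_catalog uniform points
    in [0, 1]^dim."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(trials):
        pts = rng.random((n_catalog, dim))
        d = np.linalg.norm(pts - rng.random(dim), axis=1)
        out.append(np.sort(d)[k - 1])
    return np.mean(out)

# e.g. compare mean_kth_distance(10_000, 10, 3) with dim=10: good
# analogs become much scarcer as the dimension grows
```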
Stratospheric sulfate injections from explosive volcanic eruptions are a primary natural climate forcing. Improved statistical models can now capture and simulate dynamical relationships in temporal variations of binary data. Leveraging these new techniques, the presented analysis clearly indicates that the number of large eruptions in the most recent records of explosive volcanism cannot be considered fully random. Including dynamical dependence in our models improves their ability to reproduce the historical record and thus forms a strong basis for skill in statistical prediction (a minimal model sketch is given after the key points below). Possible geophysical mechanisms behind the identified multidecadal variations are discussed, including variations in the observed length of day.
Key Points
Large explosive eruptions influencing the climate appear not to occur randomly over time
Potential driving geophysical mechanisms are discussed, notably length‐of‐day variations
A first building block for stochastically simulating volcanic forcing series that could be included in future climate predictions
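A hedged sketch of one simple way to encode dynamical dependence in a binary eruption-occurrence record: an autoregressive logistic model on lagged occurrences. This is an illustrative stand-in, not necessarily the class of models used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_autologistic(y, lags=5):
    """Fit P(y_t = 1 | y_{t-1}, ..., y_{t-lags}) by logistic regression
    on a binary (0/1) occurrence series. Clearly nonzero lag coefficients
    indicate the record is inconsistent with fully random (Bernoulli)
    eruption timing."""
    y = np.asarray(y)
    X = np.column_stack([y[lags - j: len(y) - j] for j in range(1, lags + 1)])
    return LogisticRegression().fit(X, y[lags:])
```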
Europe witnessed unprecedented warmth persisting throughout fall and winter 2006–2007, with only a few cold breaks. Whether this anomaly and recent warming in Europe can be linked to changes in atmospheric dynamics is a key question in the climate change perspective. We show that although the fall/winter atmospheric flow was favorable to warmth, it alone cannot explain such an exceptional anomaly. Observed temperatures remained well above those found for analogue atmospheric circulations in other fall and winter seasons. Such an offset is also found during the last decade and culminates in 2006/2007. These observational results suggest that the main drivers of recent European warming are not changes in regional atmospheric flow and weather regime frequencies, in contrast with observed changes before 1994.
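A minimal sketch of the flow-analogue comparison behind this result: the observed temperature of a target day is compared with the temperatures of days from other years whose large-scale circulation (here a flattened sea-level-pressure field) best matches the target day's. All names are illustrative.

```python
import numpy as np

def analog_temperature_offset(slp_target, temp_target,
                              slp_catalog, temp_catalog, k=10):
    """Offset between an observed temperature and the mean temperature of
    the k days with the most similar circulation pattern. A persistent
    positive offset suggests warmth beyond what the flow alone explains."""
    d = np.linalg.norm(slp_catalog - slp_target, axis=1)
    idx = np.argsort(d)[:k]                  # k best circulation analogs
    return temp_target - temp_catalog[idx].mean()
```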
Reanalysis data and general circulation model outputs typically provide information at a coarse spatial resolution, which cannot directly be used for local impact studies. Downscaling methods have been developed to overcome this problem and to obtain local‐scale information from large‐scale atmospheric variables. Deducing local‐scale extremes, however, remains a challenge. Here a probabilistic downscaling approach is presented in which the cumulative distribution functions (CDFs) of large‐ and local‐scale extremes are linked by means of a transfer function. In this way, the CDF of the local‐scale extremes is obtained for a projection period, and statistical characteristics, like return levels, are inferred. The input series are assumed to follow an extreme value distribution, the Generalized Pareto distribution (GPD). The GPD parameters are linked to further explanatory variables, hence defining a nonstationary model. The methodology (XCDF‐t) results in a parametric CDF, which is itself a GPD. Realizations generated from this CDF provide confidence bands. The approach is applied to downscale National Centers for Environmental Prediction reanalysis precipitation in winter. Daily local precipitation at five stations in southern France is obtained. The calibration period 1951–1985 is used to infer precipitation over the validation period 1986–1999. The applicability of the approach is verified by using observations, quantile‐quantile plots, and the continuous ranked probability score. The stationary XCDF‐t approach shows good results and outperforms the nonparametric CDF‐t approach and quantile mapping for some stations. The inclusion of covariate information improves results only sometimes; therefore, covariates have to be chosen with care.
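A minimal sketch of the CDF transfer at the heart of CDF‐t‐type methods, with frozen scipy GPD objects standing in for the fitted distributions; the actual XCDF‐t model is nonstationary, with parameters linked to covariates, and all parameter values below are illustrative.

```python
from scipy.stats import genpareto

def cdf_transfer(x, F_large_proj, F_large_calib, F_local_calib):
    """CDF-t relation: the local-scale projection-period CDF at x is
    F_local,calib(F_large,calib^{-1}(F_large,proj(x))).
    Each argument is a frozen scipy distribution."""
    return F_local_calib.cdf(F_large_calib.ppf(F_large_proj.cdf(x)))

# illustrative frozen GPDs, not fitted parameters
F_lp = cdf_transfer(25.0,
                    genpareto(0.15, loc=10.0, scale=6.0),   # large scale, projection
                    genpareto(0.10, loc=9.0, scale=5.0),    # large scale, calibration
                    genpareto(0.25, loc=12.0, scale=8.0))   # local scale, calibration
```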