The spatial coherence of a measured variable (e.g. temperature or pressure) is often studied to determine regions of high variability or to find teleconnections, i.e. correlations between specific regions. While the usual methods for finding spatial patterns, such as Principal Component Analysis (PCA), are constrained by linear symmetries, the dependence of variables such as temperature or pressure at different locations is generally nonlinear. In particular, large deviations from the sample mean are expected to be strongly affected by such nonlinearities. Here we apply a newly developed nonlinear technique (Maxima of Cumulant Function, MCF) for the detection of typical spatial patterns that deviate strongly from the mean. In order to test the technique and to introduce the methodology, we focus on the El Niño/Southern Oscillation and its spatial patterns. We find nonsymmetric temperature patterns corresponding to El Niño and La Niña, and we compare the results of MCF with other techniques, such as the symmetric solutions of PCA and the nonsymmetric solutions of Nonlinear PCA (NLPCA). We find that the MCF solutions are more reliable than the NLPCA fits and can capture mixtures of principal components. Finally, we apply Extreme Value Theory to the temporal variations extracted by our methodology. We find that the tail of the distribution of extreme temperatures during La Niña episodes is bounded, while the tail during El Niño episodes is less likely to be bounded. This implies that the mean spatial patterns of the two phases are asymmetric, as is the behaviour of their extremes.
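The linear-symmetry constraint of PCA mentioned in this abstract can be seen in a minimal sketch: PCA (here via SVD) recovers a spatial mode only up to an overall sign, so positive and negative phases are forced to be mirror images even when the driving amplitudes are skewed. All data below are hypothetical synthetic values, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "temperature" field: 500 time steps x 40 grid points, driven by
# one fixed spatial pattern with a skewed (asymmetric) amplitude time series.
n_t, n_x = 500, 40
pattern = np.sin(np.linspace(0.0, np.pi, n_x))          # spatial mode
amp = rng.gamma(shape=2.0, scale=1.0, size=n_t) - 2.0   # skewed amplitudes
field = np.outer(amp, pattern) + 0.1 * rng.standard_normal((n_t, n_x))

# PCA via SVD of the centered data matrix (EOF analysis).
anom = field - field.mean(axis=0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
eof1 = vt[0]                  # leading spatial pattern, unit norm
pc1 = anom @ eof1             # its principal component time series

# PCA recovers the pattern only up to an overall sign: +eof1 and -eof1 are
# equivalent solutions, so the method alone cannot represent asymmetric
# positive/negative phases; the skewness survives only in the PC time series.
corr = abs(np.corrcoef(eof1, pattern)[0, 1])
```

The leading EOF matches the planted pattern almost exactly, but carries no information about the asymmetry between the two phases.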
A heatwave struck Northern Europe in summer 2018. Human-induced climate change increased the probability of this event, primarily through thermodynamic changes.
We model the pairwise dependence of temporal maxima, such as annual maxima of precipitation, that have been recorded in space, either on a regular grid or at irregularly spaced locations. The construction of our estimators stems from the variogram concept. The asymptotic properties of our pairwise dependence estimators are established through properties of empirical processes. The performance of our approach is illustrated by simulations and by the treatment of a real dataset. In addition to bringing new results about the asymptotic behaviour of copula estimators, the latter being linked to first-order variograms, one main advantage of our approach is that it provides a simple connection between extreme value theory and geostatistics.
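The first-order variogram idea behind such pairwise dependence estimators can be sketched with a madogram-style empirical estimator on entirely hypothetical data (this is the generic construction, not the paper's exact estimator):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual maxima at irregularly spaced 1-D sites: nearby sites
# share a smooth spatial signal, so dependence should decay with distance.
n_sites, n_years = 30, 80
coords = np.sort(rng.uniform(0.0, 10.0, n_sites))
data = np.sin(coords) + 0.3 * rng.standard_normal((n_years, n_sites))

# Madogram-style empirical first-order variogram: half the mean absolute
# difference between the records of each pair of sites, indexed by distance.
dists, gammas = [], []
for i in range(n_sites):
    for j in range(i + 1, n_sites):
        dists.append(abs(coords[i] - coords[j]))
        gammas.append(0.5 * float(np.mean(np.abs(data[:, i] - data[:, j]))))
dists, gammas = np.array(dists), np.array(gammas)

# Nearby pairs should show smaller variogram values than distant pairs.
near = gammas[dists < 1.0].mean()
far = gammas[dists > 3.0].mean()
```

Plotting `gammas` against `dists` gives the usual empirical variogram cloud, whose increase with distance summarizes the decay of spatial dependence.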
Statistics of extremes in hydrology. Katz, Richard W.; Parlange, Marc B.; Naveau, Philippe. Advances in Water Resources, 08/2002, Volume 25, Issue 8. Journal Article. Peer-reviewed. Open access.
The statistics of extremes have played an important role in engineering practice for water resources design and management. The central theme of this paper is how recent developments in the statistical theory of extreme values can be applied to improve the rigor of hydrologic applications and to make such analyses more physically meaningful. Such methodological developments primarily relate to maximum likelihood estimation in the presence of covariates, in combination with either the block maxima or peaks-over-threshold approach. Topics treated include trends in hydrologic extremes, in view of the anticipated intensification of the hydrologic cycle as part of global climate change. In an attempt to link downscaling (i.e., relating large-scale atmosphere–ocean circulation to smaller-scale hydrologic variables) with the statistics of extremes, statistical downscaling of hydrologic extremes is considered. Future challenges are reviewed, such as the development of more rigorous statistical methodology for regional analysis of extremes, as well as the extension of Bayesian methods to more fully quantify uncertainty in extremal estimation. Examples include precipitation and streamflow extremes, as well as economic damage associated with such extreme events, with consideration of trends and dependence on patterns in atmosphere–ocean circulation (e.g., the El Niño phenomenon).
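The block maxima approach named in this abstract can be sketched in a few lines: take one maximum per block (here, per year) and fit a Generalized Extreme Value (GEV) distribution by maximum likelihood. The record length and parameters below are illustrative only, and SciPy's GEV is used for the fit:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Hypothetical daily record: 200 years x 365 days of Gumbel-distributed values.
daily = rng.gumbel(loc=10.0, scale=2.0, size=(200, 365))
annual_max = daily.max(axis=1)          # block maxima: one value per year

# Fit a GEV by maximum likelihood (note: scipy's c equals minus the usual
# extreme-value shape parameter xi).
c, loc, scale = genextreme.fit(annual_max)

# 100-year return level: the quantile exceeded on average once per 100 years.
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
```

Covariate-dependent (e.g. trend) versions replace the constant `loc` and `scale` with regression functions inside the likelihood, which is the extension the paper emphasizes.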
For a wide range of applications in hydrology and climate studies, the return level is a fundamental quantity used to build dykes, propose flood planning policies, and study weather and climate phenomena linked to the behavior of the upper tail of a distribution. More precisely, z_t is called the return level associated with a given return period t if the level z_t is expected to be exceeded on average once every t years. To estimate this level in the independent and identically distributed setting, Extreme Value Theory (EVT) has classically been used by assuming that exceedances above a high threshold approximately follow a Generalized Pareto distribution (GPD). This approximation is based on an asymptotic argument, but the rate of convergence may be slow in some cases, e.g., for Gaussian variables, and the choice of an appropriate threshold can be difficult. As an alternative, we propose and study simple estimators of lower and upper return level bounds. This approach has several advantages. It works for both small and moderate sample sizes and for discrete and continuous random variables. In addition, no threshold selection procedure is needed. Still, there is a clear link with EVT because our bounds can be viewed as extensions of the concept of probability weighted moments, which has classically been used in hydrology. In particular, some moment conditions have to be satisfied in order to derive the asymptotic properties of our estimators. We apply our methodology to a few simulations and to two climate data sets.
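The probability weighted moments that this abstract builds on are simple functionals of the order statistics. A small sketch with hypothetical exponential "exceedance" data, using the classical unbiased sample PWM estimator (not necessarily the paper's exact construction):

```python
import numpy as np

def sample_pwms(x, r_max=2):
    """Unbiased sample probability weighted moments b_r = E[X F(X)^r],
    r = 0..r_max, computed from the order statistics of x."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    out = []
    for r in range(r_max + 1):
        w = np.ones(n)
        for k in range(r):
            w *= (i - 1.0 - k) / (n - 1.0 - k)
        out.append(float(np.mean(w * x)))
    return out

# Hypothetical exceedance sample; for an Exponential(scale=s) population,
# b_0 = s and b_1 = 3s/4, so with s = 2 we expect b_0 near 2 and b_1 near 1.5.
rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=5000)
b0, b1, b2 = sample_pwms(x)
```

In the classical Hosking-style use of PWMs, GPD parameters are then recovered by matching b_0 and b_1; the bounds proposed in the abstract extend this moment-based route without requiring a threshold.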
The abiotic transformation of catechol and 1-naphthol singly and in mixtures was tested in sterile Tris-HCl buffer with regard to several environmental factors including temperature (7°C, 20°C and 30°C), lighting conditions, pH (between 7.0 and 8.5) and dissolved oxygen (at partial pressures of 0.0, 220, 2200, 11000 and 22000 Pa).
Irrespective of lighting conditions, catechol autoxidation was confirmed in aerated medium with a rate independent of the presence of 1-naphthol but proportional to the dissolved oxygen concentration, to the pH (its half-disappearance occurred in 24 h at pH 8.5) and, to a lesser extent, to the incubation temperature (at 20°C, 20% disappeared in 10 days at pH 7.0). Under alkaline conditions, the reaction of the anionic form (catecholate) with an equimolar concentration of molecular oxygen (O₂) presumably led to the hydrogen peroxide anion (HO₂⁻) and coloured polymerization products.
When tested alone, 1-naphthol was not significantly influenced by lighting conditions, incubation temperature or dissolved oxygen concentration. It was also found to be quite stable with respect to pH, with a transformation rate 15-fold weaker than that of catechol at the highest pH used.
When tested in a mixture with catechol, 1-naphthol was found to be involved in a new chemical oxidation reaction catalyzed by catecholate. The transformation of one mole of 1-naphthol consumes four moles of oxygen. In the presence of catechol, the stoichiometry of the 1-naphthol transformation under the influence of oxygen suggests the possible formation of 2,5,6,8-tetrahydroxy-1,4-naphthoquinone via lawsone (2-hydroxy-1,4-naphthoquinone) and naphthopurpurine (2,5,8-trihydroxy-1,4-naphthoquinone) as hypothetical intermediates. This is the first report of the autoxidation of 1-naphthol, catalyzed by catechol, in aqueous solution in the absence of UV irradiation.
In this study, the authors propose a method for recovering a two-dimensional surface (image) from noisy observations containing significant jumps and discontinuities. The proposed procedure, termed the segmented polynomial wavelet regression (SPWR) algorithm, combines wavelet regression with polynomial extrapolation and segmentation procedures. The bias that might occur on the discontinuous surface is alleviated through a segmentation process, and at the same time, inhomogeneous multi-scale features of the surface are efficiently treated by wavelet regression. The SPWR algorithm enjoys all the benefits of wavelet regression while implicitly detecting the discontinuities, and it is fast and easy to implement. A simulation study demonstrates that the proposed method produces highly effective results.
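As a one-dimensional illustration of the wavelet-regression ingredient only (not the SPWR algorithm itself, which adds segmentation and polynomial extrapolation), a minimal Haar-wavelet thresholding sketch on a signal with a jump:

```python
import numpy as np

def haar_fwd(x):
    """One level of the orthonormal Haar transform (length must be even)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth (approximation) part
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail part
    return s, d

def haar_inv(s, d):
    """Invert one level of the orthonormal Haar transform."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

def denoise(x, levels=4, thresh=0.5):
    """Multilevel Haar regression: hard-threshold small detail coefficients."""
    details, s = [], x
    for _ in range(levels):
        s, d = haar_fwd(s)
        details.append(np.where(np.abs(d) > thresh, d, 0.0))
    for d in reversed(details):
        s = haar_inv(s, d)
    return s

rng = np.random.default_rng(4)
n = 256
truth = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # a jump discontinuity
noisy = truth + 0.1 * rng.standard_normal(n)
fit = denoise(noisy)
```

Because wavelets are localized, thresholding removes noise while large detail coefficients near a discontinuity survive; SPWR's segmentation step further suppresses the bias that plain thresholding leaves around such jumps.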
How will our estimates of climate uncertainty evolve in the coming years, as new learning is acquired and climate research makes further progress? As a tentative contribution to this question, we argue here that the future path of climate uncertainty may itself be quite uncertain, and that our uncertainty is actually prone to increase even as we learn more about the climate system. We term this somewhat counter-intuitive process, in which improved knowledge generates higher uncertainty, "disconcerting learning". After recalling some definitions, this concept is connected with the related concept of "negative learning" introduced earlier by Oppenheimer et al. (Clim Change 89:155–172, 2008). We illustrate disconcerting learning with several real-life examples and characterize mathematically certain general conditions for its occurrence. We show next that these conditions are met in the current state of our knowledge of climate sensitivity, and illustrate this situation using an energy balance model of the climate. We finally discuss the implications of these results for the development of adaptation and mitigation policy.
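A toy numerical illustration of how an informative Bayesian update can nonetheless increase spread: two hypothetical "climate sensitivity" candidates with illustrative prior weights (none of these numbers come from the paper, and this is not its energy balance model):

```python
import numpy as np

# Two hypothetical equilibrium-sensitivity candidates (degC) and a prior
# strongly favouring the first; all numbers are illustrative only.
values = np.array([2.0, 6.0])
prior = np.array([0.9, 0.1])

def moments(w):
    """Mean and variance of the discrete distribution with weights w."""
    mean = float(np.sum(w * values))
    var = float(np.sum(w * (values - mean) ** 2))
    return mean, var

# An observation whose likelihood is 9 times higher under the second model
# pushes the posterior to an even split between the two candidates.
likelihood = np.array([1.0, 9.0])
post = prior * likelihood
post /= post.sum()

m_prior, v_prior = moments(prior)   # mean 2.4, variance 1.44
m_post, v_post = moments(post)      # mean 4.0, variance 4.0
# Learning occurred (weights moved toward the data), yet the variance grew:
# this is the kind of condition the paper characterizes in general.
```

When prior mass is concentrated on one model and the data favour a rival model, the posterior spreads over both, so honest updating can widen, not narrow, the uncertainty range.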