The fractal or Hausdorff dimension is a measure of roughness (or smoothness) for time series and spatial data. The graph of a smooth, differentiable surface indexed in ℝ^d has topological and fractal dimension d. If the surface is nondifferentiable and rough, the fractal dimension takes values between the topological dimension, d, and d + 1. We review and assess estimators of fractal dimension by their large sample behavior under infill asymptotics, in extensive finite sample simulation studies, and in a data example on arctic sea-ice profiles. For time series or line transect data, box-count, Hall–Wood, semi-periodogram, discrete cosine transform and wavelet estimators are studied along with variation estimators with power indices 2 (variogram) and 1 (madogram), all implemented in the R package fractaldim. Considering both efficiency and robustness, we recommend the use of the madogram estimator, which can be interpreted as a statistically more efficient version of the Hall–Wood estimator. For two-dimensional lattice data, we propose robust transect estimators that use the median of variation estimates along rows and columns. Generally, the link between power variations of index p > 0 for stochastic processes, and the Hausdorff dimension of their sample paths, appears to be particularly robust and inclusive when p = 1.
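The power-variation idea behind the recommended madogram estimator is easy to sketch numerically. The following is a minimal illustration of the method, not the fractaldim package's implementation; the function name, lag range, and Brownian-motion sanity check are assumptions made for this sketch.

```python
import numpy as np

def variation_dim(x, p=1.0, max_lag=4):
    """Fractal dimension of a line-transect series via power variations.

    p = 1 gives the madogram estimator, p = 2 the variogram estimator.
    (Hypothetical helper; not the fractaldim package's implementation.)
    """
    lags = np.arange(1, max_lag + 1)
    # V_p(l) = mean |X_{t+l} - X_t|^p, expected to scale like l^(p*H)
    v = np.array([np.mean(np.abs(x[l:] - x[:-l]) ** p) for l in lags])
    slope = np.polyfit(np.log(lags), np.log(v), 1)[0]
    H = slope / p                 # Hurst-type roughness exponent
    return 2.0 - H                # fractal dimension D = 2 - H

# Sanity check: Brownian motion has H = 0.5, hence dimension D = 1.5
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(100_000))
d_mado = variation_dim(bm, p=1)   # madogram estimate, close to 1.5
```

Setting p = 1 makes the log-log regression less sensitive to heavy tails and outliers than the p = 2 variogram version, which is the robustness property the abstract emphasizes.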
A Wavelet Perspective on the Allan Variance. Percival, Donald B. IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, April 2016, Vol. 63, No. 4. Journal article, peer-reviewed.
The origins of the Allan variance trace back 50 years to two seminal papers, one by Allan (1966) and the other by Barnes (1966). Since then, the Allan variance has played a leading role in the characterization of high-performance time and frequency standards. Wavelets first arose in the early 1980s in the geophysical literature, and the discrete wavelet transform (DWT) became prominent in the late 1980s in the signal processing literature. Flandrin (1992) briefly documented a connection between the Allan variance and a wavelet transform based upon the Haar wavelet. Percival and Guttorp (1994) noted that one popular estimator of the Allan variance, the maximal overlap estimator, can be interpreted in terms of a version of the DWT now widely referred to as the maximal overlap DWT (MODWT). In particular, when the MODWT is based on the Haar wavelet, the variance of the resulting wavelet coefficients (the wavelet variance) is identical to the Allan variance when the latter is multiplied by one-half. The theory behind the wavelet variance can thus deepen our understanding of the Allan variance. In this paper, we review basic wavelet variance theory with an emphasis on the Haar-based wavelet variance and its connection to the Allan variance. We then note that estimation theory for the wavelet variance offers a means of constructing asymptotically correct confidence intervals (CIs) for the Allan variance without reverting to the common practice of specifying a power-law noise type a priori. We also review recent work on specialized estimators of the wavelet variance that are of interest when some observations are missing (gappy data) or in the presence of contamination (rogue observations or outliers). It is a simple matter to adapt these estimators to become estimators of the Allan variance. Finally, we note that wavelet variances based upon wavelets other than the Haar offer interesting generalizations of the Allan variance.
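The one-half relationship between the Haar-based wavelet variance and the Allan variance can be verified numerically. The sketch below uses simple overlapped estimators with simplified boundary handling; the function names are assumptions for this illustration, not the paper's notation.

```python
import numpy as np

def allan_variance(y, m):
    """Overlapped Allan variance of fractional-frequency data y at
    averaging factor m: half the mean squared difference of adjacent
    m-point averages."""
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")  # running m-averages
    d = ybar[m:] - ybar[:-m]
    return 0.5 * np.mean(d ** 2)

def haar_wavelet_variance(y, m):
    """Haar MODWT-style wavelet variance at scale m: the mean square of
    W_t = (current m-average - previous m-average) / 2."""
    ybar = np.convolve(y, np.ones(m) / m, mode="valid")
    w = 0.5 * (ybar[m:] - ybar[:-m])
    return np.mean(w ** 2)

rng = np.random.default_rng(1)
y = rng.standard_normal(4096)      # white frequency noise
av = allan_variance(y, 4)
wv = haar_wavelet_variance(y, 4)   # equals av / 2, term by term
```

Because each wavelet coefficient is exactly half the averaged difference that enters the Allan variance, the two estimates agree up to the factor of one-half identically, not just in expectation.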
The meteorology and oceanography of the southeastern Bering Sea shelf were recently dominated by a multi-year warm event (2000–2005) followed by a multi-year cold event (2007–2010). We put these recent events into the context of the 95-year air temperature record from St. Paul Island and of concurrent spatial meteorological fields. For March 2000–2005 the mean air temperature anomaly at St. Paul was 2.1°C above the long-term mean, and for March 2007–2010 the mean temperature anomaly at St. Paul was 4.7°C below the long-term mean. The only multi-year temperature deviations comparable to the first decade of the 2000s are a cold event from 1971 to 1976 followed by a warm event from 1978 to 1983. There was also a short warm event in 1935–1937. The temperature transition between warm and cold events in the 1970s and 2000s took two years. While there are theoretical arguments for some physical memory processes in the North Pacific climate system, we cannot rule out that the recent warm and cold events are of a random nature: they are rare in the St. Paul temperature record, they are dominated by North Pacific-wide sea level pressure events rather than local Bering Sea processes, and they are consistent with a red noise model of climate variability. The 1970s transition appears to have an ENSO (El Niño–Southern Oscillation) influence, while the recent events are likely connected to Arctic-wide warming. Evidence provided by the 95-year St. Paul meteorological record reinforces the idea that a red-noise model of climate variability is appropriate for the North Pacific and southeastern Bering Sea. We stress the importance of relatively rare sub-decadal events and shifts, rather than multi-decadal variability associated with the Pacific Decadal Oscillation (PDO). Thus, in the future we can expect large positive and negative excursions in the region that can last for multiple years, but there is as yet little predictability for their timing and duration.
Odanacatib is a potent, selective, and neutral cathepsin K inhibitor which was developed to address the metabolic liabilities of the Cat K inhibitor L-873724. Substituting P1 and modifying the P2 side chain led to a metabolically robust inhibitor with a long half-life in preclinical species. Odanacatib was more selective in whole cell assays than the published Cat K inhibitors balicatib and relacatib. Evaluation in dermal fibroblast culture showed minimal intracellular collagen accumulation relative to less selective Cat K inhibitors.
To characterize patterns of depressed mood during the menopausal transition (MT) in relation to age and MT-related factors and to assess the contribution of factors related to depressed mood at earlier points in the life span.
Women (N = 508) were recruited from 1990 to 1992 from multiethnic neighborhoods and followed annually through 2005: 302 met the eligibility criteria for analyses reported here. The Center for Epidemiologic Studies Depression scale (CES-D) and a menstrual calendar were completed annually throughout the study. A subset of women provided a first morning voided urine specimen from 1997 through 2005. Urine samples were assayed for estrone glucuronide, follicle-stimulating hormone, testosterone, and cortisol. Mixed effects modeling was used to identify changes in CES-D scores over time, including the relationship to age, MT-related factors, and factors related to depression at other points in the life span (postpartum depression/blues, life stress, or family history of clinical depression).
Age was modestly and negatively related to CES-D scores, but MT stage alone was not, except that the late MT stage was significantly related to depressed mood. Hot flash activity, life stress, family history of depression, history of "postpartum blues," sexual abuse history, body mass index, and use of antidepressants were also individually related to depressed mood; the hormonal assays and age of entry into and duration of late MT stage were unrelated.
Although women in the late MT stage are vulnerable to depressed mood, factors that account for depressed mood earlier in the life span continue to have an important influence and should be considered in studies of etiology and therapeutics.
The ability to accurately forecast potential hazards posed to coastal communities by tsunamis generated seismically in both the near and far field requires knowledge of so-called source coefficients, from which the strength of a tsunami can be deduced. Seismic information alone can be used to set the source coefficients, but the values so derived reflect the dynamics of movement at or below the seabed and hence might not accurately describe how this motion is manifested in the overlying water column. We describe here a method for refining source coefficient estimates based on seismic information by making use of data from Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys (tsunameters). The method involves using these data to adjust precomputed models via an inversion algorithm so that residuals between the adjusted models and the DART data are as small as possible in a least squares sense. The inversion algorithm is statistically based and hence has the ability to assess uncertainty in the estimated source coefficients. We describe this inversion algorithm in detail and apply it to the November 2006 Kuril Islands event as a case study.
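The least-squares adjustment step can be sketched generically. In the snippet below, the unit-source waveforms, true coefficients, and noise level are all made-up stand-ins; the point is the normal-equations machinery, which also yields the uncertainties for the estimated source coefficients that the abstract mentions.

```python
import numpy as np

# Hypothetical stand-ins: two precomputed unit-source waveforms (columns
# of G) and a synthetic "DART" record d built from known coefficients.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)
G = np.column_stack([np.sin(2 * np.pi * 3 * t), np.sin(2 * np.pi * 5 * t)])
true_coeff = np.array([1.2, -0.4])
d = G @ true_coeff + 0.05 * rng.standard_normal(t.size)

# Least-squares source coefficients: minimize ||d - G @ alpha||^2
alpha, ssr, *_ = np.linalg.lstsq(G, d, rcond=None)

# Statistical by-product: residual variance and coefficient covariance,
# from which standard errors (uncertainties) follow
sigma2 = ssr[0] / (t.size - G.shape[1])
cov = sigma2 * np.linalg.inv(G.T @ G)
se = np.sqrt(np.diag(cov))
```

The covariance matrix of the fitted coefficients is what turns a plain inversion into a statistically based one: it quantifies how well the DART data constrain each source coefficient.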
There has been considerable recent interest in using wavelets to analyze time series and images that can be regarded as realizations of certain 1-D and 2-D stochastic processes on a regular lattice. Wavelets give rise to the concept of the wavelet variance (or wavelet power spectrum), which decomposes the variance of a stochastic process on a scale-by-scale basis. The wavelet variance has been applied to a variety of time series, and a statistical theory for estimators of this variance has been developed. While there have been applications of the wavelet variance in the 2-D context (in particular, in works by Unser in 1995 on wavelet-based texture analysis for images and by Lark and Webster in 2004 on analysis of soil properties), a formal statistical theory for such analysis has been lacking. In this paper, we develop the statistical theory by generalizing and extending some of the approaches developed for time series, thus leading to a large-sample theory for estimators of 2-D wavelet variances. We apply our theory to simulated data from Gaussian random fields with exponential covariances and from fractional Brownian surfaces. We demonstrate that the wavelet variance is potentially useful for texture discrimination. We also use our methodology to analyze images of four types of clouds observed over the southeast Pacific Ocean.
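A unit-scale 2-D Haar decomposition already illustrates the directional, scale-by-scale variance split that the paper formalizes. This is a deliberately simplified sketch (one scale, non-overlapping blocks, ad hoc 1/4 normalization), not the paper's estimator; the function name and the texture-discrimination check are assumptions.

```python
import numpy as np

def haar2d_detail_variances(img):
    """Unit-scale 2-D Haar detail mean squares of an image with even
    dimensions: (horizontal, vertical, diagonal) components.  Wavelet
    coefficients are zero-mean for stationary fields, so the mean
    square estimates the wavelet variance."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]   # top-left, top-right
    c = img[1::2, 0::2]; d = img[1::2, 1::2]   # bottom-left, bottom-right
    h = (a - b + c - d) / 4.0    # differences across columns
    v = (a + b - c - d) / 4.0    # differences across rows
    dg = (a - b - c + d) / 4.0   # diagonal differences
    return (h ** 2).mean(), (v ** 2).mean(), (dg ** 2).mean()

rng = np.random.default_rng(3)
field = rng.standard_normal((64, 64))          # isotropic white-noise field
h_ms, v_ms, d_ms = haar2d_detail_variances(field)
# for an isotropic field the row-wise and column-wise components agree
# in expectation; a large imbalance suggests anisotropic texture
```

Comparing the directional components across scales is one simple route to the kind of texture discrimination the abstract describes.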
Multiscale analysis of univariate time series has appeared in the literature at an ever increasing rate. Here we introduce the multiscale analysis of covariance between two time series using the discrete wavelet transform. The wavelet covariance and wavelet correlation are defined and applied to this problem as an alternative to traditional cross-spectrum analysis. The wavelet covariance is shown to decompose the covariance between two stationary processes on a scale-by-scale basis. Asymptotic normality is established for estimators of the wavelet covariance and correlation. Both quantities are generalized into the wavelet cross covariance and cross correlation in order to investigate possible lead/lag relationships. A thorough analysis of interannual variability for the Madden-Julian oscillation is performed using a 35+ year record of daily station pressure series. The time localization of the discrete wavelet transform allows the subseries, which are associated with specific physical time scales, to be partitioned into both seasonal periods (such as summer and winter) and also according to El Niño-Southern Oscillation (ENSO) activity. Differences in variance and correlation between these periods may then be firmly established through statistical hypothesis testing. The daily station pressure series used here show clear evidence of increased variance and correlation in winter across Fourier periods of 16–128 days. During warm episodes of ENSO activity, a reduced variance is observed across Fourier periods of 8–512 days for the station pressure series from Truk Island and little or no correlation between station pressure series for the same periods.
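The scale-wise covariance and correlation idea can be sketched with Haar-style coefficients. This is a minimal illustration with made-up helper names; it uses a single overlapped Haar filter at one scale rather than the full DWT pyramid the paper employs.

```python
import numpy as np

def haar_coeffs(x, m):
    """Haar MODWT-style wavelet coefficients at scale m: half the
    difference of adjacent m-point running averages."""
    xbar = np.convolve(x, np.ones(m) / m, mode="valid")
    return 0.5 * (xbar[m:] - xbar[:-m])

def wavelet_cov(x, y, m):
    """Scale-m wavelet covariance: mean product of the coefficients."""
    return np.mean(haar_coeffs(x, m) * haar_coeffs(y, m))

def wavelet_corr(x, y, m):
    """Scale-m wavelet correlation, normalized to lie in [-1, 1]."""
    return wavelet_cov(x, y, m) / np.sqrt(
        wavelet_cov(x, x, m) * wavelet_cov(y, y, m))

rng = np.random.default_rng(4)
common = np.cumsum(rng.standard_normal(2048))   # shared low-frequency signal
x = common + 0.1 * rng.standard_normal(2048)
y = common + 0.1 * rng.standard_normal(2048)
r4 = wavelet_corr(x, y, 4)   # near 1: scale-4 variability is shared
```

Shifting one input before forming the coefficients turns the same machinery into the wavelet cross covariance used to probe lead/lag relationships.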
A test for isotropy of images modeled as stationary or intrinsically stationary random fields on a lattice is developed. The test is based on the wavelet theory, and can operate on the horizontal and vertical scale of choice, or on any combination of scales. Scale is introduced through the wavelet variances (sometimes called the wavelet power spectrum), which decompose the variance over different horizontal and vertical spatial scales. The method is more general than existing tests for isotropy, since it handles intrinsically stationary random fields as well as second-order stationary fields. The performance of the method is demonstrated on samples from different random fields, and compared with three existing methods. It is competitive with or outperforms existing methods since it consistently rejects close to the nominal level for isotropic fields while having a rejection rate for anisotropic fields comparable with the existing methods in the stationary case, and superior in the intrinsic case. As practical examples, paper density images of handsheets and mammogram images are analyzed.
The wavelet variance is a scale-based decomposition of the process variance for a time series and has been used to analyze, for example, time deviations in atomic clocks, variations in soil properties in agricultural plots, accumulation of snow fields in the polar regions and marine atmospheric boundary layer turbulence. We propose two new unbiased estimators of the wavelet variance when the observed time series is 'gappy,' i.e., is sampled at regular intervals, but certain observations are missing. We deduce the large sample properties of these estimators and discuss methods for determining an approximate confidence interval for the wavelet variance. We apply our proposed methodology to series of gappy observations related to atmospheric pressure data and Nile River minima.
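For the unit scale, a simple gap-tolerant Haar estimator conveys the basic idea: form coefficients only from pairs in which both observations are present. This is a hedged sketch (one scale only, missing values encoded as NaN), not the paper's unbiased estimators; the function name is an assumption.

```python
import numpy as np

def gappy_haar_wvar_scale1(x):
    """Unit-scale Haar wavelet variance from a gappy series (NaN = missing):
    mean of ((x_t - x_{t-1}) / 2)^2 over pairs with both values observed."""
    d = 0.5 * (x[1:] - x[:-1])    # NaN wherever either neighbor is missing
    ok = ~np.isnan(d)
    return np.mean(d[ok] ** 2)

x = np.array([0.0, 1.0, np.nan, 3.0, 4.0, 6.0])
wv = gappy_haar_wvar_scale1(x)    # uses the three fully observed pairs
```

At longer scales a coefficient depends on many observations, so handling gaps without bias is harder; that is the problem the paper's estimators address.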