This paper defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.
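The building block of this framework is the non-decimated wavelet coefficient: the wavelet filter is applied at every time shift, not only at dyadic locations, and squaring the coefficients gives a raw estimate of power by time and scale. The sketch below illustrates this for Haar wavelets; the function names and normalization are my own, and the bias correction and smoothing developed in the paper are omitted.

```python
import numpy as np

def nondecimated_haar_coeffs(x, scale):
    """Non-decimated (translation-invariant) Haar coefficients at dyadic
    scale 2**scale: a unit-norm left-minus-right block-mean contrast,
    computed at every time shift."""
    h = 2 ** scale
    kernel = np.concatenate([np.full(h, 1.0), np.full(h, -1.0)]) / np.sqrt(2 * h)
    # Correlate (reverse the kernel before convolving); 'valid' keeps only
    # shifts where the wavelet fits entirely inside the series.
    return np.convolve(x, kernel[::-1], mode="valid")

def raw_wavelet_periodogram(x, n_scales):
    """Squared non-decimated Haar coefficients: an unsmoothed, uncorrected
    picture of how power varies over time and scale."""
    return [nondecimated_haar_coeffs(x, j) ** 2 for j in range(n_scales)]
```

A constant series has zero detail power at every scale, while a level shift produces a large coefficient at the shift location.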
We treat nonparametric stochastic regression using smooth design-adapted wavelets built by means of the lifting scheme. The proposed method automatically adapts to the nature of the regression problem, that is, to the irregularity of the design, to data on the interval, and to arbitrary sample sizes (which do not need to be a power of 2). As such, this method provides a uniform solution to the usual criticisms of first-generation wavelet estimators. More precisely, starting from the unbalanced Haar basis orthogonal with respect to the empirical design measure, we use weighted average interpolation to construct biorthogonal wavelets with a higher number of vanishing analyzing moments. We include a lifting step that improves the conditioning through constrained local semiorthogonalization. We propose a wavelet thresholding algorithm and show its numerical performance both on real data and in simulations including white, correlated, and heteroscedastic noise.
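The starting point, the unbalanced Haar basis, replaces the usual equal-split Haar function with a left-vs-right mean contrast over unequally sized halves, normalized so that it has unit norm and annihilates constants under the empirical (counting) measure. A minimal sketch of one such detail coefficient (the function name and interface are illustrative, not the paper's construction):

```python
import numpy as np

def unbalanced_haar_detail(y, split):
    """Unbalanced Haar detail coefficient for a block whose left half has
    `split` points and whose right half has the rest. The underlying vector
    psi has unit norm and psi @ constant == 0 (one vanishing moment)."""
    n = len(y)
    nl, nr = split, n - split
    psi = np.concatenate([
        np.full(nl,  np.sqrt(nr / (n * nl))),   # positive weight on the left
        np.full(nr, -np.sqrt(nl / (n * nr))),   # negative weight on the right
    ])
    return float(psi @ y)
```

The weighted average interpolation and semiorthogonalizing lifting steps of the paper then raise the number of vanishing moments and improve conditioning on top of this basis.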
We develop a procedure for analyzing multivariate nonstationary time series using the SLEX library (smooth localized complex exponentials), which is a collection of bases, each basis consisting of waveforms that are orthogonal and time-localized versions of the Fourier complex exponentials. Under the SLEX framework, we build a family of multivariate models that can explicitly characterize the time-varying spectral and coherence properties. Every model has a spectral representation in terms of a unique SLEX basis. Before selecting a model, we first decompose the multivariate time series into nonstationary components with uncorrelated (nonredundant) spectral information. The best SLEX model is selected using the penalized log energy criterion, which we derive in this article to be the Kullback-Leibler distance between a model and the SLEX principal components of the multivariate time series. The model selection criterion takes into account all of the pairwise cross-correlation simultaneously in the multivariate time series. The proposed SLEX analysis gives results that are easy to interpret, because it is an automatic time-dependent generalization of the classical Fourier analysis of stationary time series. Moreover, the SLEX method uses computationally efficient algorithms and hence is able to provide a systematic framework for extracting spectral features from a massive dataset. We illustrate the SLEX analysis with an application to a multichannel brain wave dataset recorded during an epileptic seizure.
In this paper, we address the situation where we cannot differentiate wavelet-based threshold procedures because their sets of well-estimated functions (maxisets) are not nested. As a generic solution, we propose to proceed via a combination of these procedures in order to achieve new procedures which perform better in the sense that the involved maxisets contain the union of the previous ones. Throughout the paper we propose illuminating interpretations of the maxiset results and provide conditions to ensure that this combination generates larger maxisets. As an example, we propose to combine vertical- and horizontal-block thresholding procedures that are already known to perform well. We discuss the limitation of our method, and we check our theoretical results through numerical experiments.
A consistent estimator for the spectral density of a stationary random process can be obtained by smoothing the periodogram across frequencies. An important component of smoothing is the choice of the span. Lee (1997) proposed a span selector that was erroneously claimed to be unbiased for the mean squared error. The naive use of mean squared error has some important drawbacks in this context because the variance of the periodogram depends on its mean, i.e. the spectrum. We propose a new span selector based on the generalised cross-validation function derived from the gamma deviance. This criterion, originally developed for use in fitting generalised additive models, utilises the approximate full likelihood of periodograms, which asymptotically behave like independently distributed chi-squared, i.e. gamma, random variables. The proposed span selector is very simple and easily implemented. Simulation results suggest that the proposed span selector generally outperforms those obtained under a mean squared error criterion.
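The idea can be sketched as follows: smooth the periodogram with a running mean of a candidate span, score the fit with the gamma deviance between raw and smoothed ordinates, apply a GCV-style correction, and pick the span minimising the score. The smoother, the exact form of the GCV denominator, and the reflection boundary handling below are illustrative assumptions, not the paper's precise criterion.

```python
import numpy as np

def smooth_periodogram(I, span):
    """Running mean of the periodogram over `span` neighbouring frequencies
    (span assumed odd), with reflection at the boundaries."""
    pad = span // 2
    Ip = np.concatenate([I[pad:0:-1], I, I[-2:-pad - 2:-1]])
    return np.convolve(Ip, np.full(span, 1.0 / span), mode="valid")

def gamma_gcv(I, span):
    """GCV score built from the gamma deviance, exploiting that periodogram
    ordinates behave asymptotically like independent gamma (exponential)
    variables with mean equal to the spectrum."""
    f = smooth_periodogram(I, span)
    r = I / f
    dev = np.mean(r - np.log(r) - 1.0)          # mean gamma deviance
    return dev / (1.0 - 1.0 / span) ** 2        # illustrative GCV correction

def select_span(I, spans):
    """Pick the candidate span minimising the gamma-deviance GCV score."""
    return min(spans, key=lambda m: gamma_gcv(I, m))
```

For a flat periodogram the deviance is exactly zero, since every smoothed ordinate equals the raw one.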
Factor modelling of a large time series panel has widely proven useful to reduce its cross-sectional dimensionality. This is done by explaining common co-movements in the panel through the existence of a small number of common components, up to some idiosyncratic behaviour of each individual series. To capture serial correlation in the common components, a dynamic structure is used as in traditional (uni- or multivariate) time series analysis of second order structure, i.e. allowing for infinite-length filtering of the factors via dynamic loadings. In this paper, motivated from economic data observed over long time periods which show smooth transitions over time in their covariance structure, we allow the dynamic structure of the factor model to be non-stationary over time by proposing a deterministic time variation of its loadings. In this respect we generalize the existing recent work on static factor models with time-varying loadings as well as the classical, i.e. stationary, dynamic approximate factor model. Motivated from the stationary case, we estimate the common components of our dynamic factor model by the eigenvectors of a consistent estimator of the now time-varying spectral density matrix of the underlying data-generating process. This can be seen as a time-varying principal components approach in the frequency domain. We derive consistency of this estimator in a “double-asymptotic” framework of both cross-section and time dimension tending to infinity. The performance of the estimators is illustrated by a simulation study and an application to a macroeconomic data set.
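The frequency-domain PCA step can be illustrated crudely: estimate a local spectral density matrix from a window around a time point, then take the leading eigenvectors of that Hermitian matrix at each frequency. The sketch below uses a single unsmoothed windowed cross-periodogram as a stand-in for the paper's consistent time-varying estimator; names and the window convention are assumptions.

```python
import numpy as np

def local_cross_periodogram(X, t0, h):
    """Cross-periodogram matrix of the window X[t0-h:t0+h] for a (T, p)
    panel: a localized, unsmoothed surrogate for the time-varying spectral
    density matrix at each Fourier frequency of the window."""
    seg = X[t0 - h:t0 + h]                       # (2h, p) window around t0
    d = np.fft.rfft(seg, axis=0)                 # per-series DFTs, (freqs, p)
    return np.einsum("fi,fj->fij", d, d.conj()) / (2 * h)

def leading_dynamic_components(S, q):
    """q leading eigenvectors of the Hermitian spectral matrix at every
    frequency: the frequency-domain principal components used to extract
    the common components."""
    w, v = np.linalg.eigh(S)                     # batched, ascending order
    return v[..., -q:]
```

In practice the cross-periodogram would be smoothed over both time and frequency before the eigendecomposition; this sketch only fixes the shapes and the Hermitian structure.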
A Wavelet-Based Test for Stationarity
Von Sachs, Rainer; Neumann, Michael H.
Journal of Time Series Analysis, September 2000, Volume 21, Issue 5
Journal Article · Peer-reviewed · Open access
We develop a test for stationarity of a time series against the alternative of a time-varying covariance structure. Using localized versions of the periodogram, we obtain empirical versions of a reasonable notion of a time-varying spectral density. Coefficients with respect to a Haar wavelet series expansion of such a time-varying periodogram are an indicator of whether there is some deviation from covariance stationarity. We propose a test based on the limit distribution of these empirical coefficients.
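The ingredients of such a statistic can be sketched directly: compute periodograms on consecutive blocks, take Haar differences of adjacent blocks at each frequency, and look at the largest coefficient. Under stationarity these differences have mean zero; a time-varying variance inflates them. The block scheme and the unstandardized maximum below are illustrative; the paper's calibration against the limit distribution is omitted.

```python
import numpy as np

def localized_periodograms(x, block_len):
    """Periodograms on consecutive non-overlapping blocks: a crude empirical
    version of a time-varying spectral density, shape (n_blocks, freqs)."""
    n_blocks = len(x) // block_len
    blocks = x[:n_blocks * block_len].reshape(n_blocks, block_len)
    return np.abs(np.fft.rfft(blocks, axis=1)) ** 2 / block_len

def haar_time_coefficients(P):
    """Finest-scale Haar coefficients across time, per frequency: scaled
    differences of adjacent localized periodograms."""
    m = (P.shape[0] // 2) * 2                    # use an even number of blocks
    return (P[1:m:2] - P[0:m:2]) / np.sqrt(2.0)

def max_haar_statistic(x, block_len):
    """Maximum absolute Haar coefficient -- the raw ingredient of the test
    statistic, without the standardization needed for a formal test."""
    d = haar_time_coefficients(localized_periodograms(x, block_len))
    return float(np.max(np.abs(d)))
```

A series with strongly time-varying variance produces a much larger maximum than white noise of comparable scale.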
In this paper, we apply a new time-frequency spectral estimation method for multichannel data to epileptiform electroencephalography (EEG). The method is based on the smooth localized complex exponentials (SLEX) functions which are time-frequency localized versions of the Fourier functions and, hence, are ideal for analyzing nonstationary signals whose spectral properties evolve over time. The SLEX functions are simultaneously orthogonal and localized in time and frequency because they are obtained by applying a projection operator rather than a window or taper. In this paper, we present the Auto-SLEX method which is a statistical method that 1) computes the periodogram using the SLEX transform, 2) automatically segments the signal into approximately stationary segments using an objective criterion that is based on log energy, and 3) automatically selects the optimal bandwidth of the spectral smoothing window. The method is applied to the intracranial EEG from a patient with temporal lobe epilepsy. This analysis reveals a reduction in average duration of stationarity in preseizure epochs of data compared to baseline. These changes begin up to hours prior to electrical seizure onset in this patient.
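The segmentation step 2) can be illustrated with a simplified dyadic best-segmentation search: recursively compare the penalized log-energy cost of a block against the summed costs of its two halves, and split only when splitting is cheaper. Plain windowed periodograms stand in for the SLEX transform here, and the cost function and penalty are illustrative assumptions, not the paper's criterion.

```python
import numpy as np

def log_energy_cost(x, beta=1.0):
    """Penalized log-energy cost of treating a segment as stationary: the
    sum of log periodogram ordinates plus a complexity penalty."""
    n = len(x)
    I = np.abs(np.fft.rfft(x)) ** 2 / n
    return np.sum(np.log(I + 1e-12)) + beta * np.sqrt(n)

def best_segmentation(x, min_len, beta=1.0):
    """Best dyadic segmentation, bottom-up: keep a block whole when its cost
    beats its two halves, else split. Returns contiguous (start, end) pairs."""
    def rec(lo, hi):
        cost_whole = log_energy_cost(x[lo:hi], beta)
        if hi - lo < 2 * min_len:                # too short to split further
            return cost_whole, [(lo, hi)]
        mid = (lo + hi) // 2
        cl, segl = rec(lo, mid)
        cr, segr = rec(mid, hi)
        if cl + cr < cost_whole:
            return cl + cr, segl + segr
        return cost_whole, [(lo, hi)]
    return rec(0, len(x))[1]
```

Since splitting doubles the sqrt(n)-type penalty, the search favours long blocks unless the data genuinely look different in the two halves.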
A macrotile estimation algorithm is introduced to estimate the covariance of locally stationary processes. A macrotile algorithm uses a penalized method to optimize the partition of the space into orthogonal subspaces, and the estimation is computed with a projection operator. It is implemented by searching for a best basis among a dictionary of orthogonal bases and by constructing an adaptive segmentation of this basis to estimate the covariance coefficients. The macrotile algorithm provides a consistent estimation of the covariance of locally stationary processes, using a dictionary of local cosine bases. This estimation is computed with a fast algorithm. Macrotile algorithms apply to other estimation problems such as the removal of additive noise in signals. This simpler problem is used as an intuitive guide to better understand the case of covariance estimation. Examples of removal of white noise from sounds illustrate the results.
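The simpler denoising problem used as the intuitive guide can itself be sketched in a few lines: expand the noisy signal in an orthonormal basis and keep only coefficients that stand out from the noise level. An orthonormal Haar basis and the universal hard threshold below are illustrative choices, not the macrotile construction or its local cosine dictionary.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar basis matrix for n a power of two (rows = basis
    vectors), built recursively from averages and differences."""
    if n == 1:
        return np.array([[1.0]])
    H = haar_matrix(n // 2)
    top = np.kron(H, [1.0, 1.0]) / np.sqrt(2)            # averaging rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2)  # detail rows
    return np.vstack([top, bot])

def denoise(y, sigma):
    """Hard thresholding in the Haar basis at the universal threshold
    sigma * sqrt(2 log n): coefficients below the noise level are zeroed,
    then the signal is resynthesized."""
    n = len(y)
    W = haar_matrix(n)
    c = W @ y
    c[np.abs(c) < sigma * np.sqrt(2 * np.log(n))] = 0.0
    return W.T @ c
```

A constant signal survives thresholding exactly, since all its energy sits in the single averaging coefficient, which far exceeds the threshold.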
We propose a new method for analyzing bivariate nonstationary time series. The proposed method is a statistical procedure that automatically segments the time series into approximately stationary blocks and selects the span to be used to obtain the smoothed estimates of the time-varying spectra and coherence. It is based on the smooth localized complex exponential (SLEX) transform, which forms a library of orthogonal complex-valued transforms that are simultaneously localized in time and frequency. We show that the smoothed SLEX periodograms are consistent estimators, report simulation results, and apply the method to a two-channel electroencephalogram dataset recorded during an epileptic seizure.