Time series econometrics is a rapidly evolving field. In particular, the cointegration revolution has had a substantial impact on applied analysis. Consequently, no textbook has managed to cover the full range of methods in current use and explain how to proceed in applied domains. This gap in the literature motivates the present volume. The methods are sketched out, reminding the reader of the ideas underlying them and providing sufficient background for empirical work. The treatment can also serve as a textbook for a course on applied time series econometrics. Topics include: unit root and cointegration analysis, structural vector autoregressions, conditional heteroskedasticity, and nonlinear and nonparametric time series models. Crucial to empirical work is the software available for analysis. New methodology is typically incorporated into existing software packages only gradually. A flexible Java interface has therefore been created, allowing readers to replicate the applications and conduct their own analyses.
Randomised controlled trials (RCTs) are considered the gold standard for evaluating the causal effects of healthcare interventions. When an RCT cannot be used (e.g. for ethical reasons), the interrupted time series (ITS) design, one of the strongest quasi-experimental designs, is a possible alternative. The aim of this methodological study was to describe how ITS designs are used in the healthcare setting, their design characteristics, and how they are reported.
We searched MEDLINE for reports of ITS designs published in 2015 which had a minimum of two data points collected pre-intervention and one post-intervention. There was no restriction on participants, language of study, or type of outcome. Data were summarised using appropriate summary statistics.
One hundred and sixteen studies were included. The interventions evaluated were mainly programmes (41; 35%) and policies (32; 28%). Data were usually collected at monthly intervals (74 studies; 64%). Of the 115 studies that reported an analysis, the most common method was segmented regression (78%); 55% considered autocorrelation, and only seven reported a sample size calculation. Intervention effects were most often reported as a change in slope (84%) or a change in level (70%), while 21% reported a long-term change in level.
This methodological study identified problems in the reporting of the design features and results of ITS studies, and it highlights the need for formal reporting guidelines and further methodological work.
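The segmented regression method that dominated the reviewed analyses can be illustrated with a minimal sketch. The data here are simulated stand-ins (a hypothetical monthly outcome with a known level and slope change); real analyses would also address autocorrelation, which plain least squares ignores.

```python
import numpy as np

# Hypothetical monthly series: 24 pre-intervention and 12 post-intervention points.
rng = np.random.default_rng(0)
t = np.arange(36)                       # time index (months)
post = (t >= 24).astype(float)          # indicator: 1 after the intervention
t_since = np.where(t >= 24, t - 24, 0)  # months elapsed since the intervention

# Simulate an outcome with a baseline trend, a level change, and a slope change.
y = 10 + 0.5 * t - 3.0 * post + 0.8 * t_since + rng.normal(0, 0.5, t.size)

# Segmented regression design matrix: intercept, baseline slope,
# change in level (post), and change in slope (t_since).
X = np.column_stack([np.ones_like(t, dtype=float), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"level change: {beta[2]:.2f}")   # estimate of the simulated -3.0 level shift
print(f"slope change: {beta[3]:.2f}")   # estimate of the simulated +0.8 slope change
```

The "change in slope" and "change in level" effect estimates reported by most of the reviewed studies correspond directly to the last two coefficients of this design matrix.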
Objective
Network analysis in psychology has ushered in a potentially revolutionary way of analyzing clinical data. One novel methodology is the construction of temporal networks, models that examine directionality between symptoms over time. This paper provides context for how these models are applied to clinically relevant longitudinal data.
Methods
We survey the statistical and methodological issues involved in temporal network analysis and describe the available estimation tools and applications for conducting such analyses. We also provide supplemental R code and discuss simulations examining temporal networks that vary in sample size, number of variables, and number of time points.
Results
The following packages and software are reviewed: graphicalVAR, mlVAR, gimme, SparseTSCGM, mgm, psychonetrics, and the Mplus dynamic structural equation modeling module. We discuss the utility each procedure has for specific design considerations.
Conclusion
We conclude with notes on resources for estimating these models, emphasizing how temporal networks best approximate network theory.
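The paper's supplemental code is in R; as a language-agnostic illustration of the core idea behind the reviewed packages (graphicalVAR, mlVAR, etc.), here is a minimal Python sketch of a lag-1 vector autoregression, where the estimated coefficient matrix is the temporal network: entry (i, j) is the directed edge from symptom i at time t-1 to symptom j at time t. The variables and coefficients are simulated, not from the paper.

```python
import numpy as np

# Toy temporal network for 3 "symptoms" observed over 300 time points.
rng = np.random.default_rng(1)
T, p = 300, 3
true_B = np.array([[0.5, 0.3, 0.0],   # row i, col j: effect of i(t-1) on j(t)
                   [0.0, 0.4, 0.0],
                   [0.0, 0.0, 0.2]])
X = np.zeros((T, p))
for t in range(1, T):
    X[t] = X[t - 1] @ true_B + rng.normal(0, 1, p)

# OLS estimate of the lagged-effects matrix: regress X[t] on X[t-1].
lagged, current = X[:-1], X[1:]
B_hat, *_ = np.linalg.lstsq(lagged, current, rcond=None)

print(np.round(B_hat, 2))  # recovered directed edges, approximately true_B
```

Dedicated packages add exactly the refinements this sketch omits: regularization for small samples, multilevel structure for many subjects, and separation of temporal from contemporaneous networks.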
Randomized controlled trials (RCTs) are the gold standard for making causal inferences, but they are often not feasible in addiction research for ethical and logistical reasons. Observational data from real-world settings have increasingly been used to guide clinical decisions and public health policies. This paper introduces the potential outcomes framework for causal inference and summarizes well-established causal analysis methods for observational data, including matching, inverse probability of treatment weighting, the instrumental variable method, and interrupted time-series analysis with controls. It provides examples from addiction research, along with guidance and analysis code for conducting these analyses on example data sets.
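One of the listed methods, inverse probability of treatment weighting (IPTW), can be sketched on simulated data (all names and numbers here are illustrative, not from the paper). A confounder drives both treatment and outcome, so the naive difference in means is biased, while weighting each unit by the inverse of its treatment probability recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
x = rng.normal(0, 1, n)                       # confounder
p_treat = 1 / (1 + np.exp(-x))                # true propensity score
a = rng.binomial(1, p_treat)                  # treatment assignment
y = 2.0 * a + 1.5 * x + rng.normal(0, 1, n)   # outcome; true effect = 2.0

# Naive comparison is biased because treated units have higher x.
naive = y[a == 1].mean() - y[a == 0].mean()

# Inverse probability of treatment weights (propensity known here;
# in practice it is estimated, e.g. by logistic regression).
w = a / p_treat + (1 - a) / (1 - p_treat)
iptw = np.average(y, weights=w * a) - np.average(y, weights=w * (1 - a))

print(f"naive: {naive:.2f}, IPTW: {iptw:.2f}")  # IPTW is close to 2.0
```

The weights create a pseudo-population in which treatment is independent of the measured confounder, mimicking the balance an RCT achieves by randomization.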
A highly comparative, feature-based approach to time series classification is introduced that uses an extensive database of algorithms to extract thousands of interpretable features from time series. These features are drawn from across the scientific time-series analysis literature and include summaries of time series in terms of their correlation structure, distribution, entropy, stationarity, scaling properties, and fits to a range of time-series models. After computing thousands of features for each time series in a training set, those most informative of the class structure are selected by greedy forward feature selection with a linear classifier. The resulting feature-based classifiers automatically learn the differences between classes using a reduced number of time-series properties and circumvent the need to calculate distances between time series. Representing time series in this way yields orders of magnitude of dimensionality reduction, allowing the method to perform well on very large data sets containing long time series or time series of different lengths. For many of the data sets studied, classification performance exceeded that of conventional instance-based classifiers, including one-nearest-neighbor classifiers using Euclidean distance and dynamic time warping. Most importantly, the selected features provide an understanding of the properties of the data set, insight that can guide further scientific investigation.
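A toy version of the pipeline can be sketched in a few lines (with a tiny illustrative feature set, not the paper's library of thousands): summarize each series by interpretable statistics, select the most class-informative feature, and classify with a simple threshold instead of computing distances between series.

```python
import numpy as np

rng = np.random.default_rng(3)

def lag1_autocorr(x):
    x = x - x.mean()
    return (x[:-1] * x[1:]).sum() / (x * x).sum()

def features(x):
    # A tiny, interpretable feature vector: distribution and correlation summaries.
    return np.array([x.mean(), x.std(), lag1_autocorr(x)])

def simulate(cls, n=200):
    # Class 0: white noise. Class 1: AR(1) series with coefficient 0.7.
    e = rng.normal(0, 1, n)
    if cls == 0:
        return e
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.7 * x[t - 1] + e[t]
    return x

X = np.array([features(simulate(c)) for c in [0, 1] * 100])
y = np.array([0, 1] * 100)

# Greedy selection of the single most informative feature: largest gap
# between class means relative to the pooled within-class spread.
m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
s0, s1 = X[y == 0].std(0), X[y == 1].std(0)
score = np.abs(m0 - m1) / np.sqrt(0.5 * (s0**2 + s1**2))
best = int(np.argmax(score))  # expect the autocorrelation feature (index 2)

# Linear (threshold) classifier on the selected feature.
thr = 0.5 * (X[y == 0, best].mean() + X[y == 1, best].mean())
acc = ((X[:, best] > thr).astype(int) == y).mean()
print(best, round(acc, 2))
```

The payoff mirrors the paper's point: each 200-point series collapses to three numbers, and the selected feature (lag-1 autocorrelation) is itself an interpretable statement about what separates the classes.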
SUMMARY
Uniaxial compression tests with combined acousto-optical monitoring techniques are conducted on flawed granite specimens, with the aim of investigating the fracture-related acoustic emission (AE) event-rate characteristics of the unstable cracking phase in flawed rocks. The interevent time (IET) function F(τ) is adopted to interpret the AE time series from damage stress (σcd) to ultimate failure, and photographic data are used to evaluate unstable cracking behaviours in flawed granite. The results show that a high AE event rate is consistently registered but is intermittently interrupted by macrofracturing during the unstable cracking phase. A reversed U-shaped relation between the AE event rate and loading time is documented in unstable flawed granite for the first time. The acoustic quiescence has a mechanistic linkage and quantitative correlation with the stress drop, and this synchronous acousto-mechanical behaviour is a typical result of the initiation, growth and coalescence of macrocracks initiated from the flaw tips. Moreover, the reactivation and intensification of fracture process zones (FPZs) under increasing loads are the dominant mechanism triggering unstable crack growth in flawed granite.
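The interevent-time quantities underlying an F(τ) analysis are straightforward to compute from a catalogue of AE event timestamps. The sketch below uses synthetic timestamps (a steady high event rate interrupted by a quiet gap, mimicking acoustic quiescence around a stress drop); the rates and durations are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate AE event times: two Poisson-like bursts (~100 events/s) separated
# by a 2 s quiescent gap, mimicking macrofracture-induced quiescence.
burst1 = np.cumsum(rng.exponential(0.01, 300))
gap_start = burst1[-1] + 2.0
burst2 = gap_start + np.cumsum(rng.exponential(0.01, 300))
events = np.concatenate([burst1, burst2])

iet = np.diff(events)        # interevent times tau, the input to F(tau)
rate = 1.0 / iet.mean()      # mean AE event rate over the whole record
print(round(iet.max(), 2), round(rate, 1))
```

The single large interevent time stands out against the otherwise short τ values, which is how quiescent intervals become visible in IET-based statistics.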
With increasing load requirements and the growing sophistication of power stations, knowing the electrical load in advance, not only over short-term horizons such as hours or a couple of days but also over longer-term horizons such as weeks and months, is indispensable because of its important technical and economic impacts. Traditional methods such as ARMA, SARIMA, and ARMAX have been used for decades. In recent years, artificial intelligence (AI) techniques such as neural networks and deep learning have emerged in the field of time series analysis. In particular, artificial neural networks (ANNs) and recurrent neural networks (RNNs) have shown promise in producing much better forecasts than traditional methods. Long short-term memory (LSTM) networks are a special kind of RNN with the capability to learn long-term dependencies. In this work, we use an electrical load data set with exogenous variables including temperature, humidity, and wind speed to train an LSTM network. For a fair comparison, the same data are also used to fit traditional time series models of the load. The trained LSTM network and the fitted models are then used to forecast over horizons of 24 hours, 48 hours, 7 days, and 30 days, and the forecasts are compared using RMSE and MAPE for all horizons. The results of a number of experiments show that the LSTM-based forecasts outperform the other methods and have the potential to further improve forecast accuracy.
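The two evaluation metrics used in the comparison are standard and easy to state precisely. Below is a minimal sketch with a stylized daily load cycle and a simulated forecast standing in for the actual model outputs (the series and error levels are assumptions for illustration only).

```python
import numpy as np

def rmse(actual, forecast):
    # Root mean squared error: penalizes large errors quadratically.
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

def mape(actual, forecast):
    # Mean absolute percentage error; assumes the actual values are nonzero.
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

rng = np.random.default_rng(4)
hours = np.arange(48)
load = 500 + 100 * np.sin(2 * np.pi * hours / 24)  # stylized daily load cycle (MW)
forecast = load + rng.normal(0, 10, hours.size)    # imperfect 48-hour forecast

print(round(rmse(load, forecast), 1), round(mape(load, forecast), 2))
```

Reporting both is common practice: RMSE is in the units of the load itself, while MAPE is scale-free and easier to compare across horizons of different lengths.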
Summary
A novel, multidimensional small baseline subset (MSBAS) methodology is presented for integrating multiple interferometric synthetic aperture radar (InSAR) data sets to compute 2-D or 3-D time series of deformation. The proposed approach allows the combination of all available airborne and spaceborne SAR data acquired with different acquisition parameters, temporal and spatial sampling and resolution, wave-band and polarization. The produced time series have improved temporal resolution and can be further enhanced by applying either regularization or temporal filtering to remove high-frequency noise. We apply this methodology to map 2003–2010 ground deformation of the Virunga Volcanic Province (VVP), North Kivu, Democratic Republic of Congo. The horizontal and vertical time series of ground displacement clearly identify lava compaction areas, long-term deformation of Mt Nyamuragira, and pre- and co-eruptive deformation in 2004, 2006 and 2010. Provided that enough SAR data are available, the method opens new opportunities for detecting ground motion in the VVP and elsewhere.
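The geometric core of combining tracks into 2-D deformation can be sketched for a single pixel and epoch (the incidence angle and motion values below are illustrative assumptions, and the full MSBAS inversion adds temporal sampling and regularization on top of this). Each line-of-sight (LOS) measurement is a projection of the true east and vertical motion, so ascending and descending acquisitions together determine both components.

```python
import numpy as np

theta = np.radians(34)  # assumed radar incidence angle
# LOS unit-vector components (east, up) for ascending and descending passes;
# the east-looking component flips sign between the two viewing geometries.
A = np.array([[-np.sin(theta), np.cos(theta)],   # ascending track
              [ np.sin(theta), np.cos(theta)]])  # descending track

true_motion = np.array([0.02, -0.05])  # 2 cm eastward, 5 cm subsidence (m)
los = A @ true_motion                  # what each track would observe

# Least-squares inversion of the two LOS observations for (east, up) motion.
est, *_ = np.linalg.lstsq(A, los, rcond=None)
print(np.round(est, 3))                # recovers [0.02, -0.05]
```

With only ascending and descending geometries the system is exactly determined for two components; adding data sets with further geometries is what makes the 3-D case, and the regularized time-series inversion, possible.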
To meet the high reliability demands of increasingly large and complex supercomputing systems, this paper proposes a multidimensional fusion CBA-net (CNN-BiLSTM-Attention) fault prediction model that uses HDBSCAN clustering to preprocess and classify the data. The model effectively extracts and learns the spatial and temporal features in the predecessor fault logs, combining high sensitivity to time-series features with thorough extraction of local features. Experiments show that the model's RMSE for predicting fault occurrence time is 0.031 and that its average prediction accuracy for the node location of a fault is 93%. The model converges quickly and improves fine-grained, accurate fault prediction for large supercomputers.