Time series econometrics is a rapidly evolving field. In particular, the cointegration revolution has had a substantial impact on applied analysis. Consequently, no textbook has managed to cover the full range of methods in current use and explain how to proceed in applied domains. This gap in the literature motivates the present volume. The methods are sketched out, reminding the reader of the ideas underlying them and giving sufficient background for empirical work. The treatment can also be used as a textbook for a course on applied time series econometrics. Topics include: unit root and cointegration analysis, structural vector autoregressions, conditional heteroskedasticity, and nonlinear and nonparametric time series models. Crucial to empirical work is the software that is available for analysis. New methodology is typically incorporated into existing software packages only gradually. A flexible Java interface has therefore been created, allowing readers to replicate the applications and conduct their own analyses.
Randomised controlled trials (RCTs) are considered the gold standard for evaluating the causal effects of healthcare interventions. When RCTs cannot be used (e.g. for ethical reasons), the interrupted time series (ITS) design is a possible alternative; it is one of the strongest quasi-experimental designs. The aim of this methodological study was to describe how ITS designs are used, their design characteristics, and how they are reported in the healthcare setting.
We searched MEDLINE for reports of ITS designs published in 2015 that had a minimum of two data points collected pre-intervention and one post-intervention. There were no restrictions on participants, language of study, or type of outcome. Data were summarised using appropriate summary statistics.
One hundred and sixteen studies were included. The interventions evaluated were mainly programmes (41; 35%) and policies (32; 28%). Data were usually collected at monthly intervals (74; 64%). Of the 115 studies that reported an analysis, the most common method was segmented regression (78%); 55% considered autocorrelation, and only seven reported a sample size calculation. Intervention effects were reported as a change in slope (84%) and a change in level (70%), and 21% reported a long-term change in level.
This methodological study identified problems in the reporting of design features and results of ITS studies, and it highlights the need for formal reporting guidelines and further methodological work.
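Segmented regression, the most common analysis method in the studies reviewed above, fits a pre-intervention level and trend together with a post-intervention change in level and change in slope. A minimal sketch in Python (the function name and synthetic data below are illustrative assumptions, not taken from the study):

```python
import numpy as np

def segmented_regression(y, intervention_idx):
    """Fit the standard ITS segmented-regression model by ordinary least squares:
        y_t = b0 + b1*t + b2*post_t + b3*(t - T0)*post_t + e_t
    where post_t = 1 from the intervention time T0 onward.
    Returns (b0, b1, b2, b3): pre-intervention level and slope,
    then change in level and change in slope."""
    n = len(y)
    t = np.arange(n, dtype=float)
    post = (t >= intervention_idx).astype(float)
    t_since = np.where(post == 1.0, t - intervention_idx, 0.0)
    X = np.column_stack([np.ones(n), t, post, t_since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta
```

Note that this plain-OLS sketch ignores autocorrelation, which (as the review found) only 55% of published ITS analyses considered; in practice one would use generalized least squares or Newey-West standard errors.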
Objective
Network analysis in psychology has ushered in a potentially revolutionary way of analyzing clinical data. One novel methodology is the construction of temporal networks, models that examine directional relations between symptoms over time. This paper provides context for how these models are applied to clinically relevant longitudinal data.
Methods
We provide a survey of statistical and methodological issues involved in temporal network analysis, describing available estimation tools and applications for conducting such analyses. Further, we provide supplemental R code and discuss simulations of temporal networks that vary in sample size, number of variables, and number of time points.
Results
The following packages and software are reviewed: graphicalVAR, mlVAR, gimme, SparseTSCGM, mgm, psychonetrics, and the Mplus dynamic structural equation modeling module. We discuss the utility of each procedure for specific design considerations.
Conclusion
We conclude with notes on resources for estimating these models, emphasizing how temporal networks best approximate network theory.
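At their core, most temporal-network estimators reviewed above (e.g. graphicalVAR, mlVAR) build on vector autoregression: each variable at time t is regressed on all variables at time t-1, and the estimated lag-1 coefficients become the directed edges of the network. A minimal unregularized sketch in Python (the reviewed R packages add regularization and multilevel structure; the function below is an illustrative assumption, not their implementation):

```python
import numpy as np

def estimate_var1(X):
    """Estimate the lag-1 coefficient matrix A of a VAR(1) model
        X_t = A @ X_{t-1} + e_t
    by multivariate least squares. Rows of X are time points, columns
    are variables (e.g. symptoms). Entry A[i, j] is the directed
    temporal-network edge from variable j to variable i."""
    Y, Z = X[1:], X[:-1]
    # Solve min ||Y - Z @ A.T||_F, i.e. A.T = argmin of the stacked regressions.
    At, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    return At.T
```

With enough time points the off-diagonal entries recover which variable predicts which, which is exactly the directionality information that distinguishes temporal networks from contemporaneous ones.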
Machine Learning (ML) methods have been proposed in the academic literature as alternatives to statistical ones for time series forecasting. Yet scant evidence is available about their relative performance in terms of accuracy and computational requirements. The purpose of this paper is to evaluate such performance across multiple forecasting horizons using a large subset of 1045 monthly time series from the M3 Competition. After comparing the post-sample accuracy of popular ML methods with that of eight traditional statistical ones, we found that the former are dominated across both accuracy measures used and for all forecasting horizons examined. Moreover, their computational requirements are considerably greater than those of statistical methods. The paper discusses the results, explains why the accuracy of ML models is below that of statistical ones, and proposes some possible ways forward. The empirical results stress the need for objective and unbiased ways to test the performance of forecasting methods, which can be achieved through sizable and open competitions that allow meaningful comparisons and definite conclusions.
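Post-sample accuracy in the M-competitions is typically scored with the symmetric MAPE, and simple statistical benchmarks such as the seasonal naive method are surprisingly hard to beat on monthly data. A minimal sketch of both in Python (illustrative only; the paper's exact accuracy measures and benchmark set are not reproduced here):

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error (percentage form),
    averaged over the forecast horizon."""
    a = np.asarray(actual, dtype=float)
    f = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

def seasonal_naive(history, horizon, season=12):
    """Seasonal naive forecast: repeat the last observed seasonal cycle.
    A standard statistical benchmark for monthly (season=12) series."""
    history = np.asarray(history, dtype=float)
    last_cycle = history[-season:]
    return np.array([last_cycle[h % season] for h in range(horizon)])
```

Comparing an ML model's sMAPE against such a benchmark, horizon by horizon, is the kind of objective post-sample evaluation the paper argues for.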
Randomized controlled trials (RCTs) are the gold standard for making causal inferences, but RCTs are often not feasible in addiction research for ethical and logistic reasons. Observational data from real-world settings have been increasingly used to guide clinical decisions and public health policies. This paper introduces the potential outcomes framework for causal inference and summarizes well-established causal analysis methods for observational data, including matching, inverse probability treatment weighting, the instrumental variable method, and interrupted time-series analysis with controls. It provides examples from addiction research, along with guidance and analysis code for conducting these analyses on example data sets.
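Of the methods listed, inverse probability treatment weighting reweights each observation by the inverse of the probability of receiving the treatment it actually received, so that the weighted sample mimics a randomized trial. A minimal Horvitz-Thompson-style sketch in Python, assuming propensity scores have already been estimated (the data in the usage note are a constructed illustration, not from the paper):

```python
import numpy as np

def iptw_ate(y, treated, propensity):
    """Horvitz-Thompson IPTW estimate of the average treatment effect:
        E[Y(1)] - E[Y(0)] = mean(T * Y / e) - mean((1 - T) * Y / (1 - e))
    where T is the treatment indicator and e the propensity score,
    i.e. the estimated probability of treatment given confounders."""
    y = np.asarray(y, dtype=float)
    t = np.asarray(treated, dtype=float)
    e = np.asarray(propensity, dtype=float)
    return np.mean(t * y / e) - np.mean((1.0 - t) * y / (1.0 - e))
```

In a confounded sample where the naive difference in means is biased (because a confounder raises both treatment probability and outcome), this weighting recovers the true effect, provided the propensity model is correct and all confounders are measured.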
SUMMARY
Uniaxial compression tests with combined acousto-optical monitoring techniques are conducted on flawed granite specimens, with the aim of investigating the fracture-related acoustic emission (AE) event rate characteristics at the unstable cracking phase in flawed rocks. The interevent time (IET) function F(τ) is adopted to interpret the AE time-series from damage stress (σcd) to ultimate failure, and photographic data are used to evaluate unstable cracking behaviours in flawed granite. The results show that a high AE event rate is always registered but is intermittently interrupted by macrofracturing at the unstable cracking phase. A reversed U-shaped relation between the AE event rate and the loading time is documented in unstable flawed granite for the first time. The acoustic quiescence has a mechanistic linkage and quantitative correlation with stress drop, and this synchronous acousto-mechanical behaviour is a typical result of the initiation, growth and coalescence of macrocracks initiated from the flaw tips. Moreover, the reactivation and intensification of fracture process zones (FPZs) by increasing loads are the dominant mechanism triggering unstable crack growth in flawed granite.
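The interevent-time analysis above rests on two simple quantities computed from the AE catalogue: the gaps between consecutive event timestamps and the event rate within consecutive time windows. A minimal sketch in Python (function names and window choice are illustrative assumptions; the study's F(τ) function itself is not reproduced here):

```python
import numpy as np

def interevent_times(event_times):
    """Interevent times (IETs): gaps between consecutive AE event timestamps."""
    return np.diff(np.sort(np.asarray(event_times, dtype=float)))

def event_rate(event_times, t_start, t_end, window):
    """AE event rate: events per unit time in consecutive fixed-width windows.
    Acoustic quiescence shows up as windows with a rate near zero."""
    times = np.asarray(event_times, dtype=float)
    edges = np.arange(t_start, t_end + window, window)
    counts, _ = np.histogram(times, bins=edges)
    return counts / window
```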
With increasing load requirements and the growing sophistication of power stations, forecasting electrical load not only over short horizons such as hours or a few days but also over longer horizons such as weeks and months is indispensable, with important technical and economic impacts. Traditional methods such as ARMA, SARIMA, and ARMAX have been used for decades. In recent years, artificial intelligence (AI) techniques such as neural networks and deep learning have been emerging in the field of time series analysis. In particular, artificial neural networks (ANN) and recurrent neural networks (RNN) are being explored and have shown promise in producing much better forecasts than traditional methods. Long short-term memory (LSTM) networks are a special kind of RNN with the capability to learn long-term dependencies. In this work, we use electrical load data with exogenous variables including temperature, humidity, and wind speed. The data are used to train an LSTM network and, for a fair comparison, to fit traditional models of the load time series. The trained LSTM network and the fitted models are then used to forecast over horizons of 24 hours, 48 hours, 7 days, and 30 days. The LSTM forecasts are compared with the results of the traditional methods using RMSE and MAPE for all forecast horizons. The results of a number of experiments show that the LSTM-based forecasts outperform the other methods and have the potential to further improve forecast accuracy.
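The long-term dependencies mentioned above are handled by the LSTM's gating mechanism: a forget gate decides how much of the cell state to retain, an input gate decides how much new information to write, and an output gate decides what to expose as the hidden state. A single cell step can be sketched in plain numpy (the weight layout and names are illustrative assumptions; a real load-forecasting model would use a framework such as Keras or PyTorch):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step. The four gate pre-activations for the input (i),
    forget (f), and output (o) gates and the candidate state (g) are stacked
    row-wise in W (4n x m), U (4n x n), and b (4n,), where m = input size
    and n = hidden size. Returns the new hidden and cell states."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:n])          # input gate: how much new info to write
    f = sigmoid(z[n:2 * n])     # forget gate: how much old state to keep
    o = sigmoid(z[2 * n:3 * n]) # output gate: how much state to expose
    g = np.tanh(z[3 * n:])      # candidate cell state
    c_new = f * c + i * g       # additive update lets gradients persist
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

The additive cell-state update `f * c + i * g` is what lets information (and gradients) flow across many time steps, which is why LSTMs can exploit weekly or monthly patterns that plain RNNs struggle with.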
Summary
A novel, multidimensional small baseline subset (MSBAS) methodology is presented for the integration of multiple interferometric synthetic aperture radar (InSAR) data sets to compute 2- or 3-D time-series of deformation. The proposed approach allows the combination of all available airborne and spaceborne SAR data acquired with different acquisition parameters, temporal and spatial sampling and resolution, wave-band and polarization. The produced time-series have improved temporal resolution and can be enhanced by applying either regularization or temporal filtering to remove high-frequency noise. We apply this methodology to map 2003-2010 ground deformation of the Virunga Volcanic Province (VVP), North Kivu, Democratic Republic of Congo. The horizontal and vertical time-series of ground displacement clearly identify lava compaction areas, long-term deformation of Mt Nyamuragira, and the 2004, 2006 and 2010 pre- and coeruptive deformation. Provided that enough SAR data are available, the method opens new opportunities for detecting ground motion in the VVP and elsewhere.
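The core of combining SAR geometries, as described above, is a per-pixel least-squares inversion: each line-of-sight (LOS) displacement is the projection of the ground-motion vector onto that acquisition's unit look vector, so two or more geometries constrain the 2-D (east, up) motion. A minimal sketch in Python (the look-vector values in the test are illustrative; MSBAS additionally links acquisition epochs in time and applies regularization, which this sketch omits):

```python
import numpy as np

def decompose_los(los_disp, los_vectors):
    """Least-squares decomposition of line-of-sight displacements from
    multiple SAR viewing geometries into ground motion components.
    los_disp: LOS displacement per acquisition geometry.
    los_vectors: one row per geometry, giving the (east, up) components
    of that geometry's unit look vector, so that d_los = A @ m.
    Returns the estimated motion vector m = (east, up)."""
    A = np.asarray(los_vectors, dtype=float)
    d = np.asarray(los_disp, dtype=float)
    m, *_ = np.linalg.lstsq(A, d, rcond=None)
    return m
```

With ascending and descending passes the east and up components are well constrained; the north component is poorly resolved by polar-orbiting SAR, which is why 2-D (east, up) decompositions are the common case.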