The development of deep learning technology has brought great improvements to the field of time series forecasting. Short sequence time-series forecasting no longer satisfies the current research community, and long-term future prediction is becoming the hotspot; this task is known as long sequence time-series forecasting (LSTF). LSTF has been widely studied in the extant literature, but few reviews of its development have been reported. In this article, we provide a comprehensive survey of LSTF studies that use deep learning technology. We propose rigorous definitions of LSTF and summarize its evolution in terms of a proposed taxonomy based on network structure. Next, we discuss three key problems and their corresponding solutions: long-dependency modeling, computation cost, and evaluation metrics. In particular, we propose a Kruskal–Wallis test based evaluation method to address the evaluation-metrics problem. We further synthesize the applications, datasets, and open-source code of LSTF. Moreover, we conduct extensive case studies comparing the proposed Kruskal–Wallis test based evaluation method with existing metrics, and the results demonstrate its effectiveness. Finally, we propose potential research directions in this rapidly growing field. All resources and code are assembled and organized under a unified framework that is available online at https://github.com/Masterleia/TSF_LSTF_Compare.
•Long sequence time-series forecasting (LSTF) is defined from two perspectives.
•We propose a new taxonomy and give a comprehensive review of LSTF.
•A Kruskal–Wallis test based LSTF performance evaluation method is proposed.
•Abundant resources of TSF and LSTF are collected, including an open-source library.
•We summarize four possible future research directions.
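The Kruskal–Wallis evaluation idea above can be illustrated with a minimal sketch: compare the per-window error distributions of several forecasting models with the non-parametric H-test. The data and models here are hypothetical stand-ins; the survey's exact evaluation protocol may differ.

```python
# Hedged sketch: testing whether three models' per-window absolute-error
# distributions differ via the Kruskal-Wallis H-test. The error samples
# are synthetic stand-ins for real backtest errors.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)

# Hypothetical per-window absolute errors for three forecasting models.
errors_a = rng.gamma(shape=2.0, scale=1.0, size=200)   # baseline
errors_b = rng.gamma(shape=2.0, scale=1.05, size=200)  # similar to baseline
errors_c = rng.gamma(shape=2.0, scale=2.0, size=200)   # clearly worse

h_stat, p_value = kruskal(errors_a, errors_b, errors_c)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("At least one model's error distribution differs significantly.")
```

Because the test works on ranks rather than raw errors, it is robust to the heavy-tailed error distributions that single-number metrics such as MSE can be dominated by.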
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
2.
Time-series forecasting with deep learning: a survey
Lim, Bryan; Zohren, Stefan
Philosophical transactions of the Royal Society of London. Series A: Mathematical, physical, and engineering sciences,
04/2021, Volume 379, Issue 2194
Journal Article
Peer reviewed
Open access
Numerous deep learning architectures have been developed to accommodate the diversity of time-series datasets across different domains. In this article, we survey common encoder and decoder designs used in both one-step-ahead and multi-horizon time-series forecasting, describing how each model incorporates temporal information into its predictions. Next, we highlight recent developments in hybrid deep learning models, which combine well-studied statistical models with neural network components to improve upon pure methods in either category. Lastly, we outline some ways in which deep learning can facilitate decision support with time-series data. This article is part of the theme issue 'Machine learning for weather and climate modelling'.
Financial time series forecasting is undoubtedly the top choice of computational intelligence for finance researchers in both academia and the finance industry due to its broad implementation areas and substantial impact. Machine Learning (ML) researchers have created various models, and a vast number of studies have been published accordingly. As such, a significant number of surveys exist covering ML studies on financial time series forecasting. Lately, Deep Learning (DL) models have appeared within the field, with results that significantly outperform their traditional ML counterparts. Even though there is a growing interest in developing models for financial time series forecasting, there is a lack of review papers that focus solely on DL for finance. Hence, the motivation of this paper is to provide a comprehensive literature review of DL studies on financial time series forecasting. We not only categorize the studies according to their intended forecasting implementation areas, such as index, forex, and commodity forecasting, but also group them based on their DL model choices, such as Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), and Long Short-Term Memory (LSTM). We also try to envision the future of the field by highlighting its possible setbacks and opportunities for the benefit of interested researchers.
•We reviewed all searchable articles on deep learning (DL) for financial time series forecasting.
•RNN-based DL models (including LSTM and GRU) are the most common.
•We compared DL models according to their performance on different forecasted asset classes.
•To the best of our knowledge, this is the first comprehensive DL survey for financial time series forecasting.
•We provide the current status of DL in financial time series forecasting and highlight future opportunities.
•A new hybrid ARIMA–ANN method is proposed for time series forecasting.
•Our new hybrid method avoids making strong assumptions like existing methods do.
•The method achieved better forecasting accuracy than many existing ARIMA–ANN models.
•The performance of the proposed hybrid method is improved by using EMD.
Many applications in different domains produce large amounts of time series data. Making accurate forecasts is critical for many decision makers. Various time series forecasting methods exist that use linear and nonlinear models separately or a combination of both. Studies show that combining linear and nonlinear models can be effective in improving forecasting performance. However, some of the assumptions that existing methods make might restrict their performance in certain situations. We provide a new Autoregressive Integrated Moving Average (ARIMA)–Artificial Neural Network (ANN) hybrid method that works in a more general framework. Experimental results show that the strategies for decomposing the original data and for combining linear and nonlinear models throughout the hybridization process are key factors in the forecasting performance of these methods. Using these findings, the proposed hybrid method is combined with the Empirical Mode Decomposition (EMD) technique, which generates more predictable components. We show that our hybrid method with EMD can be an effective way to improve the forecasting accuracy obtained by traditional hybrid methods as well as by any of the individual methods used separately.
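The core hybrid pattern this abstract describes can be sketched in a few lines: fit a linear model, then fit a nonlinear learner to its residuals, and add the two one-step forecasts. The sketch below uses a least-squares AR fit and a 1-nearest-neighbour residual model as illustrative stand-ins for ARIMA and the ANN; the data is synthetic.

```python
# Hedged sketch of the hybrid linear + nonlinear-residual idea (not the
# paper's exact ARIMA-ANN method).
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(300)
y = 0.8 * np.sin(0.1 * t) + 0.05 * t + rng.normal(0, 0.1, 300)

def lag_matrix(x, p):
    # rows: [x[i-p], ..., x[i-1]] for each target x[i]
    return np.array([x[i - p:i] for i in range(p, len(x))]), x[p:]

p = 4
X, target = lag_matrix(y, p)
coef, *_ = np.linalg.lstsq(X, target, rcond=None)  # linear AR part
linear_fit = X @ coef
residuals = target - linear_fit

# Nonlinear residual model: 1-nearest-neighbour on lagged residuals.
Xr, tr = lag_matrix(residuals, p)
def knn_predict(q):
    idx = np.argmin(np.linalg.norm(Xr - q, axis=1))
    return tr[idx]

# One-step-ahead hybrid forecast from the last p observations.
linear_next = y[-p:] @ coef
resid_next = knn_predict(residuals[-p:])
hybrid_next = linear_next + resid_next
print(f"hybrid one-step forecast: {hybrid_next:.3f}")
```

The abstract's point is that how the data is decomposed before this step (e.g. with EMD) matters as much as the choice of the two component models.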
To improve the accuracy of stock market price forecasting, two hybrid forecasting models are proposed in this paper that combine two kinds of empirical mode decomposition (EMD) with long short-term memory (LSTM). A financial time series is a non-linear and non-stationary random signal, which can be decomposed into several intrinsic mode functions at different time scales by the original EMD and by complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN). To ensure the effect of historical data on the prediction result, LSTM prediction models are established for each characteristic series from the EMD and CEEMDAN decompositions. The final prediction results are obtained by reconstructing each prediction series. The forecasting performance of the proposed models is verified by linear regression analysis of the major global stock market indices. Compared with a single LSTM model, support vector machine (SVM), multi-layer perceptron (MLP), and other hybrid models, the experimental results show that the proposed models display better performance in one-step-ahead forecasting of financial time series.
•A new hybrid time series forecasting method is established by combining the EMD and CEEMDAN algorithms with an LSTM neural network.
•The model improves the forecasting efficiency of financial time series.
•The forecasting results of the proposed model are more accurate than those of other similar models.
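The decompose → forecast-each-component → reconstruct pipeline described above can be sketched generically. In this illustration a moving-average split stands in for EMD/CEEMDAN and a naive drift forecaster stands in for the per-component LSTMs, so only the pipeline shape, not the paper's models, is shown.

```python
# Hedged sketch of decompose -> forecast-per-component -> reconstruct.
# The decomposition and forecasters are simple stand-ins, not EMD/LSTM.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(400)
price = 100 + 0.02 * t + 2 * np.sin(0.05 * t) + rng.normal(0, 0.3, 400)

def decompose(x, window=20):
    # crude stand-in for EMD: smooth trend + oscillatory remainder
    kernel = np.ones(window) / window
    trend = np.convolve(x, kernel, mode="same")
    return trend, x - trend

def drift_forecast(component, horizon=1):
    # naive-with-drift forecaster standing in for a trained LSTM
    drift = (component[-1] - component[0]) / (len(component) - 1)
    return component[-1] + drift * horizon

trend, remainder = decompose(price)
# Forecast each component separately, then reconstruct by summation.
forecast = drift_forecast(trend) + drift_forecast(remainder)
print(f"reconstructed one-step forecast: {forecast:.2f}")
```

The key property illustrated: because the components sum exactly back to the series, reconstructing the final forecast is just the sum of the per-component forecasts.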
Time series forecasting (TSF) is the task of predicting future values of a given sequence using historical data. Recently, this task has attracted the attention of researchers in the area of machine learning as a way to address the limitations of traditional forecasting methods, which are time-consuming and complex. With the increasing availability of extensive amounts of historical data, along with the need to perform accurate production forecasting, a powerful forecasting technique that infers the stochastic dependency between past and future values is highly needed. In this paper, we propose a deep learning approach capable of addressing the limitations of traditional forecasting approaches and producing accurate predictions. The proposed approach is a deep long short-term memory (DLSTM) architecture, an extension of the traditional recurrent neural network. A genetic algorithm is applied to optimally configure the DLSTM architecture. For evaluation purposes, two case studies from the petroleum industry are carried out using the production data of two actual oilfields. Toward a fair evaluation, the performance of the proposed approach is compared with several standard methods, both statistical and soft computing. Using different measurement criteria, the empirical results show that the proposed DLSTM model outperforms the other standard approaches.
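The genetic-algorithm configuration step can be sketched as follows. The search space, fitness function, and GA operators here are illustrative assumptions, with a synthetic fitness standing in for the costly train-then-validate loop of a real DLSTM.

```python
# Hedged sketch: a tiny genetic algorithm searching DLSTM-style
# hyperparameters. The fitness function is a synthetic stand-in for
# negative validation error after training.
import random

random.seed(42)
LAYERS = [1, 2, 3, 4]
UNITS = [16, 32, 64, 128]

def fitness(individual):
    # pretend 2 layers / 64 units is the sweet spot for this
    # hypothetical dataset
    layers, units = individual
    return -abs(layers - 2) - abs(units - 64) / 32

def mutate(ind):
    # change one gene at random
    if random.random() < 0.5:
        return (random.choice(LAYERS), ind[1])
    return (ind[0], random.choice(UNITS))

population = [(random.choice(LAYERS), random.choice(UNITS)) for _ in range(8)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                      # selection (elitism)
    children = [mutate(random.choice(parents)) for _ in range(4)]
    population = parents + children               # next generation

best = max(population, key=fitness)
print(f"best configuration: layers={best[0]}, units={best[1]}")
```

In the real setting each fitness evaluation would train and validate one network, which is why a guided search beats exhaustive enumeration as the space grows.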
In the field of computational science and engineering, solving nonlinear transient problems still poses a challenging task that often requires significant computational resources. This research introduces a novel methodology that harnesses the power of cutting-edge Temporal Fusion Transformers (TFTs) to accelerate the solution of such problems in multi-query scenarios (i.e., parameterized problems). At each time step, TFT models, renowned for their time series forecasting capabilities, are combined with dimensionality reduction techniques to efficiently generate initial solutions for nonlinear solvers. Specifically, during the training phase, a reduced set of high-fidelity system solutions is obtained by solving the system of differential equations governing the problem for different parameter instances. Then, dimensionality reduction is applied to create a reduced latent space that simplifies the representation of the complex system solutions. Subsequently, TFT models are trained for one-step-ahead forecasting in the latent space, utilizing information from previous states to make accurate predictions about future states. The TFTs' predictions are fed back to the system as initial guesses at each time step of the solution algorithm and are then guided towards the exact solutions that satisfy equilibrium using Newton–Raphson (NR) iterations. The basic premise of the proposed idea is that accurate initial predictions will significantly decrease the number of costly NR iterations needed in nonlinear dynamic problems, effectively reducing the solution time. In addition, the proposed scheme is able to handle problems with parametric and time-variant forcing terms. A customized TFT architecture is developed that takes as input not only the response history of the system but also the current loading, and makes informed guesses about the future state of the system.
The methodology's effectiveness is demonstrated in numerical applications that involve high nonlinearity, where the TFT-generated initial solutions resulted in a notable reduction in the number of NR iterations required for solver convergence. This significant enhancement in computational efficiency holds substantial promise, especially in scenarios involving a multitude of analyses and high iteration demands, with wide-ranging applications across computational mechanics and related fields.
•A methodology to accelerate solving parametric nonlinear PDEs with time-varying terms.
•Uses dimensionality reduction to represent solutions in a reduced latent space.
•Transformers are trained for one-step forecasting in the latent space using previous states.
•TFT predictions are used as initial guesses, reducing iterations for the nonlinear solver.
•This boost in efficiency benefits parametric problems with high iteration demands.
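The basic premise above, that a good initial guess cuts Newton–Raphson iterations, can be demonstrated on a scalar nonlinear equation. The equation and guesses are illustrative; in the paper the warm start comes from the TFT's latent-space prediction.

```python
# Hedged sketch: Newton-Raphson on a scalar nonlinear equation converges
# in fewer iterations from an accurate warm start (standing in for the
# TFT-generated initial guess) than from a poor cold start.
def newton_raphson(f, df, x0, tol=1e-10, max_iter=100):
    x, iterations = x0, 0
    while abs(f(x)) > tol and iterations < max_iter:
        x -= f(x) / df(x)   # standard Newton update
        iterations += 1
    return x, iterations

# Nonlinear "equilibrium" equation: x^3 - 2x - 5 = 0 (root near 2.0946)
f = lambda x: x**3 - 2 * x - 5
df = lambda x: 3 * x**2 - 2

root_cold, iters_cold = newton_raphson(f, df, x0=10.0)   # poor guess
root_warm, iters_warm = newton_raphson(f, df, x0=2.1)    # "TFT" guess
print(f"cold start: {iters_cold} iterations, warm start: {iters_warm}")
```

In the multi-query PDE setting the same saving is multiplied across every time step of every parameter instance, which is where the reported speed-up comes from.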
•A demand forecasting method based on multi-layer LSTM networks is proposed.
•The proposed method improves forecasting accuracy.
•It has a strong ability to capture nonlinear patterns in time series data.
•The empirical results show that the method outperforms other standard techniques.
In a business environment with strict competition among firms, accurate demand forecasting is not straightforward. In this paper, we propose a demand forecasting method based on multi-layer LSTM networks that has a strong capability of predicting highly fluctuating demand data. The proposed method automatically selects the best forecasting model for a given time series by considering different combinations of LSTM hyperparameters using grid search. It can capture nonlinear patterns in time series data while accounting for the inherent characteristics of non-stationary data. The proposed method is compared with several well-known time series forecasting techniques, both statistical and computational intelligence methods, using the demand data of a furniture company. These methods include autoregressive integrated moving average (ARIMA), exponential smoothing (ETS), artificial neural network (ANN), K-nearest neighbors (KNN), recurrent neural network (RNN), support vector machines (SVM), and a single-layer LSTM. The experimental results indicate that the proposed method is superior to the tested methods in terms of performance measures.
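The grid-search model selection described above reduces to enumerating hyperparameter combinations and keeping the one with the lowest validation error. The grid and error function below are hypothetical; a synthetic error stands in for actually training each LSTM.

```python
# Hedged sketch of grid-search model selection over LSTM-style
# hyperparameters; validation_error is a stand-in for train-then-evaluate.
from itertools import product

grid = {
    "layers": [1, 2, 3],
    "units": [32, 64, 128],
    "dropout": [0.0, 0.2],
}

def validation_error(layers, units, dropout):
    # pretend 2 layers, 64 units, dropout 0.2 minimises error on this
    # hypothetical demand series
    return abs(layers - 2) + abs(units - 64) / 64 + abs(dropout - 0.2)

best_config, best_error = None, float("inf")
for layers, units, dropout in product(*grid.values()):
    err = validation_error(layers, units, dropout)
    if err < best_error:
        best_config, best_error = (layers, units, dropout), err

print(f"selected configuration: {best_config} (error {best_error:.3f})")
```

Unlike the genetic search used elsewhere in this list, grid search is exhaustive, which is tractable only because the hyperparameter grid here is small.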
The time series forecasting literature has highlighted the accuracy of hybrid systems that combine statistical linear and Machine Learning (ML) models by modeling the residuals. These systems separately model linear and nonlinear patterns, aiming to overcome the limitations of using only a single model. Such a system comprises three phases: linear modeling of the time series, forecasting of the residuals using an ML model, and final forecasting through the combination of the previous phases. Modeling the residuals is challenging because they may present heteroscedasticity, complex nonlinear patterns, and random fluctuations; hence, specifying a single ML model is a complex task. This work proposes a hybrid system that combines a linear statistical model with an ensemble of ML models to forecast real-world time series. The proposed method employs an ensemble in the residual-modeling phase, aiming to improve the generalization capacity of the system, reduce the risk of selecting an incorrect model, expand the function space, and increase the system's accuracy. Moreover, for each time series, a data-driven search is carried out for the ensemble parameters most suitable for that series. The experimental results show that the proposal attains superior performance and is statistically better than related systems in the literature.
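The three phases above can be sketched end to end. Here a least-squares trend fit stands in for the statistical linear model, and an ensemble of k-NN regressors with different k values stands in for the paper's ML ensemble; all names and data are illustrative.

```python
# Hedged sketch of the three-phase hybrid: linear model -> ensemble of
# nonlinear residual models -> combined forecast.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(250)
y = 0.03 * t + np.sin(0.2 * t) ** 2 + rng.normal(0, 0.05, 250)

# Phase 1: linear modelling (trend fit via least squares).
A = np.vstack([t, np.ones_like(t)]).T
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
residuals = y - (slope * t + intercept)

# Phase 2: ensemble of k-NN regressors on lagged residuals.
p = 5
X = np.array([residuals[i - p:i] for i in range(p, len(residuals))])
targets = residuals[p:]

def knn_forecast(query, k):
    idx = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    return targets[idx].mean()

query = residuals[-p:]
ensemble_resid = np.mean([knn_forecast(query, k) for k in (1, 3, 5)])

# Phase 3: combine linear and residual forecasts.
next_t = len(t)
forecast = slope * next_t + intercept + ensemble_resid
print(f"hybrid ensemble forecast: {forecast:.3f}")
```

Averaging several residual models, rather than committing to one, is exactly the hedge against model misspecification that the abstract motivates.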
Contrastive representation learning is crucial in time series analysis, as it alleviates the issues of data noise and incompleteness as well as the sparsity of supervision signals. However, existing contrastive learning frameworks usually focus on intra-temporal features, which fail to fully exploit the intricate nature of time series data. To address this issue, we propose DE-TSMCL, an innovative distillation-enhanced framework for long sequence time series forecasting. Specifically, we design a learnable data augmentation mechanism that adaptively learns whether to mask a timestamp to obtain optimized sub-sequences. Then, we propose a contrastive learning task with momentum update to explore the inter-sample and intra-temporal correlations of time series and learn the underlying structural features of the unlabeled series. Meanwhile, we design a supervised task to learn more robust representations and facilitate the contrastive learning process. Finally, we jointly optimize the two tasks. By combining the losses from multiple tasks, we can learn effective representations for the downstream forecasting task. Extensive experiments against the state of the art demonstrate the effectiveness of DE-TSMCL, where the maximum improvement reaches 27.3%. Source code for the algorithm is available at https://github.com/gaohaozhi/DE-TSMCL.
•We propose a distillation-enhanced framework for time series forecasting.
•It addresses the distribution-shift problem and the false-positive noise issue of contrastive learning methods.
•A teacher–student paradigm is adopted for contrastive learning to enhance model ability.
•Experiments on real-world datasets show the superiority of the proposed model.
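The masking-based augmentation DE-TSMCL's abstract describes can be sketched as follows: two randomly masked views of the same series form a positive pair for contrastive learning. The fixed mask probability here is an illustrative stand-in for the paper's learnable masking mechanism.

```python
# Hedged sketch: random timestamp masking producing two sub-sequence
# views of one series for contrastive learning.
import numpy as np

rng = np.random.default_rng(4)
series = np.sin(0.1 * np.arange(64)) + rng.normal(0, 0.05, 64)

def masked_view(x, mask_prob=0.25, rng=rng):
    # zero out a random subset of timestamps; keep the boolean mask so a
    # model could distinguish "masked" from a true zero value
    mask = rng.random(len(x)) >= mask_prob
    return np.where(mask, x, 0.0), mask

# Two stochastic views of the same series form a positive pair.
view_a, mask_a = masked_view(series)
view_b, mask_b = masked_view(series)

# Cosine similarity on jointly unmasked steps -- a crude stand-in for
# comparing learned representations of the two views.
both = mask_a & mask_b
sim = np.dot(view_a[both], view_b[both]) / (
    np.linalg.norm(view_a[both]) * np.linalg.norm(view_b[both]))
print(f"overlap: {both.sum()} steps, similarity: {sim:.3f}")
```

In the full framework an encoder maps each view to an embedding and the contrastive loss pulls positive pairs together while pushing other samples apart; the learnable mask adapts which timestamps to drop rather than using a fixed probability.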