•Decomposition-based hybrid wind speed forecasting model using bidirectional LSTM.•Decomposition techniques employed to denoise wind speed data are EMD, EEMD and EWT.•Extensive evaluation in terms of accuracy and stability on two different datasets.•Skip connections to enable training of deep networks for enhanced performance.•Data denoising and skip connections significantly improve forecasting accuracy.
The goal of sustainable development can be attained through the efficient management of renewable energy resources. Wind energy is attracting attention worldwide due to its renewable and sustainable nature. Accurate wind speed prediction is essential for the stable operation of wind turbines in generating wind power. However, the fluctuating and intermittent nature of wind speed makes accurate wind speed forecasting a challenging task. The proposed wind speed forecasting framework combines the features of various data decomposition techniques and Bidirectional Long Short Term Memory (BiDLSTM) networks. Presently, data decomposition models such as the Wavelet Transform are extensively employed in wind speed forecasting to improve the accuracy of the forecasting models. Hence, in this paper, various data decomposition techniques that can denoise the signal are investigated and applied to partition the input time series into several high- and low-frequency signals. The data decomposition methods, namely the Wavelet Transform, Empirical Mode Decomposition (EMD), Ensemble Empirical Mode Decomposition (EEMD), and Empirical Wavelet Transform (EWT), are applied to denoise the dataset. The low- and high-frequency sub-series are forecasted separately using BiDLSTM networks, and the forecasting outcomes of the low- and high-frequency signals are aggregated to obtain the final forecasting results. The empirical results establish that the proposed EWT-based hybrid model outperforms the other decomposition-based models in accuracy and stability. The performance of the EWT-BiDLSTM model is further compared with that of BiDLSTM networks with skip connections. The experimental results substantiate that the proposed decomposition-based hybrid deep BiDLSTM models with skip connections exhibit better prediction accuracy than the other models.
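The decompose → forecast-per-band → aggregate pipeline described above can be sketched as follows. This is a toy illustration, not the paper's implementation: the moving-average band split stands in for EWT/EMD, and the persistence forecaster stands in for the BiDLSTM.

```python
import numpy as np

def decompose(series, window=5):
    """Toy two-band decomposition standing in for EWT/EMD: a moving-average
    low-frequency trend plus the high-frequency residual."""
    kernel = np.ones(window) / window
    low = np.convolve(series, kernel, mode="same")
    high = series - low
    return low, high

def persistence_forecast(subseries, horizon=1):
    """Placeholder per-band forecaster (a BiDLSTM in the paper): repeats the
    last observed value of the sub-series."""
    return np.full(horizon, subseries[-1])

rng = np.random.default_rng(0)
# Synthetic wind speed: diurnal-like cycle plus noise (illustrative only).
wind = 8 + np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.3 * rng.standard_normal(200)

low, high = decompose(wind)
# Forecast each band separately, then sum the band forecasts, as in the
# hybrid framework.
forecast = persistence_forecast(low) + persistence_forecast(high)
```

Because the residual is defined as `series - low`, the bands reconstruct the original series exactly, which is the property the aggregation step relies on.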
Accurate wind speed forecasting (WSF) is important for effectively harnessing wind energy, with its clean and sustainable energy benefits. Therefore, this study develops different models built on correlation analysis (CA), decomposition techniques, the Harris hawks optimization algorithm (HHO), and sequence-to-sequence (S2S) based spatial and temporal attention (STAt-S2S) for effective WSF. First, CA selects the variables with significant correlation to the wind speed data. In the next stage, improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN) and the maximal overlap discrete wavelet transform (MODWT) are employed to decompose the significantly correlated components. Afterwards, HHO selects suitable features from the decomposed data. Finally, STAt-S2S extracts spatial and temporal features and performs forecasting. The CA-ICEEMDAN–HHO–STAt-S2S and CA-ICEEMDAN-STAt-S2S models reveal better forecasting outcomes than the other standalone and hybrid forecasting models. The RMSE, MAE, and sMAPE values of CA-ICEEMDAN-STAt-S2S are 0.639 m/s, 0.474 m/s, and 15.710%, with an NSE of 0.922. The low error values and high efficiency values of the ICEEMDAN-, HHO-, and STAt-S2S-based hybrid models corroborate their feasibility for WSF, with equal applicability to similar time series applications.
•Multivariate meteorological data is utilized for wind speed forecasting.•MODWT and ICEEMDAN decompose the data to reduce nonlinearity and nonstationarity.•Harris hawks optimization selects suitable decomposed subcomponents.•Spatiotemporal attention extracts spatial and temporal features, and the Sequence2Sequence framework performs forecasting.•CA-ICEEMDAN–HHO–STAt-S2S and CA-ICEEMDAN-STAt-S2S models reveal better results than the other developed models.
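The CA step above can be sketched as a simple Pearson-correlation filter. The variables, threshold, and synthetic data below are hypothetical illustrations; the abstract does not state which covariates or cut-off the authors used.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
wind = rng.standard_normal(n)
# Hypothetical meteorological covariates: two informative, one pure noise.
temp = 0.7 * wind + 0.3 * rng.standard_normal(n)
pressure = -0.5 * wind + 0.5 * rng.standard_normal(n)
humidity = rng.standard_normal(n)
candidates = {"temp": temp, "pressure": pressure, "humidity": humidity}

# CA step: keep variables whose |Pearson r| with wind speed passes a
# (hypothetical) significance threshold.
selected = {name: abs(float(np.corrcoef(x, wind)[0, 1]))
            for name, x in candidates.items()}
kept = [name for name, r in selected.items() if r > 0.3]
```

Only the variables surviving this filter would then be passed to the ICEEMDAN/MODWT decomposition stage.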
As the demand for renewable energy increases, wind speed forecasting (WSF) becomes increasingly popular and essential for wind power generation. Several methods have been developed in the literature for WSF. However, high WSF accuracy is difficult to obtain due to the non-stationary and non-linear character of wind speed. To overcome this issue, this study suggests a novel integrated forecasting model for WSF. In most cases, decomposition techniques are essential for removing noise and extracting patterns from non-stationary and non-linear wind speed time series. Among these, the most widely used decomposition methods based on frequency-domain theory are the Fourier decomposition method (FDM), the empirical wavelet transform (EWT), and variational mode decomposition (VMD). However, FDM gives inconsistent results, EWT suffers from mode-mixing problems, and VMD is unsuitable for non-stationary time series. Thus, a novel combined model based on the empirical Fourier decomposition (EFD) technique, long short-term memory (LSTM) neural networks, and the Grey Wolf optimizer (GWO) is proposed in this paper. Initially, the wind speed time series is decomposed using EFD into a set of sub-series and a residual, a stationary sub-series that can be easily modelled by a recurrent neural network (RNN). Further, a matching LSTM is used to forecast each sub-series and the residual, while the GWO optimizes the estimated sub-series outputs, resulting in improved prediction precision and stability. To assess the model performance, five wind speed datasets from various places in India are employed. The findings show that the proposed model is the top-performing model among the ten models compared; averaged over all datasets, it reduces the mean absolute error (MAE) by 11.01%, the root mean square error (RMSE) by 10.68%, and the mean absolute percentage error (MAPE) by 10.73%.
Further, regularity conditions for the universal consistency of the proposed model are also proved mathematically.
•A novel hybrid EFD-LSTM-GWO architecture for wind speed forecasting.•Resolving the mixed-mode, inconsistent, and non-stationary challenges.•Analysing the proposed model’s universal consistency.•Outperformance of the proposed algorithm over other existing models.
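The GWO stage above can be sketched with a minimal optimizer. This is a generic textbook-style GWO, not the paper's code, and the two-band weighting problem at the bottom is a hypothetical stand-in for "optimizing the sub-series estimated outputs".

```python
import numpy as np

def gwo(objective, dim, bounds=(0.0, 2.0), n_wolves=12, iters=60, seed=0):
    """Minimal Grey Wolf Optimizer sketch: the three best wolves (alpha, beta,
    delta) pull the pack toward promising regions; the coefficient `a` decays
    from 2 to 0 to shift from exploration to exploitation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_wolves, dim))
    best_x, best_f = X[0].copy(), objective(X[0])
    for t in range(iters):
        fit = np.array([objective(x) for x in X])
        order = np.argsort(fit)
        if fit[order[0]] < best_f:
            best_f, best_x = fit[order[0]], X[order[0]].copy()
        leaders = X[order[:3]]                 # alpha, beta, delta (a copy)
        a = 2.0 * (1 - t / iters)
        for i in range(n_wolves):
            pulls = []
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                pulls.append(leader - A * np.abs(C * leader - X[i]))
            X[i] = np.clip(np.mean(pulls, axis=0), lo, hi)
    return best_x, best_f

# Hypothetical use mirroring the final stage: rescale two sub-series forecasts
# so their weighted sum tracks the observed wind speed.
truth = np.array([5.0, 6.0, 5.5, 6.2])
sub_forecasts = np.array([[4.0, 4.8, 4.4, 5.0],   # low-frequency band
                          [0.8, 1.0, 0.9, 1.0]])  # high-frequency band
mse = lambda w: float(np.mean((w @ sub_forecasts - truth) ** 2))
best_w, best_err = gwo(mse, dim=2)
```

Because the weighting objective is a convex quadratic in two variables, the pack converges quickly; with neural sub-forecasters the objective would be evaluated on a validation split instead.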
•A decomposition-based teleconnection framework is proposed using multiple decomposition techniques and teleconnection analysis.•Relationships are quantified between teleconnection patterns and streamflow components.•The influence of nonstationary oscillation processes of the teleconnections on streamflow components is differentiated.
Hydrometeorological teleconnections play a vital role in hydrological processes, as they can reflect the influence of large-scale atmospheric circulations. Understanding and assessing such teleconnections can help reveal hydrometeorological phenomena and their associated mechanisms. Therefore, a decomposition-based teleconnection framework (DBTF) is developed to identify the hydrological processes related to the teleconnections under climate change. It consists of three modules, namely, hydrological series decomposition, teleconnection pattern decomposition, and attribution analysis. Specifically, the streamflows are separated into stationary and nonstationary components using the SSA decomposition technique. Then, the teleconnection patterns (ENSO, AO, PDO, and Sunspots) are decomposed into multi-layer signals by the EEMD decomposition technique. The multiple complex correspondences between components of the hydrological and climate variables are studied by factorial analysis, wavelet coherence transformation, the maximum information coefficient, and partial autocorrelation coefficients. Basins from the humid, semi-humid and semiarid, and arid climate zones are evaluated with the framework. The results show that: (1) the high-frequency components of the teleconnection patterns have the highest correlation with the stationary components of streamflow, while the low-frequency and trend components have the strongest association with the nonstationary components; (2) the stationary components of streamflow show more pronounced correlations with the teleconnection patterns than the nonstationary components; (3) the teleconnection patterns demonstrate a more substantial influence on the decomposed components than on the raw streamflow. The contribution is to provide new insight into the mechanisms by which hydrometeorological teleconnections affect hydrological variations.
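The SSA step used to split streamflow into components can be sketched in a few lines: embed the series in a Hankel trajectory matrix, take its SVD, and reconstruct one sub-series per leading component by anti-diagonal averaging. The window length and synthetic series below are illustrative choices, not the paper's settings.

```python
import numpy as np

def ssa_decompose(series, window, n_components):
    """Minimal singular spectrum analysis (SSA) sketch."""
    x = np.asarray(series, dtype=float)
    N, L = len(x), window
    K = N - L + 1
    traj = np.column_stack([x[i:i + L] for i in range(K)])  # L x K Hankel matrix
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    parts = []
    for k in range(n_components):
        Xk = s[k] * np.outer(U[:, k], Vt[k])
        # Diagonal averaging (Hankelization) turns the rank-1 matrix back
        # into a time series: average each anti-diagonal of Xk.
        comp = np.array([np.mean(Xk[::-1, :].diagonal(i - (L - 1)))
                         for i in range(N)])
        parts.append(comp)
    return parts

t = np.linspace(0, 1, 120)
flow = 2.0 + 1.5 * t + 0.8 * np.sin(2 * np.pi * 6 * t)  # trend + oscillation
trend, *osc = ssa_decompose(flow, window=30, n_components=3)
```

Summing all `min(L, K)` components reconstructs the series exactly; keeping only the leading few separates the slowly varying (nonstationary) part from the oscillatory (stationary) part.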
In this work, we analyse a noninteger-order model for hepatitis B (HBV) under the singular-type Caputo fractional-order derivative. We investigate the proposed system for an approximate or semi-analytical solution using the Laplace transform together with decomposition techniques, Adomian polynomials for the nonlinear terms, and the homotopy perturbation method (HPM). The obtained solutions are compared with each other and against real data by simulation in MATLAB. The graphical simulation in fractional form shows better general results compared to the integer-order simulation.
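For reference, the singular-kernel Caputo derivative of order $0 < \alpha < 1$ referred to above is the standard definition (not reproduced from this abstract):

```latex
{}^{C}\!D_t^{\alpha} f(t)
  \;=\; \frac{1}{\Gamma(1-\alpha)} \int_0^t (t-s)^{-\alpha}\, f'(s)\, \mathrm{d}s,
  \qquad 0 < \alpha < 1 .
```

The kernel $(t-s)^{-\alpha}$ blows up as $s \to t$, which is what makes this a "singular-type" operator; at $\alpha \to 1$ it recovers the ordinary first derivative used in the integer-order HBV model.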
•US achieved economic growth compatible with a reduction in carbon emissions for 2007-16.•Cobb–Douglas production function and extended Kaya equation are combined to develop a decoupling technique.•Investment effect and economy structure effect played the most important roles in increasing carbon dioxide emissions.•Energy intensity effect accelerated the decoupling, while investment effect and labor effect decelerated the decoupling.•Energy efficiency, investment patterns and labor force quality improvements should be emphasized in mitigation policies.
Energy-related carbon dioxide (CO2) emissions dropped 12% between 2007 and 2016 in the United States (U.S.), while gross domestic product (GDP) increased by 19%. This empirical decoupling of carbon emissions from economic growth in the U.S. provides a useful pilot study opportunity and serves as a good example for other countries seeking to mitigate CO2 emissions. This study identified the relationship between CO2 emissions and economic growth in the U.S. The goal was to determine whether the rigid link between the two can be changed and to identify the potential drivers of this trend. We combined the Cobb–Douglas (C-D) production function and the extended Kaya equation to develop decomposition and decoupling techniques that quantify six potential effects. The results show that the investment effect and the economic structure effect played the most important roles in increasing carbon dioxide emissions. In contrast, the energy intensity effect cut carbon emissions in most of the years studied. Strong decoupling and weak decoupling were the main states; the energy intensity effect accelerated the decoupling process, whereas the investment effect and the labor effect decelerated it in recent decades. The study concludes that emission mitigation and decoupling policies should emphasize energy efficiency, investment patterns, and improvements in labor force quality.
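A Kaya-style decomposition of an emission change can be illustrated with a two-factor LMDI split (a common technique in this literature; the paper's own method uses six effects, and the index figures below are hypothetical, only loosely mirroring the U.S. case):

```python
import math

def logmean(a, b):
    """Logarithmic mean, which lets LMDI allocate the total change exactly."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

# Hypothetical indices: GDP up 19%, emissions down 12%, as in the abstract.
gdp0, gdp1 = 100.0, 119.0              # real GDP index
co2_0, co2_1 = 100.0, 88.0             # energy-related CO2 index
i0, i1 = co2_0 / gdp0, co2_1 / gdp1    # emission intensity of GDP

L = logmean(co2_1, co2_0)
scale_effect = L * math.log(gdp1 / gdp0)   # growth pushes emissions up (~+16.3)
intensity_effect = L * math.log(i1 / i0)   # efficiency pulls them down (~-28.3)
```

The two effects sum exactly to the observed change of -12 index points, which is the additivity property that makes LMDI-type decompositions attractive for decoupling analysis.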
Central and Eastern Europe (CEE) has experienced considerable instability in mortality since the 1960s. Long periods of stagnating life expectancy were followed by rapid increases in life expectancy and, in some cases, even more rapid declines, before more recent periods of improvement. These trends have been well documented, but to date, no study has comprehensively explored trends in lifespan variation. We improved on such analyses by incorporating life disparity as a health indicator alongside life expectancy, examining trends since the 1960s for 12 countries from the region. Generally, life disparity was high and fluctuated strongly over the period. For nearly 30 of these years, life expectancy and life disparity varied independently of each other, largely because mortality trends ran in opposite directions over different ages. Furthermore, we quantified the impact of large classes of diseases on life disparity trends since 1994 using a newly harmonized cause-of-death time series for eight countries in the region. Mortality patterns in CEE countries were heterogeneous and ran counter to the common patterns observed in most developed countries. These findings contribute to the discussion about lifespan variation by showing that expansion/compression levels do not necessarily imply lower/higher life expectancy or mortality deterioration/improvement.
Learning from a multi-class problem is not an easy task for most classifiers because of multiple issues. In complex multi-class scenarios, samples of different classes overlap with each other by sharing attributes, and hence the visibility of the least represented samples decreases even more. Learning from imbalanced data has been studied extensively in the research community; however, the overlapping issue and the co-occurrence of overlapping with data imbalance have received comparatively less attention, even though their joint impact on classifiers' performance is more severe. In this paper, we introduce a modified SVM (MSVM) used as a base classifier with the AdaBoost ensemble classifier (MSVM-AdB) to enhance the learning capability of the ensemble. To implement the proposed technique, we divide the multi-class dataset into overlapping and non-overlapping regions. The overlapping region is further filtered into Critical and less Critical regions depending on their sample contribution to the overlapped region. The MSVM is designed to map the overlapped samples into a higher dimension by modifying the kernel mapping function of the standard SVM using the mean distance of the Critical-region samples. To highlight the learning enhancement of MSVM-AdB, we use 20 real datasets with varying imbalance ratios and overlapping degrees to compare the significance of MSVM-AdB against the standard SVM and AdaBoost with standard base classifiers. Experimental results show the superiority of MSVM-AdB over its standard counterpart classifiers on a collection of benchmark datasets.
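The kernel modification described above can be sketched as an RBF kernel whose bandwidth is derived from the mean pairwise distance of the Critical-region samples. The bandwidth rule and the toy data are hypothetical illustrations of the idea; the paper does not publish its exact formula here.

```python
import numpy as np

def pairwise_sq_dists(X, Y):
    """Squared Euclidean distances between the rows of X and Y."""
    return ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)

def critical_region_gamma(critical_samples):
    """Hypothetical bandwidth rule sketching the paper's idea: tie the RBF
    width to the mean pairwise distance among Critical-region samples, so
    heavily overlapped points get spread apart in feature space."""
    d = np.sqrt(pairwise_sq_dists(critical_samples, critical_samples))
    mean_dist = d[np.triu_indices_from(d, k=1)].mean()
    return 1.0 / (2.0 * mean_dist ** 2)

def modified_rbf(X, Y, gamma):
    """RBF kernel using the Critical-region-derived bandwidth."""
    return np.exp(-gamma * pairwise_sq_dists(X, Y))

rng = np.random.default_rng(3)
# Toy Critical region: samples of two classes mixed in one neighbourhood.
critical = rng.normal(0.0, 0.5, size=(20, 2))
gamma = critical_region_gamma(critical)
K = modified_rbf(critical, critical, gamma)
```

The resulting Gram matrix `K` can then be handed to any kernel SVM as the base learner inside AdaBoost.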
•The new scheme determines prediction uncertainty to consider the value of forecasting.•The signal decomposition technique reveals the stochastic characteristics of sequences.•The weight of each single model in the ensemble is determined along with its predictive ability.
Precipitation affects the generation of runoff and the concentration of water resources in basins. The randomness of precipitation contributes to the difficulty and uncertainty of forecasting it. To improve precipitation forecasting accuracy and account for this uncertainty, a new scheme for probabilistic precipitation forecasting is proposed. In the scheme, first, a signal decomposition technique (complete ensemble empirical mode decomposition with adaptive noise) is used to decompose the original precipitation series into subsequences. Second, empirical approaches (a time series analysis model, a grey self-memory model, and long short-term memory) are used to produce a quantitative precipitation forecast. Third, an ensemble model assembles the outputs of the empirical approaches, with weights determined by the Adaptive Metropolis–Markov Chain Monte Carlo algorithm (AM-MCMC). The AM-MCMC produces a large number of weight samples for the single models in the ensemble. The quantitative forecast and its confidence interval at a given probability (90%) are obtained by multiplying the single-model predictions by the mean and the confidence interval of the weights, respectively. In this study, annual precipitation (a single value per year) is adopted to test the performance of the new scheme. The precipitation of the forecast year is obtained from the precipitation of the previous p years (p is the autocorrelation order of the annual precipitation series). The results show that the new probabilistic precipitation forecasting scheme has better forecasting accuracy than the single-model predictions: the RMSE is less than 139 and the MARE is less than 8.99%. Moreover, the scheme achieves good probabilistic metrics: the CRPS ranges from 0.009 to 0.036, the reliability from 0.001 to 0.008, and the sharpness from 24 to 77.
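The weight-sampling step can be sketched as follows. As a simplification, Dirichlet draws stand in for the AM-MCMC posterior samples of the ensemble weights, and the three single-model predictions are made-up numbers; the point forecast uses the mean weights and the interval comes from the spread of the weighted combinations.

```python
import numpy as np

rng = np.random.default_rng(1)
# Predictions from three single models (time series, grey self-memory, LSTM
# in the paper); the values here are hypothetical.
single_preds = np.array([620.0, 655.0, 640.0])

# Stand-in for AM-MCMC posterior draws of the ensemble weights: symmetric
# Dirichlet samples; the real scheme draws weights via adaptive Metropolis.
weight_draws = rng.dirichlet(alpha=[5.0, 5.0, 5.0], size=5000)

# Point forecast from the mean weights, and a 90% interval from the spread
# of the weighted combinations.
point = weight_draws.mean(axis=0) @ single_preds
ensemble_draws = weight_draws @ single_preds
lo, hi = np.quantile(ensemble_draws, [0.05, 0.95])
```

Because each draw is a convex combination, both the point forecast and the interval stay inside the range spanned by the single-model predictions.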
Technological advancements in the healthcare, production, automobile, and aviation industries have shifted working styles from manual to automatic. This automation requires smart, intelligent, and safe machinery to develop an accurate and efficient brain–computer interface (BCI) system. However, developing such BCI systems requires effective processing and analysis of human physiology. Electroencephalography (EEG) is one such technique that provides a low-cost, portable, non-invasive, and safe solution for BCI systems. However, the non-stationary and nonlinear nature of EEG signals makes it difficult for experts to perform accurate subjective analyses. Hence, there is an urgent need for the development of automatic mental state detection. This paper presents the classification of three mental states using an ensemble of the tunable Q wavelet transform, the multilevel discrete wavelet transform, and the flexible analytic wavelet transform. Various features are extracted from the subbands of EEG signals during focused, unfocused, and drowsy states. Separate and fused features from the ensemble decomposition are classified using an optimized ensemble classifier. Our analysis shows that the fusion of features results in a dimensionality reduction. The proposed model obtained the highest accuracies of 92.45% and 97.8% with ten-fold cross-validation and the iterative majority voting technique, respectively. The proposed method is suitable for real-time mental state detection to improve BCI systems.
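The subband feature extraction described above can be sketched with the simplest member of the DWT family, a one-level Haar transform. The synthetic "EEG" signal and the three per-subband features are illustrative stand-ins for the richer wavelet ensemble and feature set used in the paper.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: pairwise sums give
    the low-frequency (approximation) subband, pairwise differences the
    high-frequency (detail) subband."""
    x = np.asarray(signal, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)
    detail = (even - odd) / np.sqrt(2)
    return approx, detail

def subband_features(sub):
    """Toy per-subband features (mean absolute value, energy, std), standing
    in for the larger feature set extracted per EEG subband."""
    return [np.mean(np.abs(sub)), np.sum(sub ** 2), np.std(sub)]

# Synthetic stand-in for an EEG epoch: slow rhythm plus fast oscillation.
eeg = (np.sin(np.linspace(0, 4 * np.pi, 64))
       + 0.1 * np.cos(np.linspace(0, 40 * np.pi, 64)))
a, d = haar_dwt(eeg)
features = subband_features(a) + subband_features(d)
```

Because the Haar transform is orthonormal, the subband energies sum to the signal energy, so energy-based features partition the signal's power across frequency bands.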