Biosignal measurement and processing are increasingly being deployed in ambulatory situations, particularly in connected health applications. Such environments dramatically increase the likelihood of artifacts, which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, a considerable range of methods currently exists that can suppress or, in some cases, remove the distorting effect of such artifacts. Considerably fewer techniques are available when only a single-channel measurement exists, yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique, known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal, and the CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques on both electroencephalography and functional near-infrared spectroscopy data and is shown to produce significantly improved results.
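As a rough illustration of the CCA stage described in this abstract, the sketch below (NumPy only; the function name and interface are hypothetical) takes a multidimensional decomposition, standing in for the IMFs an EEMD step would produce, and applies CCA between the components and a one-sample-delayed copy. This ranks the extracted sources by autocorrelation, a second-order statistic; low-autocorrelation sources are treated as artifact and discarded before back-projection:

```python
import numpy as np

def cca_denoise(components, keep):
    """CCA-based artifact separation on a multidimensional decomposition.

    components: (n, T) array, e.g. IMFs from an EEMD step (assumed given here).
    keep: number of most-autocorrelated sources to retain as signal.
    """
    X = components - components.mean(axis=1, keepdims=True)
    Y = np.roll(X, 1, axis=1)                 # one-sample-delayed copy
    n, T = X.shape
    Cxx = X @ X.T / T
    Cyy = Y @ Y.T / T
    Cxy = X @ Y.T / T
    # Solve the CCA generalized eigenproblem via whitening of Cxx
    L = np.linalg.cholesky(Cxx + 1e-10 * np.eye(n))
    Li = np.linalg.inv(L)
    M = Li @ Cxy @ np.linalg.solve(Cyy + 1e-10 * np.eye(n), Cxy.T) @ Li.T
    vals, vecs = np.linalg.eigh(M)            # squared canonical correlations
    order = np.argsort(vals)[::-1]
    W = (Li.T @ vecs)[:, order].T             # unmixing matrix, sources sorted
    S = W @ X                                 # sources, high autocorrelation first
    S[keep:, :] = 0.0                         # zero low-autocorrelation (artifact) sources
    return np.linalg.pinv(W) @ S              # back-project retained sources
```

On a toy decomposition of two sinusoids plus two white-noise components, keeping the top two sources recovers the oscillatory part; the actual method's EEMD decomposition and component selection rule are more involved than this sketch.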
Intricate agricultural ecosystems markedly influence the dynamics of organic micropollutants, posing substantial threats to aquatic organisms and human health. This study examined the occurrence and distribution of organic micropollutants across soils, ditch sediment, and water within highly intensified farming setups. Using a non-targeted screening method, we identified 405 micropollutants across 10 sampling sites, mainly pesticides, pharmaceuticals, industrial chemicals, and personal care products. This inventory comprised emerging contaminants, banned pesticides, and controlled pharmaceuticals that had eluded detection via conventional monitoring. Targeted analysis showed combined concentrations (Σ40 pesticides, Σ8 pharmaceuticals, and Σ3 industrial chemicals) of 3.99–1021 ng/g in soils, 4.67–2488 ng/g in sediment, and 12.5–9373 ng/L in water, indicating notable spatial variability. Soil organic carbon content and wastewater discharge were likely responsible for this spatial distribution. Principal component analysis and correlation analysis revealed a potential transfer of micropollutants across the three media. In particular, a heightened correlation was discerned between soil and sediment micropollutant levels, highlighting the role of sorption processes. Risk quotients surpassed the threshold of 1 for 13–23 micropollutants across the three media, indicating high environmental risks. This study highlights the importance of employing non-targeted and targeted screening in assessing and managing environmental risks associated with micropollutants.
•Non-targeted analysis uncovered 405 micropollutants in agricultural systems.
•Emerging contaminants and controlled substances were detected in the three media.
•Micropollutant concentrations exhibited significant spatial variations.
•A profound transfer of micropollutants was observed across the three media.
•Some micropollutants exhibited high risks within agricultural environments.
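The risk-quotient screening mentioned above follows the standard RQ = MEC / PNEC convention (measured environmental concentration over predicted no-effect concentration), with RQ > 1 flagging high risk. A minimal sketch, using hypothetical concentrations that are not taken from the study:

```python
def risk_quotients(mec, pnec):
    """Risk quotient RQ = MEC / PNEC for each compound.

    mec: measured environmental concentrations (name -> value)
    pnec: predicted no-effect concentrations, in the same units.
    RQ > 1 indicates high environmental risk.
    """
    return {c: mec[c] / pnec[c] for c in mec}

# Illustrative (hypothetical) concentrations in ng/L -- not from the study.
mec = {"atrazine": 120.0, "carbamazepine": 35.0, "bisphenol A": 800.0}
pnec = {"atrazine": 600.0, "carbamazepine": 50.0, "bisphenol A": 150.0}
rq = risk_quotients(mec, pnec)
high_risk = [c for c, q in rq.items() if q > 1]
```

In the study itself, 13–23 compounds per medium exceeded the RQ = 1 threshold under this type of screening.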
In this paper, we first study a generalized canonical correlation analysis (CCA)-based fault detection (FD) method that aims to maximize fault detectability under an acceptable false alarm rate. More specifically, two residual signals are generated for detecting faults in the input and output subspaces, respectively. The minimum covariances of the two residual signals are achieved by taking the correlation between input and output into account. Because the Gaussian assumption on the process noises limits the application scope of the generalized CCA, an FD technique combining the generalized CCA with threshold setting based on a randomized algorithm is proposed and applied to a simulated traction drive control system of high-speed trains. The results show that the proposed method significantly improves detection performance in comparison with the standard generalized CCA-based FD method.
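The residual-plus-randomized-threshold idea can be sketched as follows. This toy substitutes a least-squares regression for the CCA-based input–output relation and uses the empirical (1 − α) quantile of the fault-free test statistic as the threshold, which is the spirit (not the exact form) of the randomized-algorithm approach; all names and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fault-free training data: y depends linearly on x plus non-idealized noise
# (a toy stand-in for the input/output relation a generalized CCA would model).
x = rng.standard_normal((2000, 3))
B = np.array([[1.0, 0.5], [-0.3, 0.8], [0.2, -0.6]])
y = x @ B + 0.1 * rng.standard_normal((2000, 2))

K, *_ = np.linalg.lstsq(x, y, rcond=None)    # regression surrogate for the CCA relation
r_train = y - x @ K                          # residual signal
Sinv = np.linalg.inv(np.cov(r_train.T))
stat_train = np.einsum('ij,jk,ik->i', r_train, Sinv, r_train)  # T^2-type statistic

# Randomized threshold setting: the empirical (1 - alpha) quantile of the
# fault-free statistic caps the false alarm rate at alpha, with no Gaussian
# assumption on the noise distribution.
alpha = 0.01
J_th = np.sort(stat_train)[int(np.ceil((1 - alpha) * len(stat_train))) - 1]

# A sample with an additive output fault should exceed the threshold.
x_f = rng.standard_normal(3)
y_f = x_f @ K + np.array([2.0, 0.0])         # injected output bias fault
r_f = y_f - x_f @ K
stat_f = r_f @ Sinv @ r_f
```

The paper's method additionally shapes the two residuals (input- and output-side) to have minimum covariance via the canonical correlations; that refinement is omitted here.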
Electricity theft has long been one of the major problems faced by power supply enterprises. To improve the robustness and accuracy of power theft detection, this article explores the fusion of multi-source heterogeneous time series features and designs a gated recurrent unit (GRU) network model adapted to this feature fusion. First, correlation analysis verifies that there is a logical correlation between the different time series features and the classification features, providing a theoretical basis for feature fusion. Then, an encoder-decoder framework with an attention mechanism is constructed to achieve effective fusion and state detection of users' multi-source time series features. The experimental results show that, compared with a single data source, fusing multi-source features significantly improves detection performance, and the designed model outperforms the baseline models. This study provides a reference for constructing an efficient power theft detection system and offers an example of multi-source heterogeneous feature fusion for related fields.
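The attention-based fusion step can be illustrated with a deliberately simplified toy: each data source contributes one feature vector, and the sources are weighted by softmax similarity to a shared context vector. Using the mean as the context is an assumption of this sketch; in the paper's model the attention weights are learned inside the GRU encoder-decoder:

```python
import numpy as np

def attention_fuse(features):
    """Toy attention-based fusion of multi-source feature vectors.

    features: list of 1-D arrays, one per data source, all the same length.
    Each source is weighted by softmax similarity to the mean context vector
    (a fixed stand-in for learned attention), then the sources are combined.
    """
    F = np.stack(features)                     # (sources, d)
    context = F.mean(axis=0)                   # shared query vector (assumption)
    scores = F @ context / np.sqrt(F.shape[1]) # scaled dot-product scores
    w = np.exp(scores - scores.max())          # numerically stable softmax
    w /= w.sum()
    return w @ F, w                            # fused vector, attention weights
```

The fused vector would then feed the downstream classifier that flags abnormal consumption patterns.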
Fault detection based on canonical correlation analysis (CCA) has received increased attention due to its efficiency in exploring the relationship between input and output. However, traditional CCA may generate redundant features in both the input and output projections while maximizing the correlations. In this paper, sparse dynamic canonical correlation analysis (SDCCA) is developed for the fault detection of dynamic processes. By imposing sparsity in the feature extraction, the interpretability of the canonical variates is enhanced owing to the sparsity of the canonical weights. Based on the SDCCA model, a T2 monitoring metric is established for fault detection. Moreover, the upper control limit (UCL) of the T2 metric is determined by the kernel density estimation (KDE) method to avoid violating the Gaussian assumption. The superiority of the proposed SDCCA-based fault detection method is illustrated through a comparative study on the Tennessee Eastman process benchmark.
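The KDE-based UCL mentioned above can be sketched independently of the SDCCA model: given fault-free T2 scores, estimate their density with a Gaussian kernel and take the point where the estimated CDF reaches 1 − α as the control limit, with no Gaussian assumption on the scores. The bandwidth rule and grid resolution below are common defaults, not the paper's settings:

```python
import numpy as np

def kde_ucl(t2_values, alpha=0.01, grid=1000):
    """Upper control limit from a Gaussian KDE of fault-free T^2 scores.

    Returns the point where the KDE-estimated CDF reaches 1 - alpha,
    avoiding any Gaussian assumption on the monitoring statistic.
    """
    x = np.asarray(t2_values, dtype=float)
    h = 1.06 * x.std() * len(x) ** (-0.2)       # Silverman's rule of thumb
    g = np.linspace(x.min() - 3 * h, x.max() + 3 * h, grid)
    # KDE evaluated on the grid (unnormalized; only the CDF quantile matters)
    pdf = np.exp(-0.5 * ((g[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
    pdf /= pdf.sum()
    cdf = np.cumsum(pdf)
    return g[np.searchsorted(cdf, 1.0 - alpha)]
```

A new sample is then flagged as faulty whenever its T2 score exceeds this UCL.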
Generalized Canonical Correlation Analysis (GCCA) is an important tool that finds numerous applications in data mining, machine learning, and artificial intelligence. It aims at finding 'common' random variables that are strongly correlated across multiple feature representations (views) of the same set of entities. CCA, and to a lesser extent GCCA, have been studied from the statistical and algorithmic points of view, but not as much from the standpoint of linear algebra. This paper offers a fresh algebraic perspective on GCCA based on a (bi-)linear generative model that naturally captures its essence. It is shown that, from a linear algebra point of view, GCCA is tantamount to subspace intersection, and conditions under which the common subspace of the different views is identifiable are provided. A novel GCCA algorithm based on subspace intersection is proposed, which scales up to handle large GCCA tasks. Synthetic as well as real data experiments showcase the effectiveness of the proposed approach.
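The subspace-intersection view can be made concrete with a small sketch: compute an orthonormal basis for each view's column space, sum the orthogonal projectors, and keep the eigenvectors whose eigenvalue equals the number of views; those directions lie in every range, i.e., in the common subspace. This is the textbook projector construction, not necessarily the scalable algorithm the paper proposes:

```python
import numpy as np

def common_subspace(views, tol=1e-8):
    """Intersection of the column spaces of several view matrices.

    Sums the orthogonal projectors onto each range; eigenvectors of the sum
    with eigenvalue (numerically) equal to the number of views K lie in all
    ranges simultaneously, hence span the common subspace.
    """
    K = len(views)
    dim = views[0].shape[0]
    P = np.zeros((dim, dim))
    for X in views:
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        U = U[:, s > tol * s[0]]          # orthonormal basis of range(X)
        P += U @ U.T                      # accumulate projectors
    vals, vecs = np.linalg.eigh(P)
    return vecs[:, vals > K - 1e-6]       # eigenvalue ~ K => in every range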