Noise and artifacts are inherent contaminating components and are particularly present in Holter electrocardiogram (ECG) monitoring. The presence of noise is even more significant in long-term monitoring (LTM) recordings, as these are collected over several days while patients follow their daily activities; hence, strong artifact components can temporarily impair the clinical measurements obtained from LTM recordings. Traditionally, noise has been treated as a problem of undesirable component removal by means of quantitative signal metrics such as the signal-to-noise ratio (SNR), but current systems do not provide any information about the true impact of noise on the clinical evaluation of the ECG. As a first step towards an alternative to classical approaches, this work assesses ECG quality under the assumption that an ECG has good quality when it is clinically interpretable. Therefore, our hypotheses are that it is possible (a) to create a clinical severity score for the effect of noise on the ECG, (b) to characterize its consistency in terms of its temporal and statistical distribution, and (c) to use it for signal quality evaluation in LTM scenarios. For this purpose, a database of external event recorder (EER) signals is assembled and labeled from a clinical point of view for use as the gold standard of noise severity categorization. These devices are assumed to capture the signal segments most prone to noise corruption during long-term periods. The ECG noise is then characterized by comparing these clinical severity criteria with conventional quantitative metrics taken from traditional noise-removal approaches, and noise maps are proposed as a novel representation tool to support this comparison. Our results showed that none of the benchmarked quantitative noise measurement criteria provides an accurate enough estimation of the clinical severity of the noise.
A case study of long-term ECG is reported, showing the statistical and temporal correspondences and properties with respect to the EER signals used to create the clinical noise gold standard. The proposed noise maps, together with the statistical consistency of the characterization of noise clinical severity, pave the way towards forthcoming systems that provide maps of the clinical severity of noise, allowing the user to process different ECG segments with different techniques and in terms of different measured clinical parameters.
Generalization and Regularization for Inverse Cardiac Estimators Melgarejo Meseguer, Francisco Manuel; Everss Villalba, Estrella; Gutierrez-Fernandez-Calvillo, Miriam ...
IEEE Transactions on Biomedical Engineering, 10/2022, Volume 69, Issue 10
Journal Article
Peer reviewed
Open access
Electrocardiographic Imaging (ECGI) aims to estimate the intracardiac potentials noninvasively, hence allowing clinicians to better visualize and understand many arrhythmia mechanisms. Most of the estimators of epicardial potentials use a signal model based on an estimated spatial transfer matrix together with Tikhonov regularization techniques, which works especially well in simulations but can give limited accuracy on some real data. Based on the quasielectrostatic potential superposition principle, we propose a simple signal model that supports the implementation of principled out-of-sample algorithms for several of the most widely used regularization criteria in ECGI problems, hence improving the generalization capabilities of several of the current estimation methods. Experiments on simple cases (cylindrical and Gaussian shapes scrutinizing fast and slow changes, respectively) and on real data (torso tank measurements from Utah University, and animal torso and epicardium measurements from Maastricht University, both in the EDGAR public repository) show that the superposition-based out-of-sample tuning of regularization parameters promotes stabilized estimation errors of the unknown source potentials, while slightly increasing the re-estimation error on the measured data, as is natural in non-overfitted solutions. The superposition signal model can be used for designing adequate out-of-sample tuning of Tikhonov regularization techniques, and it can be taken into account when using other regularization techniques in current commercial systems and research toolboxes on ECGI.
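The out-of-sample tuning idea in this abstract can be sketched in a few lines: hold out some torso measurements and pick the Tikhonov regularization parameter that best re-predicts them, rather than the one that merely minimizes the in-sample residual. Below is a minimal numpy illustration on a synthetic transfer matrix; all sizes, names, and data are invented for the example and do not reproduce the paper's model or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small forward problem: A maps epicardial potentials x to
# torso measurements b (sizes and data are illustrative only).
n_meas, n_src = 40, 25
A = rng.standard_normal((n_meas, n_src))
x_true = np.sin(np.linspace(0, 3 * np.pi, n_src))
b = A @ x_true + 0.05 * rng.standard_normal(n_meas)

def tikhonov(A, b, lam):
    """Zero-order Tikhonov estimate: argmin ||Ax - b||^2 + lam^2 ||x||^2."""
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(A.shape[1]), A.T @ b)

# Out-of-sample tuning: hold out some measurement rows and select the
# lambda that best re-predicts them from an estimate fit on the rest.
hold = rng.choice(n_meas, size=10, replace=False)
fit = np.setdiff1d(np.arange(n_meas), hold)

lambdas = np.logspace(-4, 1, 30)
oos_err = [np.linalg.norm(A[hold] @ tikhonov(A[fit], b[fit], lam) - b[hold])
           for lam in lambdas]
lam_star = lambdas[int(np.argmin(oos_err))]
x_hat = tikhonov(A, b, lam_star)  # final estimate with the tuned lambda
```

As the abstract notes, such a non-overfitted choice of lambda typically trades a slightly larger residual on the measured data for more stable errors on the unknown source potentials.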
During the last decades, the elderly population has grown rapidly and the number of patients with chronic heart-related diseases has exploded. Many of them (such as those with congestive heart failure or some types of arrhythmias) require close medical supervision, thus imposing a heavy burden on healthcare costs in most western economies. Specifically, continuous or frequent Arterial Blood Pressure (ABP) and electrocardiogram (ECG) monitoring are important tools in the follow-up of many of these patients. In this work, we present a novel remote non-ambulatory and clinically validated heart self-monitoring system, which allows ABP and ECG monitoring to effectively identify clinically relevant arrhythmias. The system integrates digital transmission of the ECG and tensiometer measurements within a patient-comfortable support that is easy to recharge and runs multi-function software, all designed to suit elderly people. The main novelty is that both physiological variables (ABP and ECG) are simultaneously measured in an ambulatory environment, which to the best of our knowledge is not readily available in the clinical market. Different processing techniques were implemented to analyze the heart rhythm, including detection of pauses, rhythm alterations, and atrial fibrillation, hence allowing early detection of these diseases. Our results achieved clinical quality both for in-lab hardware testing and for ambulatory scenario validations. The proposed active assisted living (AAL) sensor-based system is an end-to-end multidisciplinary system, fully connected to a platform and tested by the clinical team from beginning to end.
Human activity recognition poses a significant challenge within active and assisted living (AAL) systems, relying extensively on ubiquitous environmental sensor-based acquisition devices to detect user situations in their daily living. Environmental measurement systems deployed indoors yield multiparametric data in heterogeneous formats, which presents a challenge for developing machine learning-based AAL models. We hypothesized that anomaly detection algorithms could be effectively employed to create data-driven models for monitoring home environments and that the complex multiparametric indoor measurements can often be represented by a relatively small number of latent variables generated through manifold learning (MnL) techniques. We examined both linear (principal component analysis) and nonlinear (autoencoders) techniques for generating these latent spaces and the utility of core domain detection techniques for identifying anomalies within the resulting low-dimensional manifolds. We benchmarked this approach using three publicly available data sets (hh105, Aruba, and Tulum) and one proprietary data set (Elioth) for home environmental monitoring. Our results demonstrated the following key findings: 1) nonlinear manifold estimation techniques offer significant advantages in retrieving latent variables when compared to linear techniques; 2) the quality of the reconstruction of the original multidimensional recordings serves as an acceptable indicator of the quality of the generated latent spaces; 3) domain detection identifies regions of normality consistent with typical individual activities in these spaces; and 4) the system effectively detects deviations from typical activity patterns and labels anomalies. This study lays the groundwork for further exploration of enhanced methods for extracting information from MnL data models and their application within the AAL and possibly other sectors.
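The reconstruction-error idea behind findings 2) and 4) can be illustrated compactly: learn a low-dimensional latent space from normal data, then flag samples whose reconstruction from that space is poor. The following minimal numpy sketch uses the linear (PCA) variant on synthetic stand-in data; the study's nonlinear autoencoder variant follows the same encode / decode / error pattern, and all names and dimensions here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for multiparametric home-sensor records: normal
# samples lying near a 2-D latent manifold embedded in 10 dimensions.
latent = rng.standard_normal((500, 2))
W = rng.standard_normal((2, 10))
X = latent @ W + 0.05 * rng.standard_normal((500, 10))

# Linear manifold learning via PCA on the centered data.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2]                          # top-2 principal directions

def recon_error(x):
    """Distance between a sample and its reconstruction from the latent space."""
    z = (x - mu) @ P.T              # encode to the 2-D latent space
    x_hat = z @ P + mu              # decode back to sensor space
    return np.linalg.norm(x - x_hat)

errors = np.array([recon_error(x) for x in X])
threshold = np.quantile(errors, 0.99)    # empirical region of normality

anomaly = 5.0 * rng.standard_normal(10)  # a sample far off the manifold
is_anomalous = bool(recon_error(anomaly) > threshold)
```

Samples inside the region of normality reconstruct well; off-manifold samples do not, which is what the domain detection step exploits.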
The use of sentiment analysis methods has increased in recent years across a wide range of disciplines. Despite the potential impact of the development of opinions during political elections, few studies have focused on the analysis of sentiment dynamics and their characterization from statistical and mathematical perspectives. In this paper, we apply a set of basic methods to analyze the statistical and temporal dynamics of sentiment analysis on political campaigns and assess their scope and limitations. To this end, we gathered thousands of Twitter messages mentioning political parties and their leaders posted several weeks before and after the 2019 Spanish presidential election. We then followed a twofold analysis strategy: (1) statistical characterization using indices derived from well-known temporal and information metrics and methods -including entropy, mutual information, and the Compounded Aggregated Positivity Index- allowing the estimation of changes in the density function of sentiment data; and (2) feature extraction from nonlinear intrinsic patterns in terms of manifold learning using autoencoders and stochastic embeddings. The results show that both the indices and the manifold features provide an informative characterization of the sentiment dynamics throughout the election period. We found measurable variations in sentiment behavior and polarity across the political parties and their leaders and observed different dynamics depending on the parties' positions on the political spectrum, their presence at the regional or national levels, and their nationalist or globalist aspirations.
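Of the information metrics mentioned, Shannon entropy is the simplest to sketch: estimated from a normalized histogram of sentiment scores, it is low for polarized score distributions and high for diffuse ones, which is how it tracks changes in the density function of sentiment data. A minimal numpy illustration on synthetic scores follows (the Compounded Aggregated Positivity Index and the manifold features are not reproduced here).

```python
import numpy as np

def shannon_entropy(scores, bins=20):
    """Shannon entropy (bits) of a sample of sentiment scores in [-1, 1],
    estimated from a normalized histogram."""
    counts, _ = np.histogram(scores, bins=bins, range=(-1.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]                     # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
# Illustrative daily samples: a polarized day (scores pile up at the
# extremes) versus a diffuse day (scores spread over the whole range).
polarized = np.concatenate([rng.normal(-0.8, 0.05, 500),
                            rng.normal(0.8, 0.05, 500)])
diffuse = rng.uniform(-1.0, 1.0, 1000)

h_polarized = shannon_entropy(polarized)
h_diffuse = shannon_entropy(diffuse)   # higher: the distribution is flatter
```

Tracking such an index day by day over a campaign gives exactly the kind of temporal dynamics the paper characterizes.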
Hypertrophic cardiomyopathy, according to its prevalence, is a comparatively common disease related to the risk of suffering sudden cardiac death, heart failure and stroke. This illness is characterized by the excessive deposition of collagen among healthy myocardium cells. This situation, medically known as fibrosis, creates effective conduction obstacles in the myocardial electrical path, and when severe enough, it can appear as additional peaks or notches in the QRS, clinically termed fragmentation. Nowadays, fragmentation detection is performed by visual inspection, but the fragmented QRS can be confused with the noise present in the electrocardiogram (ECG). On the other hand, fibrosis detection is performed by magnetic resonance imaging with late gadolinium enhancement, the main drawback of this technique being its cost in terms of time and money. In this work, we propose two automatic algorithms, one for fragmented QRS detection and another for fibrosis detection. For this purpose, we used four different databases, including the surrogate database described in the companion paper and three additional ones: one composed of more accurate surrogate ECG signals and two composed of real, affected subjects as labeled by expert clinicians. The first real-world database contains QRS-fragmented records and the second one contains records with fibrosis, both recorded at Hospital Clínico Universitario Virgen de la Arrixaca (Spain). To analyze the scope of these datasets in depth, we benchmarked several classifiers, namely Neural Networks, Support Vector Machines (SVM), Decision Trees and Gaussian Naïve Bayes (NB). For the fragmentation dataset, the best results were 0.94 sensitivity, 0.88 specificity, 0.89 positive predictive value, 0.93 negative predictive value and 0.91 accuracy when using an SVM with Gaussian kernel.
For the fibrosis databases, more limited accuracy was reached, with 0.47 sensitivity, 0.91 specificity, 0.82 positive predictive value, 0.66 negative predictive value and 0.70 accuracy when using Gaussian NB. Nevertheless, this is the first time that fibrosis detection has been attempted automatically from ECG postprocessing, paving the way towards improved algorithms and methods. Therefore, we can conclude that the proposed techniques could offer clinicians a valuable tool to support both fragmentation and fibrosis diagnoses.
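The five figures of merit quoted in these abstracts all derive from the binary confusion matrix. A short, self-contained sketch of how they are computed (the labels below are toy values, not the study's data):

```python
import numpy as np

def diagnostic_metrics(y_true, y_pred):
    """Sensitivity, specificity, PPV, NPV and accuracy from binary labels
    (1 = abnormal, e.g. fragmented or fibrotic; 0 = normal)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "sensitivity": tp / (tp + fn),   # recall on the abnormal class
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / y_true.size,
    }

# Toy check with a deliberately imperfect classifier:
m = diagnostic_metrics([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1])
# every metric here equals 2/3
```

The low fibrosis sensitivity (0.47) paired with high specificity (0.91) reported above means the Gaussian NB classifier misses many fibrotic cases but rarely raises false alarms, which is visible at a glance from these definitions.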
Despite the wide literature on R-wave detection algorithms for ECG Holter recordings, long-term monitoring applications are bringing new requirements, and it is not clear that the existing methods can be straightforwardly used in those scenarios. Our aim in this work was twofold: first, we scrutinized the scope and limitations of existing methods for Holter monitoring when moving to long-term monitoring; second, we proposed and benchmarked a beat detection method with adequate accuracy and usefulness in long-term scenarios. A longitudinal study was made with the most widely used waveform analysis algorithms, which allowed us to tune the free parameters of the required blocks, and a transversal study analyzed how these parameters change when moving to different databases. With all the above, the extension to long-term monitoring was proposed and analyzed on a database of 7-day Holter recordings, using optimized simultaneous multilead processing. We considered both proprietary and public databases. In this new scenario, noise-avoidance mechanisms are more important because of the amount of noise present in these recordings; moreover, computational efficiency is a key parameter for exporting the algorithm to clinical practice. The method based on a Polling function outperformed the others in terms of accuracy and computational efficiency, yielding 99.48% sensitivity, 99.54% specificity, 99.69% positive predictive value, 99.46% accuracy, and 0.85% error for the MIT-BIH arrhythmia database. We conclude that the method can be used in long-term Holter monitoring systems.
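As a toy illustration of the beat-detection problem only (this is not the Polling-function method evaluated in the paper, which adds the noise-avoidance and multilead mechanisms discussed above), a naive amplitude-threshold R-peak detector with a refractory period can be written as follows; all signal parameters below are invented for the example.

```python
import numpy as np

def detect_r_peaks(ecg, fs, refractory_s=0.25):
    """Naive R-peak detector: local maxima above an amplitude threshold,
    separated by at least a refractory period. Illustrative only."""
    threshold = 0.6 * np.max(ecg)
    min_gap = int(refractory_s * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > threshold and ecg[i] >= ecg[i - 1]
                and ecg[i] > ecg[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i
    return np.array(peaks)

# Synthetic ECG-like trace: 10 s at 250 Hz, one narrow spike per second
# (60 bpm) on top of mild baseline noise.
fs = 250
rng = np.random.default_rng(3)
ecg = 0.05 * rng.standard_normal(10 * fs)
for beat_s in np.arange(0.5, 10.0, 1.0):
    ecg[int(beat_s * fs)] += 1.0

peaks = detect_r_peaks(ecg, fs)   # expect 10 peaks, 250 samples apart
```

On a clean synthetic trace this suffices; on 7-day recordings, the noise and artifact levels the abstract describes are precisely what breaks such fixed-threshold schemes.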
Recent research has proven the existence of a statistical relation between fragmented QRS and several highly prevalent diseases, such as cardiac sarcoidosis, acute coronary syndrome, arrhythmogenic cardiomyopathies, Brugada syndrome, and hypertrophic cardiomyopathy. One out of five hundred people suffers from hypertrophic cardiomyopathy. The relation between fragmentation and arrhythmias drives the objective of this work, which is to propose a valid method for QRS fragmentation detection. With that aim, we followed a two-stage approach. First, we identified the features that best characterize the fragmentation by analyzing the physiological interpretation of multivariate approaches, such as principal component analysis (PCA) and independent component analysis (ICA). Second, we created an invariant transformation method for the multilead electrocardiogram (ECG) by scrutinizing the statistical distributions of the PCA eigenvectors and of the ICA transformation arrays, in order to anchor the desired elements to suitable leads in the feature space. A complete database was compiled, incorporating real fragmented ECGs, surrogate registers created by synthetically adding fragmented activity to real non-fragmented ECG registers, and standard clean ECGs. Results showed that the creation of beat templates together with the application of PCA over eight independent leads achieves a 0.995 fragmentation enhancement ratio and a 0.07 dispersion coefficient. In the case of ICA over twelve leads, the results were a 0.995 fragmentation enhancement ratio and a 0.70 dispersion coefficient. We conclude that the algorithm presented in this work constructs a new paradigm, by creating a systematic and powerful tool for clinical anamnesis and evaluation based on the multilead ECG.
This approach consistently consolidates the inconspicuous elements present in multiple leads onto designated variables in the output space, hence offering additional and valid visual and non-visual information to the standard clinical review, and opening the door to more accurate automatic detection and a statistically valid systematic approach for a wide range of applications. In this direction, the companion paper presents further developments applying this technique to fragmentation detection.
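The lead-space PCA step described above can be sketched as follows: compute the lead-by-lead covariance of the averaged beat templates, diagonalize it, and sign-anchor the eigenvectors so the transformation is reproducible across recordings. The anchoring rule and the data below are simple hypothetical stand-ins for the invariant transformation the abstract describes, not the paper's actual method.

```python
import numpy as np

def pca_leads(templates):
    """PCA across the leads of averaged beat templates.
    templates: (n_leads, n_samples), one averaged beat per lead.
    Returns the principal components (same shape) and the lead-space
    eigenvectors, sign-anchored so each eigenvector's largest-magnitude
    coefficient is positive (illustrative anchoring rule)."""
    X = templates - templates.mean(axis=1, keepdims=True)
    C = X @ X.T / X.shape[1]             # lead-by-lead covariance
    w, V = np.linalg.eigh(C)
    V = V[:, np.argsort(w)[::-1]]        # sort by descending variance
    for k in range(V.shape[1]):          # deterministic eigenvector signs
        if V[np.argmax(np.abs(V[:, k])), k] < 0:
            V[:, k] = -V[:, k]
    return V.T @ X, V

# Toy 8-lead beat: a smooth QRS shared by all leads, plus a small
# high-frequency "fragmentation-like" notch mixed into each lead.
t = np.linspace(-1.0, 1.0, 200)
qrs = np.exp(-(t / 0.15) ** 2)
notch = 0.1 * np.sin(40 * np.pi * t) * (np.abs(t) < 0.2)
rng = np.random.default_rng(4)
leads = (np.outer(rng.uniform(0.5, 1.5, 8), qrs)
         + np.outer(rng.uniform(-1.0, 1.0, 8), notch))

components, V = pca_leads(leads)   # components[0] carries most variance
```

Concentrating the shared QRS morphology into the leading components is what lets the low-amplitude fragmented activity stand out in the remaining ones.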
Machine learning techniques, more commonly known today as artificial intelligence, are playing an increasingly important role in all aspects of our lives ...
In atrial fibrillation (AF) ablation procedures, it is desirable to know whether a proper disconnection of the pulmonary veins (PVs) was achieved. We hypothesize that information about their isolation could be provided by analyzing changes in the P-wave after ablation. Thus, we present a method to detect PV disconnection using P-wave signal analysis.
Conventional P-wave feature extraction was compared to an automatic feature extraction procedure based on creating low-dimensional latent spaces for cardiac signals with the Uniform Manifold Approximation and Projection (UMAP) method. A database of patients (19 controls and 16 AF individuals who underwent a PV ablation procedure) was collected. Standard 12-lead ECG was recorded, and P-waves were segmented and averaged to extract conventional features (duration, amplitude, and area) and their manifold representations provided by UMAP on a 3-dimensional latent space. A virtual patient was used to validate these results further and study the spatial distribution of the extracted characteristics over the whole torso surface.
Both methods showed differences between P-waves before and after ablation. Conventional methods were more prone to noise, P-wave delineation errors, and inter-patient variability. P-wave differences were observed in the standard lead recordings; however, larger differences appeared in the torso region over the precordial leads. Recordings near the left scapula also yielded noticeable differences.
P-wave analysis based on UMAP parameters detects PV disconnection after ablation in AF patients and is more robust than heuristic parameterization. Moreover, additional leads beyond the standard 12-lead ECG should be used to better detect PV isolation and possible future reconnections.
• Changes in P-waves of ECG signals are observed after pulmonary vein ablation.
• P-wave analysis based on the UMAP method is more robust than heuristic parameterization.
• A plurality of signals from the standard 12-lead ECG detects pulmonary vein isolation.
• Leads different from the standard 12-lead ECG improve the detection of P-wave changes.