Physiological signals such as electroencephalography (EEG), electromyography (EMG), and electrocardiography (ECG) provide valuable clinical information but pose challenges for analysis due to their high-dimensional nature. Traditional machine learning techniques, which rely on hand-crafted features from fixed analysis windows, can lead to the loss of discriminative information. Recent studies have demonstrated the effectiveness of deep convolutional neural networks (CNNs) for robust automated feature learning from raw physiological signals. However, standard CNN architectures require two-dimensional image data as input. This has motivated research into innovative signal-to-image (STI) transformation techniques that convert one-dimensional time series into images while preserving spectral, spatial, and temporal characteristics. This paper reviews recent advances in strategies for physiological signal-to-image conversion and their applications using CNNs for automated processing tasks. A systematic analysis of EEG, EMG, and ECG signal transformation and CNN-based analysis techniques is presented, spanning diverse applications including brain-computer interfaces, seizure detection, motor control, sleep stage classification, arrhythmia detection, and more. Key insights are synthesised regarding the relative merits of different transformation approaches, CNN model architectures, training procedures, and benchmark performance. Current challenges and promising research directions at the intersection of deep learning and physiological signal processing are discussed. By providing a comprehensive overview of state-of-the-art techniques, this review aims to catalyse continued innovation in effective end-to-end systems that extract clinically relevant information from multidimensional physiological data using deep neural networks.
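A common instance of the signal-to-image transformation discussed above is converting a one-dimensional signal into a time-frequency (spectrogram) image that a standard 2-D CNN can consume. The sketch below is a minimal illustration using SciPy and is not tied to any specific study reviewed here; the sampling rate, FFT window, and output size are placeholder assumptions.

```python
import numpy as np
from scipy import signal
from scipy.ndimage import zoom

def signal_to_spectrogram_image(x, fs=256, n_fft=128, overlap=0.5, out_size=(224, 224)):
    """Turn a 1-D physiological signal into a fixed-size 2-D time-frequency image.

    fs, n_fft, overlap, and out_size are illustrative defaults, not values
    taken from any of the reviewed papers.
    """
    f, t, Sxx = signal.spectrogram(x, fs=fs, nperseg=n_fft,
                                   noverlap=int(n_fft * overlap))
    img = 10 * np.log10(Sxx + 1e-12)                           # log-power spectrogram
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)  # scale to [0, 1]
    factors = (out_size[0] / img.shape[0], out_size[1] / img.shape[1])
    return zoom(img, factors, order=1)                         # resize to the CNN input size

# Example: a 4-second synthetic signal sampled at 256 Hz
x = np.random.randn(4 * 256)
print(signal_to_spectrogram_image(x).shape)  # (224, 224)
```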
Cardiac disorders are among the leading contributors to the increasing global death rate. Reliable and efficient diagnostic procedures are imperative to minimize the risk posed by heart disorders. Computer-aided diagnosis, based on machine learning and biomedical signal analysis, has recently been adopted by researchers to accurately predict cardiac ailments. Multi-channel electrocardiogram (ECG) signals are mostly used in the scientific literature as an indicator to diagnose cardiac disorders. Recently, the pulse plethysmograph (PuPG) signal has gained attention as an emerging biosignal and promising diagnostic tool for detecting heart disorders, since its sensor is simple, low-cost, non-invasive, reliable, and easy to handle. This article proposes a computer-aided diagnosis system to detect myocardial infarction, dilated cardiomyopathy, and hypertension from PuPG signals. The raw PuPG signal is first preprocessed through empirical mode decomposition (EMD) to remove redundant and useless information content. Highly discriminative features are then extracted from the preprocessed PuPG signal through novel local spectral ternary patterns (LSTP). The extracted LSTPs are input to a variety of classification methods such as support vector machines (SVM), K-nearest neighbours, and decision trees. SVM with a cubic kernel yielded the best classification performance of 98.4% accuracy, 96.7% sensitivity, and 99.6% specificity under 10-fold cross-validation. The proposed framework was trained and tested on a self-collected PuPG signal database of heart disorders. A comparison with previous studies and other feature descriptors shows the superiority of the proposed system. This research provides better insight into the contribution of PuPG signals towards reliable detection of heart disorders through low-cost and non-invasive means.
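The EMD preprocessing step described above decomposes the signal into intrinsic mode functions (IMFs) and reconstructs it from a subset of them. The sketch below is a generic illustration using the third-party PyEMD package (EMD-signal); which IMFs to keep is a placeholder heuristic, not the selection rule used in the article.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def emd_preprocess(x, keep=slice(1, -1)):
    """Decompose a 1-D signal into IMFs and rebuild it from a subset.

    Dropping the first (highest-frequency) and last (trend) IMFs is only an
    illustrative heuristic for discarding redundant content.
    """
    imfs = EMD().emd(np.asarray(x, dtype=float))  # shape: (n_imfs, len(x))
    return imfs[keep].sum(axis=0)

# Example on a noisy synthetic pulse-like waveform
t = np.linspace(0, 4, 4 * 500)
x = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)
print(emd_preprocess(x).shape)
```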
► Prediction of chaotic time series using two hybrid CI techniques is considered. ► Single multiplicative neuron (SMN) model is used in place of conventional ANN. ► SMN model parameters are estimated using cooperative particle swarm optimization. ► Results are compared with adaptive neuro-fuzzy inference system (ANFIS). ► Both show good results but ANFIS performs better for all three benchmark datasets.
In this paper, two CI techniques, namely the single multiplicative neuron (SMN) model and the adaptive neuro-fuzzy inference system (ANFIS), have been proposed for time series prediction. A variation of particle swarm optimization (PSO) with co-operative sub-swarms, called COPSO, has been used to estimate the SMN model parameters, leading to COPSO-SMN. The prediction effectiveness of COPSO-SMN and ANFIS has been illustrated on commonly used nonlinear, non-stationary, and chaotic benchmark datasets: Mackey–Glass, Box–Jenkins, and biomedical electroencephalogram (EEG) signals. The training and test performances of both hybrid CI techniques have been compared on these datasets.
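For context, a single multiplicative neuron aggregates its inputs through a product of affine terms rather than a weighted sum, and its weights and biases are fitted by swarm optimization. The sketch below shows the SMN forward pass together with a plain (non-cooperative) PSO fit as a simplified stand-in for COPSO; the population size, iteration count, and coefficients are illustrative assumptions. In a time-series setting, each row of X would hold lagged samples scaled to [0, 1] and y the next value.

```python
import numpy as np

def smn_predict(X, w, b):
    """Single multiplicative neuron: y = sigmoid(prod_i (w_i * x_i + b_i))."""
    net = np.clip(np.prod(w * X + b, axis=-1), -60, 60)  # clip for numerical safety
    return 1.0 / (1.0 + np.exp(-net))

def fit_smn_pso(X, y, n_particles=30, iters=200, seed=0):
    """Fit SMN weights/biases with basic PSO (a simplified stand-in for COPSO)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.uniform(-1, 1, (n_particles, 2 * d))  # each particle encodes [w | b]
    vel = np.zeros_like(pos)
    mse = lambda p: np.mean((smn_predict(X, p[:d], p[d:]) - y) ** 2)
    pbest, pbest_err = pos.copy(), np.array([mse(p) for p in pos])
    gbest = pbest[pbest_err.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        err = np.array([mse(p) for p in pos])
        better = err < pbest_err
        pbest[better], pbest_err[better] = pos[better], err[better]
        gbest = pbest[pbest_err.argmin()].copy()
    return gbest[:d], gbest[d:]
```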
This paper presents the results from a one-year study of 12 patients with moderate dementia in an adult day program who played a novel whack-a-mole game-based measurement instrument for cognitive behavior and performance. The ongoing measurement of cognition and of changes associated with dementia is a challenge for healthcare providers. Measurement methods based on a tablet-based instrument are proposed. Partnership with the adult day program greatly eased recruitment: all but 1 eligible participant joined our study, compared to one in five, or lower, for previous studies with similar populations. There are three unique aspects to the design of our game: first, it has two distinct targets requiring different actions, which increases the cognitive processing required of the users; second, each level is systematically more difficult; third, it records and analyzes player performance. The results show that the patients' game performance improves over the first few weeks; this indicates that they are learning the game and retaining ability gains from week to week, suggesting that some procedural learning is still intact. Over the year, 4 participants showed cognitive decline, 4 were stable, and 3 improved based on their Mini-Mental State Exam (MMSE) score. Two measures are proposed, based on level progression within the sessions and on mole hit performance. The level progression measure identifies declining participants with one false negative (FN) and one false positive error. The mole hit performance measure identifies declining participants with one FN error. These results demonstrate the potential for the proposed instrument to provide ongoing measurement as an alternative to the repeated application of the MMSE.
Background: A muscle-computer interface is one of the newer applications of human-computer interface technologies, and specifically of the brain-computer interface. The muscle-computer interface is based on the electromyography (EMG) signal, the electrical activity of a muscle, which is used as an input for effecting several tasks. Objective: This work presents an interfacing process between a graphical user interface (GUI) and a hardware system. Using the implemented system, the researcher can easily work with the raw EMG data by analyzing the signal detected by the muscle sensor. Material and Methods: A novel virtual EMG signal control and analysis system design is proposed in this work. The system consists mainly of two parts, hardware and a software toolbox. The hardware design depends mainly on a muscle movement sensor as well as on feedback from the virtual toolbox. The virtual software design offers a relatively simple, friendly graphical user interface. It consists mainly of the input EMG signal and the output signal after applying different processing methods. Feedback from the final EMG signal after processing may help the designer arrive at the optimal hardware design. Results: The results show the performance of the proposed virtual EMG data control and analysis together with the implemented hardware design for muscle sensor movement detection. The results show promise that these interfaces may provide a new option to help the designer choose the optimal prosthesis design for severely disabled persons.
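One standard example of the "different processing methods" mentioned above is turning a raw EMG recording into a smooth activation envelope. The sketch below (band-pass filter, full-wave rectification, low-pass envelope) is a generic conditioning chain, not the specific processing implemented in the described toolbox; the sampling rate and cut-off frequencies are placeholder assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(raw, fs=1000, band=(20, 450), lp_cutoff=6):
    """Generic EMG conditioning: band-pass, full-wave rectify, low-pass envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)          # suppress motion artifacts and high-frequency noise
    rectified = np.abs(filtered)            # full-wave rectification
    b_lp, a_lp = butter(4, lp_cutoff / (fs / 2), btype="low")
    return filtfilt(b_lp, a_lp, rectified)  # smooth activation envelope

# Example on 2 seconds of synthetic "EMG" sampled at 1 kHz
envelope = emg_envelope(np.random.randn(2000))
print(envelope.shape)
```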
The Lempel-Ziv (LZ) complexity and its variants are popular metrics for characterizing biological signals. Proper interpretation of such analyses, however, has not been thoroughly addressed. In this letter, we study the effect of finite data size. We derive analytic expressions for the LZ complexity of regular and random sequences, and employ them to develop a normalization scheme. To gain further understanding, we compare the LZ complexity with the correlation entropy from chaos theory in the context of epileptic seizure detection from EEG data, and discuss the advantages of the normalized LZ complexity over the correlation entropy.
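The LZ complexity referred to here counts the number of new patterns encountered while scanning a symbol sequence; for a binary sequence of length n it is commonly normalized by n / log2(n), the asymptotic count for a random sequence. The sketch below implements a classic LZ76-style parsing on a median-binarized signal as a generic illustration; it is not the specific normalization scheme derived in the letter.

```python
import numpy as np

def lz_complexity(symbols):
    """Count the number of new patterns in a symbol sequence (LZ76-style parsing)."""
    s = ''.join(map(str, symbols))
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        # Extend the current phrase while it already occurs earlier in the sequence.
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

def normalized_lz(x):
    """Binarize a 1-D signal at its median and normalize the count by n / log2(n)."""
    b = (np.asarray(x) > np.median(x)).astype(int)
    n = len(b)
    return lz_complexity(b) * np.log2(n) / n

print(normalized_lz(np.random.randn(2000)))  # roughly 1 for a random sequence
```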
The motivation of this research is to introduce the first work on automated Chronic Obstructive Pulmonary Disease (COPD) diagnosis using deep learning, together with the first annotated dataset in this field. The primary objective and contribution of this research is the design and development of an artificial intelligence system capable of diagnosing COPD using only the heart signal (electrocardiogram, ECG) of the patient. In contrast to the traditional way of diagnosing COPD, which requires spirometer tests and a laborious workup in a hospital setting, the proposed system uses the classification capabilities of deep transfer learning and the patient's heart signal, which carries signs of COPD in itself and can be acquired from any modern smart device. Since the disease progresses slowly and conceals itself until the final stage, hospital visits for diagnosis are uncommon. Hence, the medical goal of this research is to detect COPD from a simple heart signal before it becomes incurable. Deep transfer learning frameworks, previously trained on a general image dataset, are transferred to carry out automatic diagnosis of COPD by classifying image equivalents of patients' electrocardiogram signals produced by signal-to-image transform techniques. Xception, VGG-19, InceptionResNetV2, DenseNet-121, and "trained-from-scratch" convolutional neural network architectures have been investigated for the detection of COPD, and it is demonstrated that they are able to obtain high performance rates in classifying nearly 33,000 instances using diverse training strategies. The highest classification rate was obtained by the Xception model, at 99%. This research shows that the newly introduced COPD detection approach is effective, easily applicable, and eliminates the burden of considerable effort in a hospital. It could also be put into practice and serve as a diagnostic aid for chest disease experts by providing a deeper and faster interpretation of ECG signals. The knowledge gained while identifying COPD from ECG signals may aid the early diagnosis of future diseases for which little data is currently available.
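The transfer-learning setup described above can be reproduced in outline by taking an ImageNet-pretrained backbone and attaching a new classification head for the ECG-derived images. The sketch below uses Keras' Xception as an example backbone; the input size, optimizer, frozen-backbone strategy, and directory-based data loading are placeholder assumptions rather than the paper's exact training configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_copd_classifier(input_shape=(299, 299, 3)):
    """ImageNet-pretrained Xception backbone with a new binary head (COPD vs. control)."""
    base = tf.keras.applications.Xception(include_top=False, weights="imagenet",
                                          input_shape=input_shape)
    base.trainable = False                             # feature-extraction phase
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dropout(0.3)(x)
    out = layers.Dense(1, activation="sigmoid")(x)
    model = models.Model(base.input, out)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Hypothetical usage: images produced by a signal-to-image transform,
# organized into one subfolder per class.
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "ecg_images/train", image_size=(299, 299), batch_size=32)
# model = build_copd_classifier()
# model.fit(train_ds, epochs=10)
```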
It is well known that cells in tissue display a large heterogeneity in gene expression due to differences in cell lineage origin and variation in the local environment. Traditional methods that analyze gene expression from bulk RNA extracts fail to accurately describe this heterogeneity because of their intrinsic limitations in cellular and spatial resolution. Information on histology, in the form of tissue architecture and organization, is also lost in the process. Recently, new transcriptome-wide analysis technologies have enabled the study of RNA molecules directly in tissue samples, thus maintaining spatial resolution and complementing histological information with molecular information important for the understanding of many biological processes and potentially relevant for the clinical management of cancer patients. These new methods generally comprise three levels of analysis. At the first level, biochemical techniques are used to generate signals that can be imaged by different means of fluorescence microscopy. At the second level, images are subjected to digital image processing and analysis in order to detect and identify the aforementioned signals. At the third level, the collected data are analyzed and transformed into interpretable information by statistical methods and visualization techniques relating them to each other, to spatial distribution, and to tissue morphology. In this review, we describe state-of-the-art techniques used at all three levels of analysis. Finally, we discuss future perspectives in this fast-growing field of spatially resolved transcriptomics.
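At the second level of analysis described above, a typical task is detecting diffraction-limited fluorescent spots (individual RNA signals) in microscopy images. The sketch below uses scikit-image's Laplacian-of-Gaussian blob detector as one generic example of such signal detection; the sigma range and threshold are placeholder assumptions that would need tuning to the imaging setup.

```python
import numpy as np
from skimage.feature import blob_log

def detect_rna_spots(image, min_sigma=1, max_sigma=3, threshold=0.05):
    """Detect point-like fluorescent signals in a 2-D image with a LoG blob detector."""
    img = image.astype(float)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)   # normalize to [0, 1]
    blobs = blob_log(img, min_sigma=min_sigma, max_sigma=max_sigma,
                     num_sigma=5, threshold=threshold)
    return blobs[:, :2]   # (row, col) coordinates of detected spots

# Hypothetical usage on one fluorescence channel of a tissue image:
# spots = detect_rna_spots(channel_image)
# print(len(spots), "candidate signals")
```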
Typically, data acquired through imaging techniques such as functional magnetic resonance imaging (fMRI), structural MRI (sMRI), and electroencephalography (EEG) are analyzed separately. However, fusing information from such complementary modalities promises to provide additional insight into connectivity across brain networks and into changes due to disease. We propose a data fusion scheme at the feature level, using canonical correlation analysis (CCA) to determine inter-subject covariations across modalities. As we show both with simulation results and with an application to real data, multimodal CCA (mCCA) proves to be a flexible and powerful method for discovering associations among various data types. We demonstrate the versatility of the method with application to two datasets, an fMRI-EEG and an fMRI-sMRI dataset, both collected from patients diagnosed with schizophrenia and from healthy controls. CCA results for fMRI and EEG data collected during an auditory oddball task reveal associations of the temporal and motor areas with the N2 and P3 peaks. For the application to fMRI and sMRI data collected during an auditory sensorimotor task, CCA results show an interesting joint relationship between fMRI and gray matter, with patients with schizophrenia showing more functional activity in motor areas and less activity in temporal areas, associated with less gray matter compared to healthy controls. Additionally, we compare our scheme with joint-ICA, an independent component analysis based fusion method that has proven useful for such studies, and note that the two methods provide complementary perspectives on data fusion.
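Feature-level fusion with CCA of the kind described above finds paired projections, one per modality, whose variation across subjects is maximally correlated. The sketch below applies scikit-learn's CCA to two hypothetical subject-by-feature matrices; the subject count, feature dimensions, and number of components are placeholder assumptions, not those of the reported experiments.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Hypothetical feature matrices: one row per subject, one column per feature
# (e.g., fMRI contrast-map features and EEG ERP features for the same subjects).
rng = np.random.default_rng(0)
n_subjects = 60
fmri_features = rng.standard_normal((n_subjects, 40))
eeg_features = rng.standard_normal((n_subjects, 30))

cca = CCA(n_components=5)
fmri_scores, eeg_scores = cca.fit_transform(fmri_features, eeg_features)

# Canonical correlations: correlation between the paired projections per component.
canon_corrs = [np.corrcoef(fmri_scores[:, k], eeg_scores[:, k])[0, 1]
               for k in range(5)]
print(np.round(canon_corrs, 3))
```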
Accurate detection of characteristic electrocardiogram (ECG) waves is necessary for ECG analysis and interpretation. In this paper, we distinguish four processing steps of detection algorithms: noise and artifact reduction, transformations, fiducial-mark selection of wave candidates, and decision rule. Combinations of processing steps from several detection algorithms are used to find QRS, P, and T wave peaks. In addition, we consider modification of the search-window parameter based on waveform templates extracted by clustering heart cycles. The methods are extensively evaluated on two public ECG databases containing QRS, P, and T wave peak annotations. We found that the combination of morphological mathematical filtering with Elgendi's algorithm works best for QRS detection on the MIT-BIH Arrhythmia Database (detection error rate (DER) = 0.48%, Lead I). The combination of modified Martinez's PT and wavelet transform (WT) methods gave the best results for P wave peak detection on both databases when both leads are considered (MIT-BIH Arrhythmia Database: DER = 32.13%, Lead I, DER = 42.52%, Lead II; QT Database: DER = 21.23%, Lead I, DER = 26.80%, Lead II). Waveform templates in combination with Martinez's WT obtained the best results for T wave peak detection on the QT Database (DER = 25.15%, Lead II). This paper demonstrates that combining some of the best methods proposed in the literature leads to improvements over the original methods for ECG wave detection while maintaining satisfactory computation times.
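The four processing steps distinguished above can be illustrated with a minimal Pan-Tompkins-style QRS detector. This generic sketch is not one of the algorithm combinations evaluated in the paper; the sampling rate, band edges, and decision thresholds are placeholder assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs=360):
    """Minimal Pan-Tompkins-style QRS detector illustrating the four steps:
    (1) noise reduction: 5-15 Hz band-pass,
    (2) transformation: differentiation, squaring, moving-window integration,
    (3) fiducial marks: local maxima of the integrated signal,
    (4) decision rule: amplitude threshold plus a refractory period."""
    b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    squared = np.diff(filtered) ** 2
    win = int(0.15 * fs)
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    peaks, _ = find_peaks(integrated,
                          height=0.3 * integrated.max(),   # simple amplitude rule
                          distance=int(0.25 * fs))         # refractory period
    return peaks

# Hypothetical usage on one lead of an annotated recording (e.g., MIT-BIH, fs = 360 Hz):
# r_peaks = detect_qrs(lead_i_samples, fs=360)
```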