With the increasing popularity of Industry 4.0, industrial big data (IBD) has become a hotly discussed topic in the digital and intelligent industry field. The security problems arising in signal processing on large-scale data streams remain a challenging issue in the industrial Internet of Things, especially for high-dimensional anomaly detection in intelligent industrial applications. In this article, to mitigate the inconsistency between dimensionality reduction and feature retention in imbalanced IBD, we propose a variational long short-term memory (VLSTM) learning model for intelligent anomaly detection based on reconstructed feature representation. An encoder-decoder neural network combined with a variational reparameterization scheme is designed to learn a low-dimensional feature representation from high-dimensional raw data. Three loss functions are defined and quantified to constrain the reconstructed hidden variable into a more explicit and meaningful form. A lightweight estimation network is then fed with the refined feature representation to identify anomalies in IBD. Experiments on a public IBD dataset named UNSW-NB15 demonstrate that the proposed VLSTM model can efficiently cope with imbalance and high dimensionality, significantly improving accuracy and reducing the false alarm rate in anomaly detection for IBD as measured by F1 score, area under the curve (AUC), and false alarm rate (FAR).
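The variational reparameterization scheme mentioned in the abstract can be illustrated with a minimal sketch. This is a generic numpy illustration of the trick as used in variational autoencoders, not the authors' VLSTM implementation; the latent dimension and values are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Draw z = mu + sigma * eps with eps ~ N(0, I).
    The randomness is isolated in eps, so during training
    gradients can flow through mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Toy 4-dimensional latent code for one sample (values assumed).
mu = np.zeros(4)
log_var = np.zeros(4)          # log_var = 0 means sigma = 1
z = reparameterize(mu, log_var)
print(z.shape)                 # (4,)
```

The decoder would then reconstruct the input from `z`, and the three loss terms described above would constrain `mu`, `log_var`, and the reconstruction.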
Previous studies have shown that, by minimizing the total variation (TV) of the to-be-estimated image subject to data and other constraints, piecewise-smooth x-ray computed tomography (CT) images can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise-constant assumption on the image, a conventional TV minimization algorithm often suffers from over-smoothing at the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and are adaptively adjusted by the local image-intensity gradient to preserve edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for sparse-view, low-dose CT image reconstruction. To evaluate the AwTV-POCS algorithm, both qualitative and quantitative studies were performed through computer simulations and phantom experiments. The results show that the AwTV-POCS algorithm yields images with several notable gains, in terms of noise-resolution tradeoff plots and full-width-at-half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
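The abstract describes the adaptive weights only qualitatively. A commonly cited discrete form consistent with that description (for a 2D image with a user-chosen scale parameter $\delta$; the exact form used in the paper may differ) is:

```latex
\mathrm{AwTV}(f) = \sum_{i,j} \sqrt{\, w^{h}_{i,j}\,(f_{i,j}-f_{i-1,j})^{2}
                                 + w^{v}_{i,j}\,(f_{i,j}-f_{i,j-1})^{2}\,},
\qquad
w^{h}_{i,j} = \exp\!\left[-\left(\tfrac{f_{i,j}-f_{i-1,j}}{\delta}\right)^{2}\right],
\quad
w^{v}_{i,j} = \exp\!\left[-\left(\tfrac{f_{i,j}-f_{i,j-1}}{\delta}\right)^{2}\right]
```

Because the exponential weights decay toward zero across large intensity gradients, strong edges are penalized less than in conventional TV, which is the mechanism behind the edge preservation claimed above.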
Osteoporosis is a progressive skeletal disease characterized by decreased bone mass and degraded bone microstructure, which lead to increased bone fragility and risk of fracture. Osteoporosis is generally age related and has become a major disease worldwide. Uncovering the molecular mechanisms underlying osteoporosis and developing effective prevention and therapy methods have great significance for human health. Mesenchymal stem cells (MSCs) are multipotent cells capable of differentiating into osteoblasts, adipocytes, or chondrocytes, and have become a favored source for cell-based therapy. Evidence shows that during osteoporosis a shift of MSC differentiation toward adipocytes rather than osteoblasts partly contributes to the disease. Thus, uncovering the molecular mechanisms of osteoblast and adipocyte differentiation of MSCs will provide a better understanding of MSCs and perhaps new methods of osteoporosis treatment. MSCs have been applied in both preclinical and clinical studies of osteoporosis treatment. Here, we review recent advances in understanding the molecular mechanisms regulating osteoblast and adipocyte differentiation of MSCs and highlight therapeutic applications of MSCs in osteoporosis treatment. This review will provide researchers with new insights into the development and treatment of osteoporosis.
Modern Internet-of-Things (IoT) applications are heavily data driven and often require reliable data streams to achieve high-quality data mining. The concept of edge computing was introduced to reduce data latency and communication bandwidth between the cloud server and IoT edge devices. However, inefficient routing, which may cause transmission failures or unnecessary data (re)transmissions, is still a key obstacle to obtaining good and reliable data mining results. In this article, network coding combined with opportunistic routing is used to improve energy efficiency in wireless IoT infrastructure, taking into account the existence of link correlation. Studies have shown that packet receptions on wireless links are correlated, which is contrary to the link-independence assumption used in existing routing mechanisms. This assumption causes estimation errors in the calculation of the expected number of transmissions for forwarders, which in turn affects the selection of the forwarder set and ultimately the performance of the protocol. We propose an intrasession network coding mechanism based on mining link correlation. A novel smart routing method is proposed to accurately estimate the number of transmissions required by forwarders, together with an algorithm for selecting a forwarder set with a near-optimal number of transmissions. Simulation results demonstrate that the proposed mechanism achieves fewer transmissions and offers more energy-efficient communication for wireless edge IoT applications.
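The estimation error caused by the link-independence assumption can be shown with a toy two-forwarder example. This is an illustration of the general effect, not the paper's estimator; all probabilities here are assumed values.

```python
# Two forwarders with identical per-link delivery probabilities.
p1, p2 = 0.6, 0.6

# Independence assumption used by classical opportunistic routing:
# P(at least one forwarder receives) = 1 - P(both lose).
p_any_indep = 1 - (1 - p1) * (1 - p2)          # 0.84

# Positively correlated losses: both links tend to fail together,
# so the joint loss probability exceeds (1-p1)*(1-p2) = 0.16.
p_both_lose_corr = 0.3                          # assumed for illustration
p_any_corr = 1 - p_both_lose_corr               # 0.70

# Expected transmissions until at least one forwarder receives.
etx_indep = 1 / p_any_indep                     # ~1.19
etx_corr = 1 / p_any_corr                       # ~1.43

print(etx_indep < etx_corr)  # True: independence underestimates cost
```

Because the independence-based estimate is optimistic, a forwarder set chosen with it can cost more transmissions than predicted, which motivates the correlation-aware estimation described above.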
Owing to its pervasive usability, smartphone-based human activity recognition (HAR) has recently witnessed significant development in smart health. Meanwhile, deep recurrent neural networks (DRNNs) show a strong ability to automatically extract features from time-series data, and DRNN-based HAR schemes have therefore achieved more effective recognition (i.e., higher recognition accuracy) than those adopting traditional machine learning. However, efficiency in training and recognition (in terms of running time) has not been fully taken into account, especially for resource-constrained smartphones. To solve this issue, we propose the PSDRNN and tri-PSDRNN schemes, which employ explicit feature extraction before the DRNN. Specifically, considering that the power spectral density (PSD) feature can capture the frequency characteristics while retaining the successive time characteristics of data gathered from the smartphone accelerometer, PSD feature vectors are extracted from linear accelerations and triaxial accelerations, respectively, and explicitly used as the input to the following DRNN classification model. Thorough experiments on a real dataset demonstrate that PSDRNN achieves effectiveness comparable to xyz-DRNN (the most accurate DRNN-based HAR scheme using only acceleration data), while the average recognition and training times are reduced by 56% and 80%, respectively. Moreover, tri-PSDRNN outperforms xyz-DRNN in recognition accuracy while its running time remains lower. In addition, the proposed PSDRNN scheme achieves superior recognition of complex transition activities.
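A PSD feature vector for one accelerometer window can be sketched with a simple periodogram. This is a generic illustration, not the paper's exact feature pipeline; the sampling rate and window length are assumed values typical of smartphone HAR datasets.

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 50.0                           # assumed sampling rate (Hz)
window = rng.standard_normal(128)   # one fixed-length acceleration window

# Periodogram-style PSD estimate: squared magnitude of the one-sided
# FFT, normalized by sampling rate and window length.
spectrum = np.fft.rfft(window)
psd = (np.abs(spectrum) ** 2) / (fs * window.size)
psd[1:-1] *= 2   # fold negative frequencies into the one-sided estimate

print(psd.shape)  # 128 // 2 + 1 = 65 frequency bins
```

The resulting 65-dimensional vector (rather than the 128 raw samples) would then feed the DRNN classifier, which is where the training and recognition time savings come from.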
Statistical image reconstruction (SIR) methods have shown potential to substantially improve the image quality of low-dose x-ray computed tomography (CT) as compared to the conventional filtered back-projection (FBP) method. Following the maximum a posteriori (MAP) estimation, SIR methods are typically formulated through an objective function consisting of two terms: (a) a data-fidelity term that models the imaging geometry and physical detection processes in projection data acquisition, and (b) a regularization term that reflects prior knowledge or expectations about the characteristics of the to-be-reconstructed image. SIR requires accurate system modeling of the data acquisition, while the regularization term also has a strong influence on the quality of the reconstructed images. A variety of regularization strategies have been proposed for SIR over the past decades, based on different assumptions, models, and prior knowledge. In this paper, we review the conceptual and mathematical bases of these regularization strategies and briefly illustrate their efficacies in SIR of low-dose CT.
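The two-term MAP objective described above is commonly written in penalized weighted least-squares form (a generic sketch; notation and the exact fidelity term vary across the methods reviewed):

```latex
\hat{f} = \arg\min_{f \ge 0}\;
\underbrace{(y - A f)^{\mathrm{T}} \Sigma^{-1} (y - A f)}_{\text{data fidelity}}
\;+\;
\beta\, \underbrace{R(f)}_{\text{regularization}}
```

Here $y$ is the measured projection data, $A$ the system matrix modeling the imaging geometry, $\Sigma$ a (typically diagonal) covariance capturing the data-dependent noise variance, $R(f)$ the prior-driven penalty, and $\beta$ the parameter balancing the two terms.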
Low-dose X-ray computed tomography (LDCT) imaging is highly recommended for clinical use because of growing concerns over excessive radiation exposure. However, CT images reconstructed by the conventional filtered back-projection (FBP) method from low-dose acquisitions may be severely degraded by noise and streak artifacts due to excessive X-ray quantum noise, or by view-aliasing artifacts due to insufficient angular sampling. In 2005, the nonlocal means (NLM) algorithm was introduced as a non-iterative, edge-preserving filter to denoise natural images corrupted by additive Gaussian noise, and it showed superior performance. It has since been adapted and applied to many other image types and various inverse problems. This paper reviews the applications of the NLM algorithm in LDCT image processing and reconstruction, and demonstrates its effectiveness in improving reconstructed CT image quality from low-dose acquisitions. The effectiveness of these applications on LDCT and their relative performance are described in detail.
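The core NLM idea (each sample replaced by a patch-similarity-weighted average of other samples) can be sketched on a 1-D signal. This toy version searches the whole signal and uses a Gaussian patch-distance kernel; real LDCT variants operate on 2-D/3-D images with restricted search windows and tuned parameters.

```python
import numpy as np

def nlm_1d(signal, patch_radius=2, h=0.5):
    """Toy 1-D non-local means: each sample becomes a weighted average
    of all samples, with weights given by patch similarity."""
    n = signal.size
    padded = np.pad(signal, patch_radius, mode="reflect")
    # One patch of length 2*patch_radius + 1 centered on each sample.
    patches = np.stack(
        [padded[i:i + 2 * patch_radius + 1] for i in range(n)]
    )
    out = np.empty(n)
    for i in range(n):
        d2 = np.sum((patches - patches[i]) ** 2, axis=1)
        w = np.exp(-d2 / (h ** 2))       # similar patches get large weight
        out[i] = np.sum(w * signal) / np.sum(w)
    return out

noisy = np.array([0.0, 0.1, -0.05, 1.0, 1.05, 0.95])
denoised = nlm_1d(noisy)
print(denoised)
```

Because samples inside each flat region have similar patches, they average mostly with one another, which is why NLM smooths noise while preserving the step edge, the property exploited in the LDCT applications reviewed here.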
Sparse-view CT reconstruction algorithms based on total variation (TV) optimize the data iteratively under a noise- and artifact-reducing model, enabling significant radiation dose reduction while maintaining image quality. However, the piecewise-constant assumption of TV minimization often leads to noticeable patchy artifacts in reconstructed images. To obviate this drawback, we present a penalized weighted least-squares (PWLS) scheme that retains image quality by incorporating total generalized variation (TGV) regularization. We refer to the proposed scheme as 'PWLS-TGV' for simplicity. Specifically, the TGV regularization utilizes higher-order derivatives of the objective image, and the weighted least-squares term incorporates data-dependent variance estimation; together they contribute to improving image quality with sparse-view projection measurements. An alternating optimization algorithm was then adopted to minimize the associated objective function. To evaluate the PWLS-TGV method, both qualitative and quantitative studies were conducted using digital and physical phantoms. Experimental results show that the present PWLS-TGV method achieves images with several noticeable gains over the original TV-based method in terms of accuracy and resolution properties.
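For context, the second-order TGV penalty referred to above is usually written in the following form (a sketch of the standard definition from the TGV literature; the paper's exact discretization may differ):

```latex
\mathrm{TGV}_{\alpha}^{2}(f) \;=\;
\min_{w}\;
\alpha_{1} \int_{\Omega} \lvert \nabla f - w \rvert \, \mathrm{d}x
\;+\;
\alpha_{0} \int_{\Omega} \lvert \mathcal{E}(w) \rvert \, \mathrm{d}x
```

where $w$ is an auxiliary vector field, $\mathcal{E}(w)$ its symmetrized derivative, and $\alpha_{0}, \alpha_{1}$ positive weights. Unlike plain TV, which penalizes any gradient, TGV also admits smooth intensity ramps at low cost, which is why it suppresses the patchy (staircase) artifacts mentioned above.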
The characterization of heavy metal pollution is urgently needed in modern environmental studies. However, traditional geochemical methods for detecting soil heavy metals are rather time-consuming and expensive. In recent years, non-destructive and rapid magnetic techniques have appeared promising for monitoring soil pollution, but how they relate to heavy metal concentrations remains questionable. Therefore, to understand the correlation of heavy metal pollution with environmental magnetism, the magnetic susceptibility (χLF) and the concentrations of As, Cd, Cr, Cu, Ni, Pb, and Zn were measured in topsoil (0-15 cm) collected from Kaifeng City, China. In this study, the spatial distributions of heavy metals and χLF, as well as the correlation between the pollution load index (PLI) and χLF, were examined. Results show that the contamination factor (CF) values of the different heavy metals follow the order Cd (10.48) > Zn (2.28) > Pb (1.68) > Cu (1.51) > Ni (0.81) > Cr (0.80) > As (0.65). The average PLI of the metals is 2.53, representing a moderate pollution level for the city soil as a whole. In general, similar spatial distribution patterns of the heavy metals and χLF were found, decreasing progressively from the southeast/east to the northwest/west of the study area. High concentrations of heavy metals and high levels of χLF appear around the southeast, in the north of the older city (within the ancient city wall), and along the Longxi-Haizhou Railway. Moreover, the contents of As, Cd, Cr, Cu, Ni, Pb, and Zn in the soils, as well as the PLI, are significantly positively correlated with χLF. The results further attest that the measurement of χLF is a simple, rapid, and quantitative method for assessing the heavy metal contamination of soils.
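The CF and PLI indices used above follow standard definitions: CF is the measured concentration divided by a background value, and the PLI is the geometric mean of the CFs. The sketch below uses these standard formulas with made-up concentrations and background values; it does not reproduce the Kaifeng data or background references.

```python
import numpy as np

# Hypothetical concentrations and background values (mg/kg).
measured   = {"Cd": 1.05, "Zn": 150.0, "Pb": 42.0}
background = {"Cd": 0.10, "Zn": 65.8,  "Pb": 25.0}

# Contamination factor per metal: CF = C_measured / C_background.
cf = {m: measured[m] / background[m] for m in measured}

# Pollution load index: geometric mean of the contamination factors.
pli = np.prod(list(cf.values())) ** (1.0 / len(cf))

print({m: round(v, 2) for m, v in cf.items()})
print(round(pli, 2))
```

A PLI above 1 indicates pollution relative to background; the study's average PLI of 2.53 corresponds to moderate pollution on this scale.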
When χLF ≤ 71 × 10⁻⁸ m³ kg⁻¹, the soil is considered non-polluted; 71 × 10⁻⁸ < χLF ≤ 162 × 10⁻⁸ m³ kg⁻¹ represents slightly polluted soils; 162 × 10⁻⁸ < χLF ≤ 253 × 10⁻⁸ m³ kg⁻¹ indicates moderate pollution; and χLF > 253 × 10⁻⁸ m³ kg⁻¹ corresponds to heavily polluted soils. However, this standard is valid exclusively for the study area and cannot simply be "transferred" to other polluted areas.
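The thresholds above amount to a simple four-class lookup, sketched below (function name and labels are illustrative; the cutoffs are those proposed for the Kaifeng study area only).

```python
def classify_chi_lf(chi_lf):
    """Map a mass magnetic susceptibility value, in units of
    10^-8 m^3 kg^-1, to the pollution class proposed for the
    Kaifeng topsoil; the thresholds are site-specific."""
    if chi_lf <= 71:
        return "non-polluted"
    if chi_lf <= 162:
        return "slightly polluted"
    if chi_lf <= 253:
        return "moderately polluted"
    return "heavily polluted"

print(classify_chi_lf(100))  # slightly polluted
```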
• Similar spatial distribution patterns of heavy metals and χLF were found.
• As, Cd, Cr, Cu, Ni, Pb, Zn and PLI showed a significant correlation with χLF.
• The heavy metals and χLF in the soil of Kaifeng were mainly produced by anthropogenic factors.
• The concentrations of heavy metals can be predicted from χLF simply and quantitatively.
It is difficult to achieve high-efficiency production of hydrophobic graphene by liquid-phase exfoliation owing to its poor dispersibility and the tendency of graphene sheets to undergo π-π stacking. Here, we report a water-phase, non-dispersion exfoliation method that produces highly crystalline graphene flakes, which can be stored in the form of a concentrated slurry (50 mg mL⁻¹) or a filter cake for months without risk of re-stacking. The as-exfoliated graphene slurry can be used directly for 3D printing, as well as for fabricating conductive graphene aerogels and graphene-polymer composites, thus avoiding the use of copious quantities of organic solvents and lowering the manufacturing cost. This non-dispersion strategy paves the way for the cost-effective and environmentally friendly production of graphene-based materials.