Data-driven fault diagnosis methods play an increasingly important role in rotating machinery. Among these methods, deep learning has attracted wide attention for its strong nonlinear feature learning ability, and ResNet is one of its most powerful models. However, the diagnostic performance of such models depends on sufficient labeled samples, which are extremely difficult to obtain under actual complex working conditions. In this paper, a multidimensional normalized ResNet is proposed for cross-working-condition fault diagnosis with limited labeled samples. First, the vibration data collected under different conditions are preprocessed by computed order tracking to reduce distribution differences. Second, batch normalization and group normalization are fused to enhance the feature extraction ability of ResNet. Moreover, the rectified linear unit is replaced by the Gaussian error linear unit to improve the robustness of the trained model. The model is then trained on source-domain samples, and the learned parameters are transferred to the target domain. Finally, the transferred model is fine-tuned with the limited target-domain samples and applied to cross-condition fault diagnosis. Two kinds of datasets are analyzed with the proposed method and existing models to demonstrate its superiority.
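The fused normalization and the GELU activation described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the 1-D activation layout `(batch, channels, length)`, and the fixed mixing weight `alpha` (learnable in practice) are all assumptions.

```python
import numpy as np

def fused_bn_gn(x, num_groups=4, alpha=0.5, eps=1e-5):
    """Blend batch-normalized and group-normalized activations.

    x: activations of shape (N, C, L) -- batch, channels, signal length.
    alpha: mixing weight between the two normalized outputs.
    """
    n, c, l = x.shape
    # Batch normalization: per-channel statistics over batch and length.
    bn_mean = x.mean(axis=(0, 2), keepdims=True)
    bn_var = x.var(axis=(0, 2), keepdims=True)
    x_bn = (x - bn_mean) / np.sqrt(bn_var + eps)
    # Group normalization: per-sample statistics over channel groups.
    g = x.reshape(n, num_groups, c // num_groups, l)
    gn_mean = g.mean(axis=(2, 3), keepdims=True)
    gn_var = g.var(axis=(2, 3), keepdims=True)
    x_gn = ((g - gn_mean) / np.sqrt(gn_var + eps)).reshape(n, c, l)
    return alpha * x_bn + (1.0 - alpha) * x_gn

def gelu(x):
    """Tanh approximation of the Gaussian error linear unit."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x ** 3)))
```

Group normalization is batch-size independent, which is one motivation for blending it with batch normalization when only limited target-domain samples are available for fine-tuning.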
The popularity of digital histopathology is growing rapidly in the development of computer-aided disease diagnosis systems. However, color variations due to manual cell sectioning and stain concentration make digital pathological image analysis tasks, such as histopathological image segmentation and classification, challenging. Hence, normalization of these variations is needed to obtain promising results. The proposed research introduces a reliable and robust complete color normalization method that addresses the problems of color and stain variability. The new complete color normalization involves three phases, namely enhanced fuzzy illuminant normalization, fuzzy-based stain normalization, and modified spectral normalization. Extensive simulations are performed and validated on histopathological images. The presented algorithm outperforms existing conventional normalization methods by overcoming certain of their limitations and challenges. As per the experimental quality metrics and comparative analysis, the proposed algorithm performs efficiently and provides promising results.
Graph Neural Networks (GNNs) have emerged as a useful paradigm for processing graph-structured data. Usually, GNNs are stacked into multiple layers, and the node representations in each layer are computed by propagating and aggregating neighboring node features. To effectively train a GNN with multiple layers, normalization techniques are necessary. Although existing normalization techniques have achieved good results in helping GNN training, they seldom consider the structural information of the graph. In this paper, we propose two graph-aware normalization techniques, namely adjacency-wise normalization and graph-wise normalization, which fully take into account the structural information of the graph. Furthermore, we propose a novel approach, termed Attentive Graph Normalization (AGN), which learns a weighted combination of multiple graph-aware normalization methods, aiming to automatically select the optimal combination of normalization methods for a specific task. We conduct extensive experiments on eleven benchmark datasets, including three single-graph and eight multiple-graph datasets, and the experimental results provide a comprehensive evaluation of the effectiveness of our proposals.
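The two graph-aware normalizations can be sketched in NumPy as follows. These definitions are one plausible reading of the abstract, not the paper's exact formulation: graph-wise normalization uses statistics over all nodes of one graph, while adjacency-wise normalization uses each node's closed neighborhood.

```python
import numpy as np

def graph_wise_norm(h, eps=1e-5):
    """Normalize node features with mean/variance over all nodes of one graph.

    h: node feature matrix of shape (num_nodes, num_features).
    """
    mu = h.mean(axis=0, keepdims=True)
    var = h.var(axis=0, keepdims=True)
    return (h - mu) / np.sqrt(var + eps)

def adjacency_wise_norm(h, adj, eps=1e-5):
    """Normalize each node's features over its closed neighborhood.

    adj: binary adjacency matrix (num_nodes, num_nodes), no self-loops.
    """
    a = adj + np.eye(adj.shape[0])              # include self-loops
    deg = a.sum(axis=1, keepdims=True)
    mu = (a @ h) / deg                          # neighborhood means
    var = (a @ (h ** 2)) / deg - mu ** 2        # neighborhood variances
    return (h - mu) / np.sqrt(np.maximum(var, 0.0) + eps)
```

AGN would then combine such normalizers with learned, softmax-constrained weights, e.g. `out = sum_k w_k * norm_k(h)`.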
•For Gabor, gradient, HOG and wavelet features the number of intensity levels minimally affects the classification.•Normalization method and number of intensity levels influence the classification for GLCM and GRLM features.•For MRI and CT images, the lowest classification error is usually obtained with a large number of intensity levels.•For MRI and CT images, the recommended normalization method is 1%–99%.•For images with additive noise and GLCM or GRLM features, ±3σ normalization and low number of grey levels is recommended.
Image texture is a very important component in many types of images, including medical images. Medical images are often corrupted by noise and affected by artifacts. Some of the texture-based features that should describe the structure of the tissue under examination may also reflect, for example, the uneven sensitivity of the scanner within the tissue region. This in turn may lead to an inappropriate description of the tissue or incorrect classification. To limit these phenomena, the analyzed regions of interest are normalized. In texture analysis methods, image intensity normalization is usually followed by a reduction in the number of levels coding the intensity. The aim of this work was to analyze the impact of different image normalization methods and the number of intensity levels on texture classification, taking into account noise and artifacts related to uneven background brightness distribution. Analyses were performed on four sets of images: modified Brodatz textures, kidney images obtained by means of dynamic contrast-enhanced magnetic resonance imaging, shoulder images acquired as T2-weighted magnetic resonance images and CT heart and thorax images. The results will be of use for choosing a particular method of image normalization, based on the types of noise and distortion present in the images.
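The two normalization schemes named in the highlights (1%–99% percentile and ±3σ), followed by gray-level reduction, can be sketched as below. This is a generic illustration of the preprocessing pipeline; the function name, method labels, and default of 64 levels are assumptions, not the study's protocol.

```python
import numpy as np

def normalize_roi(roi, method="1-99%", n_levels=64):
    """Normalize ROI intensities, then requantize to n_levels gray levels."""
    x = roi.astype(float)
    if method == "1-99%":
        # Clip to the 1st and 99th intensity percentiles.
        lo, hi = np.percentile(x, [1, 99])
    elif method == "3sigma":
        # Clip to mean +/- three standard deviations.
        mu, sigma = x.mean(), x.std()
        lo, hi = mu - 3 * sigma, mu + 3 * sigma
    else:
        lo, hi = x.min(), x.max()
    x = np.clip((x - lo) / (hi - lo + 1e-12), 0.0, 1.0)
    return np.round(x * (n_levels - 1)).astype(int)
```

Features such as GLCM and GRLM are then computed on the requantized integer image, which is why both the normalization method and the number of levels affect them.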
•Target-based linear and vector normalization techniques are developed.•Three aggregation methods based on the normalization techniques are addressed.•A double normalization-based multiple aggregation (DNMA) method is proposed.•The DNMA method is implemented to solve two case studies.
This paper develops a comprehensive algorithm for multi-expert multi-criteria decision making problems that considers quantitative and qualitative criteria of benefit, cost, or target type. We focus on probabilistic linguistic term sets to express the qualitative evaluations, owing to their strength in expressing complex individual and collective linguistic assessments. Firstly, we develop a target-based linear normalization technique and a target-based vector normalization technique. A weight adjustment method is proposed to achieve the tradeoff between criteria after normalization. Given that the two target-based normalization techniques have different advantages, we then propose a ranking method, consisting of three subordinate models, based on these two normalization approaches and three aggregation techniques. Reliable results for a multi-expert multi-criteria decision making problem are determined by integrating the subordinate utility values and the ranks of the alternatives. The proposed method is applied to the green enterprise ranking problem and to the excavation scheme selection problem for shallow buried tunnels. The advantages of the proposed method are demonstrated through comparative analyses with other ranking methods.
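Target-based normalization rewards values close to a target rather than simply large (benefit) or small (cost) values. The sketch below illustrates the idea for crisp numeric criteria; the exact formulas and function names are assumptions for illustration, not the paper's definitions (which operate on probabilistic linguistic term sets).

```python
import numpy as np

def target_based_linear_norm(x, target):
    """Target-based linear normalization: values closer to the target
    score higher, on a 0..1 scale."""
    d = np.abs(np.asarray(x, dtype=float) - target)
    return 1.0 - d / (d.max() + 1e-12)

def target_based_vector_norm(x, target):
    """Target-based vector normalization: distances to the target are
    scaled by their Euclidean norm before being inverted."""
    d = np.abs(np.asarray(x, dtype=float) - target)
    return 1.0 - d / (np.sqrt((d ** 2).sum()) + 1e-12)
```

Benefit and cost criteria are recovered as special cases by setting the target to the best (largest or smallest) observed value.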
Person search involves localizing and re-identifying persons of interest captured by multiple, non-overlapping cameras. Recent approaches to person search are typically built on object detection frameworks to learn joint person representations for detection and re-identification. To this end, the features extracted from pedestrian proposals are projected onto a unit hypersphere using L2 normalization, and positive proposals that sufficiently overlap with the ground truth are equally incorporated for training by exploiting an external lookup table (LUT). We have found that (1) the L2 normalization technique, applied without considering feature distributions, can degrade the discriminative power of person representations, (2) positive proposals often depict distracting details, such as background clutter and person overlaps, and (3) person features in the LUT are not often updated during training. To address these limitations, we propose a novel framework for person search, dubbed PLoPS, using a prototypical normalization layer, ProtoNorm, that calibrates features while considering the long-tail distribution across person IDs. PLoPS also entails a localization-aware learning scheme that prioritizes better-aligned proposals w.r.t. the ground truth. We further introduce a LUT calibration technique to continuously adjust the person features in the LUT. Experimental results and analysis on standard benchmarks demonstrate the effectiveness of PLoPS.
•A novel normalization layer, ProtoNorm, considering the class imbalance problem across person IDs.•A localization-aware objective function for learning discriminative features for person search.•A LUT calibration technique that adjusts person features in the LUT during training.•We set a new state of the art on standard benchmarks for person search.
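The contrast between plain L2 normalization and a distribution-aware calibration can be sketched as below. The second function is a simplified stand-in for the idea behind ProtoNorm, not its actual definition: the real layer estimates statistics from class prototypes to counter the long-tail ID distribution, whereas this sketch just standardizes per dimension before projecting.

```python
import numpy as np

def l2_normalize(f, eps=1e-12):
    """Project feature vectors onto the unit hypersphere."""
    return f / (np.linalg.norm(f, axis=1, keepdims=True) + eps)

def calibrated_normalize(f, eps=1e-5):
    """Standardize each feature dimension before L2 projection, so that
    dimensions with skewed statistics do not dominate the direction
    of the projected vector (illustrative only)."""
    mu = f.mean(axis=0, keepdims=True)
    sigma = f.std(axis=0, keepdims=True)
    return l2_normalize((f - mu) / (sigma + eps))
```

After either function, cosine similarity between two rows reduces to a dot product, which is how re-identification scores are typically computed on the hypersphere.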
Airborne trace elements (TEs) present in atmospheric fine particulate matter (PM2.5) pose notable threats to human health and ecosystems. To explore the impact of meteorological conditions on shaping the pollution characteristics of TEs and the associated health risks, we quantified the meteorologically driven variations in TE pollution characteristics and health risks using weather normalization and health risk assessment models, and analyzed the source-specific contributions and potential sources of the primary TEs affecting health risks using source apportionment approaches, at four sites in Shandong Province from September to December 2021. Our results indicated that TEs experience dual effects from meteorological conditions, with a tendency towards higher TE concentrations and related health risks during polluted periods, and the opposite during clean periods. The total non-carcinogenic and carcinogenic risks of TEs during polluted periods increased approximately by factors of 0.53–1.74 and 0.44–1.92, respectively. Selenium (Se), manganese (Mn), and lead (Pb) were found to be the most meteorologically influenced TEs, while chromium (Cr) and manganese (Mn) were identified as the dominant TEs posing health risks. Enhanced emissions of Cr and Mn from multiple sources were found during polluted periods. Depending on specific wind speeds, industrialized and urbanized centers, as well as nearby road dust, could be key sources of TEs. This study suggests that attention should be paid not only to TEs from primary emissions but also to the meteorological impact on TEs, especially during pollution episodes, to reduce health risks in the future.
•Concentrations and health risks of TEs suffered dual effects of meteorological conditions.•Se, Mn, and Pb were influenced by meteorology dramatically.•Cr and Mn were dominant TEs affecting health risks.•HQ and CR of TEs increased by factors of 0.53–1.74 and 0.44–1.92, respectively, during polluted period.•Road dusts, industrialized and urbanized centers are pivotal sources for Cr and Mn.