Sparse adaptive channel estimation is one of the most important topics in broadband wireless communication systems due to its simplicity and robustness. So far, many sparsity-aware channel estimation algorithms have been developed based on the well-known minimum mean square error (MMSE) criterion, such as the zero-attracting least mean square (ZA-LMS) algorithm, which are robust under the Gaussian assumption. In non-Gaussian environments, however, these methods often lose their robustness, especially when systems are disturbed by random impulsive noise. To address this problem, we propose in this work a robust sparse adaptive filtering algorithm using the correntropy induced metric (CIM) penalized maximum correntropy criterion (MCC) rather than the conventional MMSE criterion for robust channel estimation. Specifically, the MCC is utilized to mitigate the impulsive noise, while the CIM is adopted to exploit the channel sparsity efficiently. Both theoretical analysis and computer simulations are provided to corroborate the proposed methods.
Over the past centuries, millions of bridges have been constructed globally. Many of them are ageing and exhibit significant potential risks, so frequent risk-based inspection and maintenance management of highway bridges is particularly essential for public safety. At present, most bridges rely on manual inspection for management. Its efficiency is extremely low, so the risk of bridge deterioration and defects increases day by day, reducing the load-bearing capacity of bridges and restricting their normal and safe use. Meanwhile, applications of digital twins in the construction industry have gained significant momentum, and the industry has gradually entered the information age. To obtain and share relevant information, engineers and decision makers have adopted digital twins over the entire life cycle of a project, but their applications are still limited to data sharing and visualization. This study further demonstrates unprecedented applications of digital twins to sustainability and vulnerability assessments, which can enable the next-generation risk-based inspection and maintenance framework. It adopts data obtained from the constructor of Zhongcheng Village Bridge in Zhejiang Province, China as a case study. The applications of digital twins to bridge model establishment, information collection and sharing, data processing, and inspection and maintenance planning are highlighted. The integration of “digital twins (or Building Information Modelling, BIM) + bridge risk inspection model” is then established, which will become a more effective information platform for all stakeholders to mitigate the risks and uncertainties of exposure to extreme weather conditions over the entire life cycle.
As the most significant carbon isotope excursion in the past ∼10 Ma, the Late Miocene Carbon Isotope Shift (LMCIS, ∼7.65 to 6.5 Ma) offers a great opportunity to investigate carbon-climate dynamics in a warmer-than-today world. However, the driving mechanisms of the LMCIS remain controversial. In this study, we used a 7-box biogeochemical model to simulate the long-term seawater δ13C and atmospheric CO2 changes during the late Miocene. Based on a quantitative parameterization of two terrestrial processes (C4-grass expansion and enhanced weathering input) during the late Miocene, our results show that the synergy between these two processes may ultimately have produced the LMCIS via perturbation of the land-sea carbon fluxes. Moreover, our results reveal that the re-partitioning of alkalinity and nutrients between the land and the ocean may have influenced the long-term atmospheric CO2 change during the late Miocene.
•Late Miocene C4-grass expansion can cause a limited decrease of seawater δ13C.
•Enhanced weathering input in the late Miocene also contributed to the δ13C shift.
•The synergy of the two terrestrial forcings could be the major driver of the LMCIS.
•Late Miocene pCO2 decline was linked to land-ocean carbon redistribution.
In this study, we examine the effects of firms' corporate social responsibility (CSR), technological innovation, and advertising intensity on corporate financial performance (CFP). Prior research has shown mixed findings for the CSR–CFP relationship. To provide additional evidence and alternative explanations for these mixed findings, we build a moderated mediation model combining the knowledge-based view with stakeholder theory. We use this model to examine whether CSR influences CFP by affecting technological innovation, and whether such mediating effects are moderated by advertising intensity. We classify heterogeneous CSR activities into technical and institutional activities. Using data from 2010 to 2018 on Chinese listed firms, we find that superior technical CSR performance can enhance CFP by promoting technological innovation, and that it promotes technological innovation to a greater extent when advertising intensity is higher. However, institutional CSR does not affect technological innovation or CFP. The findings suggest that, to improve a firm's financial position, its resources should be allocated effectively to technical CSR activities as well as to innovation and advertising.
It is likely that an RNA world existed in early life, when RNA played the roles of both the genome and functional molecules, thereby undergoing Darwinian evolution. However, even with only one type of polymer, it seems quite necessary to introduce a division of labour concerning these two roles, because folding is required for functional molecules (ribozymes) but unfavourable for the genome (as a template in replication). Notably, while ribozymes tend to have adopted a linear form for folding without constraints, a circular form, which might have been topologically hindered in folding, seems more suitable for an RNA template. Another advantage of a circular genome could have been resistance to RNA's end-degradation. Here, we explore the scenario of a circular RNA genome plus linear ribozyme(s) at the precellular stage of the RNA world through computer modelling. The results suggest that a one-gene scene could have been 'maintained', albeit with rather low efficiency for the circular genome to produce the ribozyme, which required precise chain-break or chain-synthesis. This strict requirement may have been relieved by introducing a 'noncoding' sequence into the genome, which had the potential to derive a second gene through mutation. A two-gene scene may have 'run well', with the two corresponding ribozymes promoting the replication of the circular genome in different respects. Circular genomes with more genes might have arisen later in RNA-based protocells. Therefore, circular genomes, which are common in the modern living world, may have had their 'root' at the very beginning of life.
The intestinal tract of vertebrates is normally colonized by a remarkable number of commensal microorganisms that are collectively referred to as the gut microbiota. The gut microbiota has been demonstrated to interact with immune cells and to modulate specific signaling pathways involving both innate and adaptive immune processes. Accumulating evidence suggests that an imbalance of Th17 and Treg cells is associated with the development of many diseases. Herein, we present recent findings showing how specific gut microbiota organisms and metabolites shape the balance of Th17 and Treg cells. We also discuss the therapeutic potential of fecal microbiota transplantation (FMT) in diseases caused by an imbalance of Th17 and Treg cells.
Mycoplasmal pneumonia in sheep and goats usually causes covert but huge economic losses in the sheep and goat industry. The disease is prevalent in various countries in Africa and Asia. Clinical manifestations in affected animals include anorexia, fever, and respiratory symptoms such as dyspnea, polypnea, cough, and nasal discharge. Because of similarities with other respiratory infections, accurate diagnosis can be challenging, and isolating the causative organism is often problematic. However, molecular techniques such as PCR allow rapid and specific identification of pathogens. Thus, a goat infection model with Mycoplasma was established and the pathogen was tested using PCR. The results indicated that this approach could be effectively utilized for the rapid detection of mycoplasma in clinical settings. Additionally, the prevalence of contagious pleuropneumonia of sheep in Qinghai Province was further investigated through PCR analysis. A total of 340 nasal swabs were collected from 17 sheep farms in Qinghai Province. Among these samples, 84 tested positive for Mycoplasma mycoides subsp. capri (Mmc) and 148 tested positive for Mycoplasma ovipneumoniae (Movi), giving positive rates of 24.71% and 43.53%, respectively. Furthermore, our investigation revealed positive PCR results for nasal swab, trachea, and lung samples obtained from sheep exhibiting symptoms suggestive of mycoplasma infection, and three distinct strains were isolated from these positive samples. The inflammatory cytokines of peripheral blood mononuclear cells (PBMCs) were also assessed using RT-PCR. The findings demonstrated a high susceptibility of sheep to Movi in Qinghai Province, with infected sheep displaying an inflammatory response. Consequently, the outcomes of this study will furnish valuable epidemiological insights for the effective prevention and control of this disease within Qinghai Province.
The origin of life involved complicated evolutionary processes, and computer modeling is a promising way to reveal the relevant mechanisms. However, owing to the limits of our knowledge of prebiotic chemistry, it is usually difficult to justify parameter settings for such modeling. Thus, studies have typically been conducted in reverse: the parameter space is explored to find those parameter values "supporting" a hypothetical scene (leaving the parameter justification as a later job, when sufficient knowledge is available). Exploring the parameter space manually is an arduous job (especially when the modeling becomes complicated) and, additionally, difficult to characterize as regular "Methods" in a paper. Here we show that a machine-learning-like approach may be adopted to optimize the parameters automatically. With this efficient parameter-exploring approach, evolutionary modeling of the origin of life would become much more powerful. In particular, it is expected that more near-reality (complex) models could be introduced, and theoretical research would thereby be more tightly associated with experimental investigation in this field, hopefully leading to significant steps forward in our understanding of the origin of life.
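The abstract does not specify the optimization procedure, but the general shape of such an automatic, machine-learning-like parameter search can be sketched as a randomized hill climb over the parameter vector; the function names and the multiplicative-perturbation scheme below are illustrative assumptions, with `score` standing in for however strongly a simulation run supports the hypothesized scene:

```python
import random

def hill_climb(score, params, steps=300, scale=0.1, seed=0):
    """Randomized hill climbing over a parameter vector: perturb the
    current best candidate multiplicatively and keep it if it scores
    higher. `score` maps a parameter list to a scalar measure of how
    well the model run supports the hypothesized scene (higher is
    better), e.g. the survival rate of a target replicator."""
    rng = random.Random(seed)
    best, best_s = list(params), score(params)
    for _ in range(steps):
        cand = [p * (1.0 + scale * rng.uniform(-1.0, 1.0)) for p in best]
        s = score(cand)
        if s > best_s:              # greedy acceptance
            best, best_s = cand, s
    return best, best_s
```

In practice `score` would wrap a full simulation run, so each of the `steps` evaluations is expensive; that cost is exactly why automating the search pays off compared with manual exploration.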
Detection of the four tobacco shred varieties and of the resulting unbroken tobacco shred rate are the primary tasks in cigarette inspection lines. It is especially critical to identify both single and overlapped tobacco shreds at one time, that is, fast blended tobacco shred detection based on multiple targets. However, it is difficult to classify tiny single tobacco shreds with complex morphological characteristics, not to mention tobacco shreds with 24 types of overlap, posing significant difficulties for machine vision-based blended tobacco shred multi-object detection and unbroken tobacco shred rate calculation tasks. This study focuses on these two challenges: identifying blended tobacco shreds and calculating the unbroken tobacco shred rate. In this paper, a new multi-object detection model is developed for blended tobacco shred images based on an improved YOLOv7-tiny model. YOLOv7-tiny is used as the multi-object detection network's mainframe, with a lightweight Resnet19 as the model backbone. The original SPPCSPC and coupled detection head are replaced with a new spatial pyramid SPPFCSPC and a decoupled joint detection head, respectively. An algorithm for two-dimensional size calculation of blended tobacco shreds (LWC) is also proposed, which is applied to blended tobacco shred object detection images to obtain independent tobacco shred objects and calculate the unbroken tobacco shred rate. The experimental results showed that the final detection precision, mAP@.5, mAP@.5:.95, and testing time were 0.883, 0.932, 0.795, and 4.12 ms, respectively. The average length and width detection errors on the blended tobacco shred samples were −1.7% and 13.2%, respectively. The model achieved high multi-object detection accuracy and 2D size calculation accuracy, and it also conformed to the manual inspection process in the field.
This study provides a new efficient implementation method for multi-object detection and size calculation of blended tobacco shreds in cigarette quality inspection lines and a new approach for other similar blended image multi-object detection tasks.
In recent years, more and more attention has been paid to wind energy throughout the world as a kind of clean and renewable energy. However, because wind power depends on natural factors such as weather, its unpredictability and the resulting risk to system operation make it seem less reliable than traditional power generation. An accurate and reliable prediction of wind power would enable a power dispatching department to adjust the scheduling plan in advance according to changes in wind power, ensure power quality, reduce the standby capacity of the system, reduce the operating cost of the power system, reduce the adverse impact of wind power generation on the power grid, and improve power system stability as well as generation adequacy. The traditional back propagation (BP) neural network requires the manual setting of a large number of parameters; the extreme learning machine (ELM) algorithm reduces the time complexity and needs no manual parameter tuning, but the loss function in ELM, based on second-order statistics, is not the best choice when dealing with nonlinear and non-Gaussian data. To address these problems, this paper proposes a novel wind power prediction method based on ELM with a kernel mean p-power error loss, which can achieve lower prediction error than the traditional BP neural network. In addition, to reduce the computational burden caused by the large amount of data, principal component analysis (PCA) was adopted to eliminate redundant data components, improving efficiency without any loss in accuracy. Experiments using real data were performed to verify the performance of the proposed method.
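To make the ELM baseline concrete, here is a minimal sketch of a standard ELM regressor: a random (fixed) tanh hidden layer plus ridge-regularized least squares for the output weights. This illustrates the plain second-order-statistics loss the paper improves upon, not the proposed kernel mean p-power error loss or the PCA preprocessing step; all names and parameter ranges are illustrative:

```python
import math
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def elm_fit(X, y, n_hidden=20, lam=1e-3, seed=0):
    """Basic ELM: random tanh hidden layer (never trained), output
    weights from the ridge-regularized normal equations
    (H^T H + lam I) beta = H^T y."""
    rng = random.Random(seed)
    d = len(X[0])
    W = [[rng.uniform(-4.0, 4.0) for _ in range(d)] for _ in range(n_hidden)]
    b = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden)]
    H = [[math.tanh(sum(W[j][k] * x[k] for k in range(d)) + b[j])
          for j in range(n_hidden)] for x in X]
    A = [[sum(H[i][p] * H[i][q] for i in range(len(X))) + (lam if p == q else 0.0)
          for q in range(n_hidden)] for p in range(n_hidden)]
    rhs = [sum(H[i][p] * y[i] for i in range(len(X))) for p in range(n_hidden)]
    return W, b, solve(A, rhs)

def elm_predict(model, x):
    W, b, beta = model
    return sum(beta[j] * math.tanh(sum(W[j][k] * x[k] for k in range(len(x))) + b[j])
               for j in range(len(beta)))
```

Because only the linear output layer is fitted, training reduces to one linear solve; the paper's contribution can be read as replacing this squared-error solve with a robust kernel mean p-power error objective.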