Uncontrolled hemorrhage carries a high risk of death, yet existing local hemostats still have various defects and side effects. Herein, an aldehyde dextran (PDA) sponge with suitable absorption and adhesion properties is developed for hemorrhage control. The PDA sponge, fabricated by lyophilization with a pore size of ∼30–50 μm, not only absorbs blood quickly (47.7 g/g) but also adheres strongly to tissue (∼100 kPa). With low cytotoxicity and hemolysis, the PDA sponge achieves effective hemostasis and remarkable blood-loss reduction in rabbit models of ear vein, femoral artery, and liver injury. Furthermore, exploration of the hemostatic mechanisms involving tissue, blood, plasma, cells, and the coagulation system indicates that the PDA sponge significantly accelerates coagulation through rapid wound blocking, fast cell aggregation and initiation, and high local concentration of coagulation factors, rather than through activation of the coagulation cascade. Importantly, this hemostat exhibits excellent biodegradability and causes almost no skin irritation. Overall, the biodegradable, tissue-adhesive PDA sponge is a promising quick-hemostatic dressing for uncontrollable hemorrhage.
Here, we present a joint-tissue imputation (JTI) approach and a Mendelian randomization framework for causal inference, MR-JTI. JTI borrows information across transcriptomes of different tissues, leveraging shared genetic regulation, to improve prediction performance in a tissue-dependent manner. Notably, JTI includes the single-tissue imputation method PrediXcan as a special case and outperforms other single-tissue approaches (the Bayesian sparse linear mixed model and Dirichlet process regression). MR-JTI models variant-level heterogeneity (primarily due to horizontal pleiotropy, addressing a major challenge of transcriptome-wide association study interpretation) and performs causal inference with type I error control. We make explicit the connection between the genetic architecture of gene expression and of complex traits, and the suitability of Mendelian randomization as a causal inference strategy for transcriptome-wide association studies. We provide a resource of imputation models generated from GTEx and PsychENCODE panels. Analyses of biobank and meta-analysis data, together with extensive simulations, show substantially improved statistical power, replication, and causal mapping rates for JTI relative to existing approaches.
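The cross-tissue borrowing at the heart of JTI can be illustrated, in heavily simplified form, as sample-weighted ridge regression: expression from a similar tissue enters the training set with a down-weighted contribution. This is an illustrative sketch on synthetic data, not the actual JTI estimator (which uses regularized regression with tuned tissue-similarity weights); all names and numbers below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 10                        # samples, cis-SNPs (synthetic)
X = rng.standard_normal((n, p))       # hypothetical genotype matrix
true_w = np.zeros(p)
true_w[:3] = [0.8, -0.5, 0.3]         # a few causal eQTL effects
y_target = X @ true_w + 0.5 * rng.standard_normal(n)  # target-tissue expression
y_other = X @ true_w + 1.0 * rng.standard_normal(n)   # a second, similar tissue

def weighted_ridge(X, ys, tissue_weights, lam=1.0):
    """Closed-form ridge that pools tissues, down-weighting dissimilar ones."""
    Xs = np.vstack([X] * len(ys))
    y = np.concatenate(ys)
    w = np.repeat(tissue_weights, X.shape[0])
    XtWX = Xs.T @ (Xs * w[:, None])
    XtWy = Xs.T @ (y * w)
    return np.linalg.solve(XtWX + lam * np.eye(X.shape[1]), XtWy)

# Target tissue gets weight 1; the "borrowed" tissue a similarity-derived weight < 1.
w_hat = weighted_ridge(X, [y_target, y_other], np.array([1.0, 0.4]))
r = np.corrcoef(X @ w_hat, X @ true_w)[0, 1]
print(round(r, 2))
```

The borrowed tissue adds effective sample size for the shared regulatory signal while its noise is attenuated by the weight, which is the intuition behind the tissue-dependent gain in imputation accuracy.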
Objectives The aim of this study was to examine the relation of galectin-3 (Gal-3), a marker of cardiac fibrosis, with incident heart failure (HF) in the community. Background Gal-3 is an emerging prognostic biomarker in HF, and experimental studies suggest that Gal-3 is an important mediator of cardiac fibrosis. Whether elevated Gal-3 concentrations precede the development of HF is unknown. Methods Gal-3 concentrations were measured in 3,353 participants in the Framingham Offspring Cohort (mean age 59 years; 53% women). The relation of Gal-3 to incident HF was assessed using proportional hazards regression. Results Gal-3 was associated with increased left ventricular mass in age- and sex-adjusted analyses (p = 0.001); this association was attenuated in multivariable analyses (p = 0.06). A total of 166 participants developed incident HF and 468 died during a mean follow-up of 11.2 years. Gal-3 was associated with risk of incident HF (hazard ratio [HR]: 1.28 per 1 SD increase in log Gal-3; 95% confidence interval [CI]: 1.14 to 1.43; p < 0.0001) and remained significant after adjustment for clinical variables and B-type natriuretic peptide (HR: 1.23; 95% CI: 1.04 to 1.47; p = 0.02). Gal-3 was also associated with risk of all-cause mortality (multivariable-adjusted HR: 1.15; 95% CI: 1.04 to 1.28; p = 0.01). The addition of Gal-3 to clinical factors resulted in negligible changes to the C-statistic and minor improvements in net reclassification. Conclusions Higher concentrations of Gal-3, a marker of cardiac fibrosis, are associated with increased risk of incident HF and mortality. Future studies evaluating the role of Gal-3 in cardiac remodeling may provide further insight into its role in the pathophysiology of HF.
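Under the proportional hazards model, the reported HR of 1.28 per 1 SD increase in log Gal-3 corresponds to a log-hazard coefficient of ln(1.28), and per-SD effects compound multiplicatively. A minimal arithmetic check:

```python
import math

hr_per_sd = 1.28                 # hazard ratio per 1 SD increase in log Gal-3 (from the study)
beta = math.log(hr_per_sd)       # corresponding Cox log-hazard coefficient
hr_2sd = hr_per_sd ** 2          # proportional hazards: effects multiply across SD units

print(round(beta, 3), round(hr_2sd, 2))  # 0.247 1.64
```

So a participant 2 SD above the mean in log Gal-3 carries roughly a 64% higher hazard of incident HF than one at the mean, before covariate adjustment.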
The origin of agricultural products is crucial to their quality and safety. This study explored differences in the chemical composition and structure of rice from different origins using fluorescence detection technology. These differences are mainly driven by climate, environment, geology, and other factors. By identifying the characteristic fluorescence absorption peaks of the same rice variety grown in different origins and comparing them with known or standard samples, the study aims to authenticate rice, protect brands, and achieve traceability. The same rice variety, planted in different regions of Jilin Province in the same year, was selected as the sample set. Fluorescence spectroscopy was used to collect spectral data, which were preprocessed by normalization, smoothing, and wavelet transformation to remove noise, scattering, and burrs. The processed spectral data served as input for a long short-term memory (LSTM) model, and the analysis focused on rice spectra based on the NZ-WT-processed data. To simplify the model, uninformative variable elimination (UVE) and the successive projections algorithm (SPA) were used to screen for the most informative wavelengths, which were then used as input for a support vector machine (SVM) prediction model to achieve efficient and accurate predictions. Within the fluorescence spectral ranges of 475-525 nm and 665-690 nm, absorption peaks of nicotinamide adenine dinucleotide phosphate (NADPH), riboflavin (vitamin B2), starch, and protein were observed. The origin-tracing prediction model established using SVM exhibited stable performance, with a classification accuracy of up to 99.5%. The experiment demonstrated that fluorescence spectroscopy has high discrimination accuracy in tracing the origin of rice, providing a new method for rapid identification of rice origin.
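The normalize-smooth-classify pipeline can be sketched in miniature on synthetic Gaussian-peak "spectra", with a nearest-centroid classifier standing in for the paper's SVM (and the wavelet step omitted); every variable here is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 800, 200)  # nm, synthetic grid

def spectra(center, n=20):
    """Synthetic fluorescence spectra: one Gaussian peak plus noise (stand-in for rice data)."""
    peak = np.exp(-((wavelengths - center) / 30) ** 2)
    return peak + 0.05 * rng.standard_normal((n, 200))

def preprocess(S):
    """Min-max normalization followed by a 5-point moving-average smoothing."""
    S = (S - S.min(axis=1, keepdims=True)) / np.ptp(S, axis=1, keepdims=True)
    kernel = np.ones(5) / 5
    return np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, S)

# Two hypothetical "origins" whose spectra peak at slightly different wavelengths.
A, B = preprocess(spectra(500)), preprocess(spectra(520))
train = np.vstack([A[:15], B[:15]]); labels = np.array([0] * 15 + [1] * 15)
test = np.vstack([A[15:], B[15:]]); truth = np.array([0] * 5 + [1] * 5)

# Nearest-centroid classification (illustrative substitute for SVM).
centroids = np.stack([train[labels == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(((test[:, None, :] - centroids) ** 2).sum(-1), axis=1)
acc = (pred == truth).mean()
print(acc)
```

The real study would additionally apply UVE/SPA wavelength selection before the classifier; the sketch only shows why preprocessing plus a discriminative model separates origin-specific peak patterns.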
Neural decoding remains a challenging and active topic in neurocomputing. Recently, many studies have shown that brain network patterns, which contain rich spatiotemporal structural information, represent the brain's activation under external stimuli. In the traditional approach, brain network features are obtained directly using standard machine learning methods and provided to a classifier that subsequently decodes the external stimuli. However, this approach cannot effectively extract the multidimensional structural information hidden in the brain network. Furthermore, studies on tensors have shown that tensor decomposition models can fully mine the unique spatiotemporal structural characteristics of data with a multidimensional structure. This research proposes a stimulus-constrained Tensor Brain Network (s-TBN) model that combines tensor decomposition with stimulus category-constraint information. The model was verified on real neuroimaging data obtained via magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI). Experimental results show that the s-TBN model achieves accuracy improvements of greater than 11.06% and 18.46% over other methods on the two modal datasets. These results demonstrate the superiority of the s-TBN model in extracting discriminative characteristics, especially for decoding object stimuli with semantic information.
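The core intuition behind tensor-based decoding, that a multiway decomposition recovers trial-level loadings on a shared spatial pattern, which can then serve as discriminative features, can be sketched with a rank-1 synthetic "network tensor". This is not the s-TBN algorithm itself; the tensor sizes, noise level, and SVD-based factorization are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic brain-network tensor: nodes x nodes x trials, built from one spatial pattern.
spatial = rng.standard_normal(16)                    # shared connectivity pattern
loadings = rng.uniform(0.5, 1.5, 12)                 # per-trial activation strengths
T = np.einsum("i,j,k->ijk", spatial, spatial, loadings)
T += 0.1 * rng.standard_normal(T.shape)              # measurement noise

# Unfold along the trial mode and take the leading SVD component:
# the right singular vector approximates the vectorized spatial pattern,
# the left singular vector recovers per-trial loadings usable as features.
M = T.reshape(16 * 16, 12).T                         # trials x (node pairs)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
recovered = np.abs(U[:, 0] * s[0])                   # trial-level feature vector

r = abs(np.corrcoef(recovered, loadings)[0, 1])
print(round(r, 2))
```

A flat feature vector per trial would discard the nodes-by-nodes structure; factorizing the tensor keeps it, which is the advantage the abstract attributes to the tensor view.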
In this paper, a thin-core fiber-based in-line Mach-Zehnder interferometer is theoretically and experimentally demonstrated, and a high temperature-sensitivity consistency (>0.995) is obtained owing to the single modal interference formed. Furthermore, the nonlinear wavelength dependence of the elasto-optical coefficient is characterized, and a sharp intensity inversion between the macro- and micro-bending states is experimentally observed. A novel dual-differential compensation (DDC) method is then proposed to eliminate temperature errors in the measured curvature and strain. By means of DDC, variations in curvature and strain can be detected and discriminated simultaneously without crosstalk from ambient temperature, and the corrected sensitivities reach −17.67 nm/m⁻¹ and −1.92 pm/με, respectively. The fabricated sensor is practical and promising for accurate multi-parameter measurement.
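Once the differential step has cancelled the common temperature response, dual-parameter discrimination reduces to inverting a 2×2 sensitivity matrix mapping (curvature, strain) to the two wavelength shifts. In the sketch below, the first-row sensitivities are taken from the abstract, while the second row (a second interference dip with different sensitivities) and the input values are assumed purely for illustration:

```python
import numpy as np

# Sensitivity matrix K, in nm per m^-1 (curvature) and nm per microstrain.
# Row 1 uses the corrected sensitivities reported in the abstract;
# row 2 is a hypothetical second dip needed to make the system solvable.
K = np.array([[-17.67, -1.92e-3],
              [-5.00, -0.80e-3]])

true_curvature, true_strain = 0.2, 150.0             # m^-1, microstrain (hypothetical)
shifts = K @ np.array([true_curvature, true_strain]) # simulated wavelength shifts (nm)

# Recover both parameters from the two measured shifts.
recovered = np.linalg.solve(K, shifts)
print(recovered)
```

The DDC method's contribution is making K temperature-free; without it, a third unknown (ΔT) would enter every row and the 2×2 inversion would be biased by ambient drift.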
Nighttime light remote sensing has unique advantages in reflecting human activities, and has therefore been used in many fields, including population and GDP estimation, light pollution analysis, and disaster and conflict monitoring. However, existing nighttime light remote sensors are subject to one or more shortcomings, such as coarse spatial resolution, restricted swath width, and a lack of multi-spectral data. Therefore, we propose an optical system for an imaging spectrometer based on a linear variable filter. The imaging principle, optical specifications, optical design, imaging performance analysis, and tolerance analysis are presented. The optical system, with a focal length of 100 mm, an F-number of 4, and a 43° field of view over the 400–1000 nm spectral range, achieves excellent image quality. The system can acquire multi-spectral images in eight bands with a spatial resolution of 21.5 m and a swath width of 320 km from an altitude of 500 km. Compared with existing nighttime light remote sensors, our system simultaneously offers high spatial and spectral resolution, a wide spectral band, and a wide swath width, largely making up for the shortcomings of present systems. The tolerance analysis shows that our system satisfies the requirements of fabrication and alignment.
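The stated geometry implies a detector pixel pitch of GSD × f / H = 21.5 m × 100 mm / 500 km = 4.3 μm, and the F-number implies an entrance-pupil diameter of f / F# = 25 mm. A quick arithmetic check (the pinhole-camera pitch relation is standard; the specific detector is not stated in the abstract):

```python
f = 0.100      # focal length, m
H = 500e3      # orbit altitude, m
gsd = 21.5     # ground sample distance, m
fnum = 4.0     # F-number

pixel_pitch = gsd * f / H   # detector pixel size implied by the GSD
aperture = f / fnum         # entrance-pupil diameter

print(round(pixel_pitch * 1e6, 2), round(aperture * 1e3, 1))  # 4.3 (um) 25.0 (mm)
```

Both values are consistent with a compact spectrometer, which supports the claim that the design achieves its resolution and swath without an unusually large optic.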
Ten years ago it was widely expected that the genetic basis of common disease would be resolved by genome-wide association studies (GWAS), large-scale studies in which the entire genome is covered by genetic markers. However, the bulk of heritable variance remains unexplained. The authors consider several alternative research strategies. For instance, whereas it has been hypothesized that a common disease is associated primarily with common genetic variants, it is now plausible that multiple rare variants each have a potent effect on disease risk and that they could accumulate to become a substantial component of common disease risk. This idea has become more appealing since the discovery that copy number variants (CNVs) are a substantial source of human mutations and are associated with multiple common diseases. CNVs are structural genomic variants consisting of microinsertions, microdeletions, and transpositions in the human genome. It has been argued that numerous rare CNVs are plausible causes of a substantial proportion of common disease, and rare CNVs have been found to be potent risk factors in schizophrenia and autism. Another approach is to "parse the genome," i.e., reanalyze subsets of current GWAS data, since the noise inherent in genome-wide approaches may be hiding valid associations. Lastly, technological advances and declining costs may allow large-scale genome-wide sequencing that would comprehensively identify all genetic variants. Study groups even larger than the 10,000 subjects in current meta-analyses would be required, but the outcomes may lead to resolution of our current dilemma in common diseases: Where is the missing heritability?
The expression microarray is a frequently used approach to study gene expression on a genome-wide scale. However, the data produced by the thousands of microarray studies published annually are ...confounded by "batch effects," the systematic error introduced when samples are processed in multiple batches. Although batch effects can be reduced by careful experimental design, they cannot be eliminated unless the whole study is done in a single batch. A number of programs are now available to adjust microarray data for batch effects prior to analysis. We systematically evaluated six of these programs using multiple measures of precision, accuracy and overall performance. ComBat, an Empirical Bayes method, outperformed the other five programs by most metrics. We also showed that it is essential to standardize expression data at the probe level when testing for correlation of expression profiles, due to a sizeable probe effect in microarray data that can inflate the correlation among replicates and unrelated samples.
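The kind of additive, gene-specific batch effect that programs like ComBat target can be illustrated with a location-only adjustment (a deliberate simplification of ComBat's empirical-Bayes location/scale model) on synthetic replicates of one sample split across two batches; all data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(3)
genes, n_per_batch = 500, 6
base = rng.standard_normal((genes, 1))          # the sample's true expression profile
batch_shift = rng.standard_normal((genes, 1))   # gene-specific additive batch effect

# Two batches of technical replicates of the same underlying sample.
b1 = base + 0.1 * rng.standard_normal((genes, n_per_batch))
b2 = base + batch_shift + 0.1 * rng.standard_normal((genes, n_per_batch))
data = np.hstack([b1, b2])
batches = np.array([0] * n_per_batch + [1] * n_per_batch)

# Location-only adjustment: remove each batch's per-gene mean, restore the grand mean.
grand = data.mean(axis=1, keepdims=True)
adjusted = data.copy()
for b in (0, 1):
    cols = batches == b
    adjusted[:, cols] -= adjusted[:, cols].mean(axis=1, keepdims=True)
adjusted += grand

# Cross-batch replicate correlation before vs after adjustment.
r_before = np.corrcoef(data[:, 0], data[:, n_per_batch])[0, 1]
r_after = np.corrcoef(adjusted[:, 0], adjusted[:, n_per_batch])[0, 1]
print(round(r_before, 2), round(r_after, 2))
```

Full ComBat additionally shrinks per-batch location and scale estimates toward pooled priors, which is what gives it an edge on small batches in the evaluation described above.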
Owing to imaging hardware and environmental conditions, images often contain serious noise. Noise diminishes image quality and compromises effectiveness in real-world applications, so reducing image noise and improving image quality are essential. Although current denoising algorithms can reduce noise to some extent, the removal process may destroy fine details and degrade overall image quality. Hence, to improve denoising effectiveness while preserving fine image detail, this article presents a multi-scale feature learning convolutional neural network denoising algorithm (MSFLNet), which consists of three feature learning (FL) modules, a reconstruction generation (RG) module, and a residual connection. The three FL modules help the algorithm learn image feature information and improve denoising efficiency; the residual connection carries shallow information learned by the model to the deep layers; and the RG module reconstructs the final image. Our experiments indicate that the proposed denoising method is effective.
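A minimal numpy sketch of the multi-scale-plus-residual idea (not the MSFLNet architecture, whose FL modules are learned convolutional layers): average fixed mean filters at several scales, then blend back a fraction of the input as a residual-style step to retain detail. The filter sizes, blend weight, and test image are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
clean = np.outer(np.sin(np.linspace(0, 3, 64)), np.cos(np.linspace(0, 3, 64)))
noisy = clean + 0.2 * rng.standard_normal(clean.shape)

def box_filter(img, k):
    """Brute-force k x k mean filter (a stand-in for a learned convolutional feature)."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

# "Multi-scale" denoising: average smoothed versions at several scales,
# then a residual-style blend keeps part of the input to preserve detail.
multi = np.mean([box_filter(noisy, k) for k in (3, 5, 7)], axis=0)
alpha = 0.2
denoised = alpha * noisy + (1 - alpha) * multi

mse = lambda a, b: ((a - b) ** 2).mean()
print(mse(denoised, clean) < mse(noisy, clean))
```

In MSFLNet the analogue of the blend is learned end to end, so the network can decide per region how much of the shallow (detail-rich) signal to carry through the residual connection.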