This paper examined, within the framework of identity theory, how constructs identified by identity continuity and attachment theories of grief relate to adjustment to loss. Associations of loss salience, identity construal, and balance of identity construal with grief intensity, via their association with post-loss identity disruption, were examined across three types of self-relevant losses: death of a family member, job loss, and divorce. We hypothesized that lower salience, higher endorsement of identity attributes represented by relational and individualistic self-construals, and an overall balance across identity attributes would be related to decreased grief severity across all types of loss. Results supported the hypotheses, with the exception that the hypothesized ameliorative effect of increases in relational self-construal was seen only in the bereaved group.
Contact sports participation has been shown to have both beneficial and detrimental effects on health; however, little is known about the metabolic sequelae of these effects. We aimed to identify metabolite alterations across a collegiate American football season. Serum was collected from 23 male collegiate football athletes before the athletic season (Pre) and after the last game (Post). Samples underwent nontargeted metabolomic profiling, and 1131 metabolites were included for univariate, pathway enrichment, and multivariate analyses. Significant metabolites were assessed against head acceleration events (HAEs). A total of 200 metabolites changed from Pre to Post (P < 0.05 and Q < 0.05); 160 had known identity and mapped to one of 57 pre-defined biological pathways. There was significant enrichment of metabolites belonging to five pathways (P < 0.05): xanthine; fatty acid (acyl choline); medium-chain fatty acid; primary bile acid; and glycolysis, gluconeogenesis, and pyruvate metabolism. A set of 12 metabolites was sufficient to discriminate Pre from Post status, and changes in 64 of the 200 metabolites were also associated with HAEs (P < 0.05). In summary, the identified metabolites and candidate pathways argue that there are metabolic consequences of both physical training and head impacts with football participation. These findings additionally identify a potential set of objective biomarkers of repetitive head injury.
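The dual P < 0.05 and Q < 0.05 criterion above relies on a false discovery rate correction across the 1131 tested metabolites. As an illustrative sketch only (the abstract does not state which FDR procedure was used; Benjamini-Hochberg is assumed here, and the p-values are made up), Q values can be derived from the vector of univariate P values like this:

```python
def benjamini_hochberg(pvals):
    """Benjamini-Hochberg FDR: return Q values (adjusted P values)
    in the original order of `pvals`."""
    m = len(pvals)
    # sort p-values ascending, remembering their original positions
    order = sorted(range(m), key=lambda i: pvals[i])
    qvals = [0.0] * m
    prev = 1.0
    # walk from the largest p-value down, enforcing monotonicity
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        q = min(prev, pvals[i] * m / rank)
        qvals[i] = q
        prev = q
    return qvals

# hypothetical p-values from univariate Pre-vs-Post tests
pvals = [0.001, 0.008, 0.039, 0.041, 0.27, 0.60]
qvals = benjamini_hochberg(pvals)
# a metabolite "changes" only if both thresholds are met
significant = [p < 0.05 and q < 0.05 for p, q in zip(pvals, qvals)]
```

Note how the third and fourth metabolites pass the raw P threshold but not the Q threshold, which is exactly why 200 of 1131 rather than a larger count survive the joint criterion.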
We introduce here a novel machine learning (ML) framework to address the issue of the quantitative assessment of the immune content in neuroblastoma (NB) specimens. First, the EUNet, a U-Net with an EfficientNet encoder, is trained to detect lymphocytes on tissue digital slides stained with the CD3 T-cell marker. The training set consists of 3782 images extracted from an original collection of 54 whole slide images (WSIs), manually annotated for a total of 73,751 lymphocytes. Resampling strategies, data augmentation, and transfer learning approaches are adopted to warrant reproducibility and to reduce the risk of overfitting and selection bias. Topological data analysis (TDA) is then used to define activation maps from different layers of the neural network at different stages of the training process, described by persistence diagrams (PD) and Betti curves. TDA is further integrated with the uniform manifold approximation and projection (UMAP) dimensionality reduction and the hierarchical density-based spatial clustering of applications with noise (HDBSCAN) algorithm to cluster, by their deep features, the relevant subgroups and structures across different levels of the neural network. Finally, the recent TwoNN approach is leveraged to study the variation of the intrinsic dimensionality of the U-Net model. As the main task, the proposed pipeline is employed to evaluate the density of lymphocytes over the whole tissue area of the WSIs. The model achieves good results, with a mean absolute error of 3.1 on the test set, showing significant agreement between the densities estimated by our EUNet model and by trained pathologists, thus indicating the potential of a promising new strategy for quantifying the immune content in NB specimens.
Moreover, the UMAP algorithm unveiled interesting patterns compatible with pathological characteristics, and highlighted novel insights into the dynamics of the intrinsic dataset dimensionality at different stages of the training process. All the experiments were run on the Microsoft Azure cloud platform.
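Of the TDA summaries named above, the Betti curve is the simplest to compute once a persistence diagram is available: it counts how many persistence intervals are alive at each filtration value. A minimal sketch (the actual pipeline would use a TDA library on real activation maps; the interval data below are invented for illustration):

```python
def betti_curve(diagram, thresholds):
    """Given a persistence diagram as (birth, death) pairs, return the
    Betti number (count of intervals alive) at each threshold value."""
    return [sum(1 for birth, death in diagram if birth <= t < death)
            for t in thresholds]

# hypothetical persistence diagram from one layer's activation map
diagram = [(0.0, 0.9), (0.1, 0.4), (0.2, 0.3), (0.5, 0.7)]
curve = betti_curve(diagram, thresholds=[0.0, 0.25, 0.5, 0.8])
# curve == [1, 3, 2, 1]: short-lived intervals register only at mid-range
# thresholds, while the single long-lived feature persists throughout
```

Comparing such curves across layers and training checkpoints is what lets the diagrams be summarized as ordinary vectors amenable to UMAP and HDBSCAN.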
This paper proposes a new method to evaluate geomagnetic activity based on wavelet analysis during the 2007 solar activity minimum. To accomplish this task, a newly developed algorithm called the effectiveness wavelet coefficient (EWC) was applied. Furthermore, a comparison was performed between the 5 geomagnetically quiet days determined by the Kp-based method and those determined by the wavelet-based method. This paper provides new insight, since geomagnetic activity indexes are mostly designed to quantify the extent of disturbance rather than of quietness. The results suggest that the EWC can be used as an alternative tool to accurately detect quiet days and, consequently, as an alternative to the current Kp-based 5-quietest-days method for determining the Sq baseline. Another important finding of this paper is that most of the quietest local wavelet candidate days occurred within the 2 days prior to high-speed-stream-driven storm events. In other words, the EWC algorithm may potentially be used to detect the quietest magnetic activity that tends to occur just before the arrival of high-speed-stream-driven storms.
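The EWC algorithm itself is specific to the cited work, but the underlying idea of scoring geomagnetic quietness by wavelet coefficient magnitude can be sketched with a single-level Haar transform. This is a deliberate simplification under stated assumptions: the study presumably used a different wavelet family and multiple scales, and the magnetometer samples below are invented. Quiet intervals of the record yield small detail coefficients:

```python
import math

def haar_detail_energy(signal):
    """One-level Haar DWT: return the energy of the detail coefficients,
    a crude proxy for short-term disturbance in the record."""
    details = [(signal[i] - signal[i + 1]) / math.sqrt(2)
               for i in range(0, len(signal) - 1, 2)]
    return sum(d * d for d in details)

# hypothetical horizontal-field samples (nT): a quiet vs. a disturbed day
quiet = [30.0, 30.1, 30.0, 29.9, 30.0, 30.1, 30.0, 30.0]
disturbed = [30.0, 45.0, 12.0, 38.0, 25.0, 50.0, 18.0, 33.0]
assert haar_detail_energy(quiet) < haar_detail_energy(disturbed)
```

Ranking days by such an energy score, lowest first, is the generic wavelet analogue of the Kp-based selection of the 5 quietest days.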
Background Hypertensive disorders of pregnancy are the leading causes of both maternal morbidity and maternal mortality. They are acute obstetric emergencies: life-threatening conditions that can develop during pregnancy, labor, and delivery, requiring urgent reduction of blood pressure (BP) for the benefit of the affected mothers and infants. Hydralazine and labetalol have been widely used as first-line medications in the management of severe hypertension during pregnancy. However, the choice between these two drugs lacks clear evidence regarding their safety and relative superiority. Several studies have compared intravenous (IV) labetalol with IV hydralazine, but very few such comparisons have been conducted in Africa. Objective To compare the effectiveness of IV labetalol and IV hydralazine in reducing systolic and diastolic BP in pregnant women with severe hypertension; to determine the time required for each drug to lower BP to ≤150/100 mmHg and the number of doses needed; and to evaluate maternal and perinatal outcomes. Study design This study employed an open-label randomized clinical trial design conducted in the labor, delivery, and antenatal ward of the Central and Stella Obasanjo Hospital in Benin City. A total of 120 women with severe pregnancy-induced hypertension were randomly assigned to two groups. Group X, consisting of 60 pregnant women, received IV hydralazine 5 mg administered slowly over five minutes, repeated every 20 minutes (maximum of five doses) until a BP of ≤150/100 mmHg was achieved. Group Y, also consisting of 60 pregnant women, received IV labetalol in escalating doses of 25, 50, 75, 75, and 75 mg (maximum of 300 mg) every 20 minutes until the BP reached ≤150/100 mmHg. Statistical analysis was performed using SPSS version 23 (IBM Corp., Armonk, New York).
Results IV hydralazine achieved the target BP in an average time of 45.80 ± 25.17 minutes, while IV labetalol took an average of 72.67 ± 41.80 minutes (p=0.001). The number of doses required to reach the target BP also differed significantly: hydralazine required an average of 1.72 ± 0.904 doses, whereas labetalol required an average of 3.72 ± 1.782 doses (p=0.0001). While 45% of women in the hydralazine group attained the target BP with a single dose, only 31.1% of women in the labetalol group did so (p=0.02). Overall, the target BP was achieved in 55 of 60 women (91.7%) randomized to IV hydralazine, versus 45 of 60 women (75%) who received IV labetalol. Although hydralazine demonstrated more favorable results in achieving the target BP, maternal adverse effects were more frequent in the hydralazine group than in the labetalol group; however, these effects were not severe enough to warrant discontinuation of the medication. Conclusion IV hydralazine achieved the target BP faster and with fewer doses than IV labetalol, and a higher percentage of women in the hydralazine group achieved the target BP with a single dose. However, hydralazine was associated with more maternal adverse effects, although none was severe. Perinatal outcomes did not differ significantly between the two groups.
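The two dosing protocols from the study design can be made concrete with a small worked example. The dose sequences below are taken directly from the trial description; the function and constant names are ours, introduced purely for illustration:

```python
# dose sequences from the trial protocol (mg, one dose every 20 minutes)
HYDRALAZINE_DOSES = [5, 5, 5, 5, 5]     # 5 mg slow IV, up to five doses
LABETALOL_DOSES = [25, 50, 75, 75, 75]  # escalating, 300 mg maximum

def cumulative_dose(schedule, n_doses):
    """Total drug administered (mg) after the first `n_doses` doses."""
    return sum(schedule[:n_doses])

# a woman requiring all five labetalol doses receives the 300 mg maximum,
# whereas the full hydralazine course totals only 25 mg
assert cumulative_dose(LABETALOL_DOSES, 5) == 300
assert cumulative_dose(HYDRALAZINE_DOSES, 5) == 25
```

With the reported averages of 1.72 hydralazine doses versus 3.72 labetalol doses, the typical cumulative exposures differ by more than an order of magnitude, which is worth keeping in mind when comparing adverse-effect rates.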
Perioperative optimization of cardiac surgical patients is imperative to reduce complications, utilize health care resources efficiently, and improve patient recovery and quality of life. Standardized application of evidence-based best practices can lead to better outcomes. Although many practices should be applied universally to all patients, there are also opportunities along the surgical journey to identify patients who will benefit from additional interventions that further enhance their recovery. Enhanced recovery programs aim to bundle several process elements in a standardized fashion to optimize outcomes after cardiac surgery. A foundational concept of enhanced recovery is attaining a better postsurgical end point for patients, in less time, through achievement and maintenance of their greatest possible physiologic, functional, and psychological state. Perioperative optimization is a broad topic, spanning multiple phases of care and involving a variety of medical specialties and nonphysician health care providers. In this review we highlight a variety of perioperative care topics in which a comprehensive approach to patient care can lead to improved results for patients, providers, and the health care system. A particular focus on patient-centred care is included. Although existing evidence supports all of the elements reviewed, most require further improvements in implementation, as well as additional research, before their full potential and usefulness can be determined.
Endocrine disrupting chemicals (EDCs) are xenobiotics that mimic the interaction of natural hormones and alter synthesis, transport, or metabolic pathways. The prospect of EDCs causing adverse health effects in humans and wildlife has led to the development of scientific and regulatory approaches for evaluating bioactivity. This need is being addressed using high-throughput screening (HTS) approaches and computational modeling.
In support of the Endocrine Disruptor Screening Program, the U.S. Environmental Protection Agency (EPA) led two worldwide consortiums to virtually screen chemicals for their potential estrogenic and androgenic activities. Here, we describe the Collaborative Modeling Project for Androgen Receptor Activity (CoMPARA) efforts, which follows the steps of the Collaborative Estrogen Receptor Activity Prediction Project (CERAPP).
The CoMPARA list of screened chemicals built on CERAPP's list of 32,464 chemicals to include additional chemicals of interest, as well as simulated ToxCast™ metabolites, totaling 55,450 chemical structures. Computational toxicology scientists from 25 international groups contributed 91 predictive models for binding, agonist, and antagonist activity predictions. Models were underpinned by a common training set of 1,746 chemicals compiled from a combined data set of 11 ToxCast™/Tox21 HTS assays.
The resulting models were evaluated using curated literature data extracted from different sources. To overcome the limitations of single-model approaches, CoMPARA predictions were combined into consensus models that provided an average predictive accuracy of approximately 80% on the evaluation set.
The strengths and limitations of the consensus predictions were discussed with example chemicals; then, the models were implemented into the free and open-source OPERA application to enable screening of new chemicals with a defined applicability domain and accuracy assessment. This implementation was used to screen the entire EPA DSSTox database of chemicals, and their predicted AR activities have been made available on the EPA CompTox Chemicals dashboard and the National Toxicology Program's Integrated Chemical Environment. https://doi.org/10.1289/EHP5580.
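The consensus step described above, combining 91 contributed models into averaged predictions, can be sketched generically. In the toy version below, the actual CoMPARA weighting scheme is not reproduced; simple unweighted averaging of binary activity calls is assumed, and all predictions are invented:

```python
def consensus(predictions, threshold=0.5):
    """Unweighted consensus over per-model binary calls (1 = active).
    Returns (consensus_call, agreement_score)."""
    score = sum(predictions) / len(predictions)
    return score >= threshold, score

def accuracy(calls, truth):
    """Fraction of chemicals where the consensus call matches the label."""
    return sum(c == t for c, t in zip(calls, truth)) / len(truth)

# hypothetical calls from five models on four chemicals
per_chemical = [[1, 1, 1, 0, 1], [0, 0, 1, 0, 0],
                [1, 0, 1, 1, 0], [0, 0, 0, 0, 0]]
calls = [consensus(p)[0] for p in per_chemical]
```

Averaging in this way dampens the idiosyncratic errors of any single model, which is the stated motivation for the CoMPARA consensus approach.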
Metagenomic next-generation sequencing (mNGS) is an untargeted technique for determining microbial DNA/RNA sequences in a variety of sample types from patients with infectious syndromes. mNGS is still in the early stages of broader translation into clinical applications. To further support the development, implementation, optimization, and standardization of mNGS procedures for virus diagnostics, the European Society for Clinical Virology (ESCV) Network on Next-Generation Sequencing (ENNGS) has been established. The aim of ENNGS is to bring together professionals involved in mNGS for viral diagnostics to share methodologies and experiences and to develop application guidelines. Following the ENNGS publication "Recommendations for the introduction of mNGS in clinical virology, part I: wet lab procedure" in this journal, the current manuscript aims to provide practical recommendations for the bioinformatic analysis of mNGS data and the reporting of results to clinicians.
Metagenomic sequencing is increasingly being used in clinical settings for difficult-to-diagnose cases. The performance of viral metagenomic protocols relies to a large extent on the bioinformatic analysis. In this study, the European Society for Clinical Virology (ESCV) Network on NGS (ENNGS) initiated a benchmark of metagenomic pipelines currently used in clinical virological laboratories.
Metagenomic datasets from 13 clinical samples from patients with encephalitis or viral respiratory infections characterized by PCR were selected. The datasets were analyzed with 13 different pipelines currently used in virological diagnostic laboratories of participating ENNGS members. The pipelines and classification tools were: Centrifuge, DAMIAN, DIAMOND, DNASTAR, FEVIR, Genome Detective, Jovian, MetaMIC, metaMix, One Codex, RIEMS, VirMet, and Taxonomer. Performance, characteristics, clinical use, and user-friendliness of these pipelines were analyzed.
Overall, viral pathogens with high loads were detected by all the evaluated metagenomic pipelines. In contrast, lower-abundance pathogens and mixed infections were detected by only 3 of the 13 pipelines, namely DNASTAR, FEVIR, and metaMix. Overall sensitivity ranged from 80% (10/13) to 100% (13/13 datasets). Overall positive predictive value ranged from 71% to 100%. The majority of the pipelines classified sequences based on nucleotide similarity (8/13), only a minority used amino acid similarity, and 6 of the 13 pipelines assembled sequences de novo. No clear differences in performance were detected that correlated with these classification approaches. Read counts of target viruses varied between the pipelines over a range of 2-3 log, indicating differences in limit of detection.
A wide variety of viral metagenomic pipelines is currently used in the participating clinical diagnostic laboratories. Detection of low-abundance viral pathogens and mixed infections remains a challenge, underscoring the need for standardization and validation of metagenomic analysis for clinical diagnostic use. Future studies should address the selective effects of the choice of different reference viral databases.
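The headline sensitivity and positive predictive value figures in this benchmark follow from the standard confusion-matrix definitions. As a brief refresher (the counts below are illustrative placeholders, not the study's per-pipeline results):

```python
def sensitivity(tp, fn):
    """Fraction of PCR-confirmed target viruses the pipeline detected."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of the pipeline's viral calls that were true positives."""
    return tp / (tp + fp)

# hypothetical tallies for one pipeline across the benchmark datasets
recall = sensitivity(tp=9, fn=3)                  # 0.75
ppv = positive_predictive_value(tp=9, fp=3)       # 0.75
```

Because the denominators differ (missed true viruses for sensitivity, spurious calls for PPV), a pipeline can score well on one metric and poorly on the other, which is why the benchmark reports both.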
Ultra-processed foods (UPF), as proposed by the Nova food classification system, are linked to the development of obesity and several non-communicable chronic diseases, as well as to deaths from all causes. The Nova-UPF screener developed in Brazil is a simple and quick tool to assess and monitor the consumption of these food products. The aim of this study was to adapt and validate, against the 24-hour dietary recall, this short food-based screener to assess UPF consumption in the Senegalese context.
The tool adaptation was undertaken using the Delphi methodology with national experts and data from a food market survey. Following the adaptation, sub-categories were renamed or restructured, and new ones were introduced. The validation study was conducted in the urban area of Dakar in a convenience sample of 301 adults, using as a reference the dietary share of UPF on the day prior to the survey, expressed as a percentage of total energy intake obtained via 24-hour recall. The association between the Nova-UPF score and the dietary share of UPF was evaluated using linear regression models. The Pabak index was used to assess agreement in participants' classification according to quintiles of the Nova-UPF score and quintiles of the dietary share of UPF.
The results show a linear, positive association (p-value < 0.001) between intervals of the Nova-UPF score and the average dietary share of UPF. There was near-perfect agreement in the distribution of individuals across quintiles of the Nova-UPF score and quintiles of the UPF dietary share (Pabak index = 0.84).
The study concluded that the score provided by the Nova-UPF screener adapted to the Senegalese context is a valid estimate of UPF consumption.
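The Pabak index used in the validation has a simple closed form: for k ordered categories (here k = 5 quintiles), Pabak = (k·p_o − 1)/(k − 1), where p_o is the observed proportion of exact agreement. A minimal sketch with a made-up cross-classification table (not the study's data):

```python
def pabak(confusion, k):
    """Prevalence- and bias-adjusted kappa from a k x k agreement table.
    `confusion[i][j]` counts participants in score-quintile i and
    dietary-share-quintile j; exact agreement sits on the diagonal."""
    total = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(k)) / total
    return (k * observed - 1) / (k - 1)

# hypothetical 5x5 quintile table with strong diagonal agreement
table = [
    [10, 1, 0, 0, 0],
    [1, 9, 1, 0, 0],
    [0, 1, 9, 1, 0],
    [0, 0, 1, 9, 1],
    [0, 0, 0, 1, 10],
]
score = pabak(table, k=5)  # ~0.82, near the study's reported 0.84
```

Unlike Cohen's kappa, Pabak does not penalize skewed marginal distributions, which makes it a common choice when quintile prevalences are fixed by construction.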