To determine whether a time delay greater than 6 hours from injury to surgical debridement influences the infection rate in open fractures.
During a period of 18 months, from October 2010 to March 2012, 151 open fractures were available for study in 142 patients in our hospital. The data were collected prospectively and the patients were followed up for 6 weeks. The patients were divided into two groups according to the time delay from injury to surgical debridement (more or less than 6 hours).
Surgical debridement was carried out less than 6 hours after injury in 90 fractures (59.6%) and more than 6 hours after injury in 61 fractures (40.4%). The infection rates were 12.22% and 13.24%, respectively. The global infection rate was 13.24%.
A significantly increased infection rate was not observed in patients whose surgical debridement occurred more than 6 hours after injury. However, in fractures caused by high-energy trauma, a statistically significant increase in the infection rate was observed in those operated on more than 6 hours after trauma. Level of Evidence II, Study Type: Comparative and Prospective.
Hematophagous insects act as major vectors of infectious agents due to their intimate contact with a large variety of vertebrate hosts. The sand fly Lutzomyia longipalpis is the main vector of Leishmania infantum in the New World, but its role as a host of viruses is poorly understood. In this work, Lutzomyia longipalpis RNA libraries were subjected to progressive assembly using viral profile HMMs as seeds. A sequence phylogenetically related to fungal viruses of the genus Mitovirus was identified, and this novel virus was named Lul-MV-1. The 2,697-base genome presents a single gene coding for an RNA-directed RNA polymerase with an organellar genetic code. To determine the possible host of Lul-MV-1, we analyzed the molecular characteristics of the viral genome. Dinucleotide composition and codon usage showed profiles similar to the mitochondrial DNA of invertebrate hosts. Also, the virus-derived small RNA profile was consistent with activation of the siRNA pathway, with a size distribution and 5' base enrichment analogous to those observed in viruses of sand flies, reinforcing Lutzomyia longipalpis as a putative host. Finally, RT-PCR of different insect pools and sequences of public Lutzomyia longipalpis RNA libraries confirmed the high prevalence of Lul-MV-1. This is the first report of a mitovirus infecting an insect host.
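The host assignment above rests partly on dinucleotide composition profiles. As a minimal, hypothetical sketch (not the authors' actual pipeline), the standard dinucleotide odds ratio ρ(XY) = f(XY) / (f(X)·f(Y)), which flags over- or under-represented dinucleotides relative to mononucleotide frequencies, can be computed as follows:

```python
from collections import Counter

def dinucleotide_odds_ratios(seq):
    """Odds ratio rho(XY) = f(XY) / (f(X) * f(Y)) for each observed dinucleotide.

    Assumes a plain A/C/G/T string; values near 1 indicate no bias,
    values well above or below 1 indicate over- or under-representation.
    """
    seq = seq.upper()
    mono = Counter(seq)                                  # mononucleotide counts
    di = Counter(seq[i:i + 2] for i in range(len(seq) - 1))  # overlapping pairs
    n, n2 = len(seq), len(seq) - 1
    ratios = {}
    for xy, count in di.items():
        fx = mono[xy[0]] / n                             # f(X)
        fy = mono[xy[1]] / n                             # f(Y)
        ratios[xy] = (count / n2) / (fx * fy)            # rho(XY)
    return ratios
```

Comparing such profiles between a viral genome and candidate host genomes (nuclear vs. mitochondrial) is one common way to infer the replication environment.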
Fomitiporella has been phylogenetically studied in order to achieve a more natural taxonomic treatment. To establish Fomitiporella s.s., studies on the holotype of F. umbrinella (the type species of Fomitiporella), and on new specimens from the type locality, were carried out. Additionally, the holotype of Fuscoporella coruscans (the type species of Fuscoporella) was studied. A new morphological conceptual delimitation of Fomitiporella s.s. is presented, with a more restricted distribution pattern, and Fuscoporella is treated as a synonym of Fomitiporella. Moreover, Rajchenbergia, gen. nov. is segregated from Fomitiporella s.l. based on morphology, phylogenetic relationships and host distribution. Taxonomic implications for the group and other related taxa are discussed. Fomitiporella americana is synonymized with F. umbrinella, F. micropora with Fomitiporella coruscans, comb. nov., and F. melleopora is transferred to Tropicoporus melleoporus, comb. nov., based on studies of type and reference material.
This paper examines the impacts of three different potential evapotranspiration (PET) models on drought severity and frequency as indicated by the standardized precipitation-evapotranspiration index (SPEI). The SPEI is a recent approach to operational monitoring and analysis of drought severity. It combines precipitation and temperature data, quantifying the severity of a drought in a given time step as the difference between precipitation and PET. The SPEI thus represents the hydrological processes that drive drought events more realistically than the standardized precipitation index, at the expense of additional computational complexity and increased data demands. The additional complexity is principally due to the need to estimate PET within each time step. The SPEI was originally defined using the Thornthwaite PET model. However, numerous researchers have demonstrated that the SPEI is sensitive to the PET model adopted. PET models requiring sparse meteorological inputs, such as the Thornthwaite model, have particular utility for drought monitoring in data-scarce environments. The aridity index (AI), defined as the ratio of potential evapotranspiration to precipitation, characterizes spatiotemporal changes in the hydroclimatic system and is used to distinguish wet (humid) from dry (arid) regions. In this study, a sensitivity analysis of the SPEI and the AI was carried out using three different PET models, namely the Penman–Monteith model, a temperature-based parametric model and the Thornthwaite model. The analysis was undertaken at six gauge stations in the California region, where long-term drought events have occurred.
Using the Penman–Monteith model as the reference PET model for estimating the SPEI, our findings highlight the uncertainty involved in defining the severity of drought, especially at large timescales (12 to 48 months), and indicate that the parametric PET model is preferable to the Thornthwaite model for both the SPEI and the AI. The latter outcome is worth further consideration when climatic studies are developed in data-scarce areas where the full set of meteorological variables required for a Penman–Monteith assessment is not available.
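The quantities involved can be made concrete with a short sketch. The code below is a simplified, hypothetical illustration, not the study's implementation: it computes Thornthwaite PET from monthly mean temperature only (the day-length/month-length correction factor is omitted for brevity), the AI as the PET/precipitation ratio defined above, and the monthly climatic water balance D = P − PET that the SPEI standardizes.

```python
# Simplified sketch of the Thornthwaite PET model, the aridity index (AI),
# and the climatic water balance underlying the SPEI.
# Inputs: 12 monthly mean temperatures (degrees C) and precipitation (mm).

def thornthwaite_pet(monthly_temp_c):
    """Monthly PET (mm) via Thornthwaite (1948), without day-length correction.

    Assumes at least one month with mean temperature above 0 degrees C.
    """
    # Annual heat index I: sum of monthly indices for months with T > 0
    heat_index = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0)
    # Empirical exponent a as a cubic function of the heat index
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    return [16.0 * (10.0 * t / heat_index) ** a if t > 0 else 0.0
            for t in monthly_temp_c]

def aridity_index(pet_mm, precip_mm):
    """AI = PET / P; values above 1 indicate drier (arid) conditions."""
    return sum(pet_mm) / sum(precip_mm)

def water_balance(precip_mm, pet_mm):
    """Monthly climatic water balance D = P - PET, the series the SPEI standardizes."""
    return [p - e for p, e in zip(precip_mm, pet_mm)]
```

This also makes the data-demand contrast visible: the sketch needs only temperature and precipitation, whereas a Penman–Monteith estimate would additionally require radiation, humidity and wind-speed inputs.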
The karyotype is a strong independent prognostic factor in myelodysplastic syndromes (MDS). Since the implementation of the International Prognostic Scoring System (IPSS) in 1997, knowledge concerning the prognostic impact of cytogenetic abnormalities has increased substantially. The present study proposes a new and comprehensive cytogenetic scoring system based on an international data collection of 2,902 patients.
Patients were included from the German-Austrian MDS Study Group (n = 1,193), the International MDS Risk Analysis Workshop (n = 816), the Spanish Hematological Cytogenetics Working Group (n = 849), and the International Working Group on MDS Cytogenetics (n = 44) databases. Patients with primary MDS and oligoblastic acute myeloid leukemia (AML) after MDS treated with supportive care only were evaluated for overall survival (OS) and AML evolution. Internal validation by bootstrap analysis and external validation in an independent patient cohort were performed to confirm the results.
In total, 19 cytogenetic categories were defined, providing clear prognostic classification in 91% of all patients. The abnormalities were classified into five prognostic subgroups (P < .001): very good (median OS, 61 months; hazard ratio [HR], 0.5; n = 81); good (49 months; HR, 1.0 [reference category]; n = 1,809); intermediate (26 months; HR, 1.6; n = 529); poor (16 months; HR, 2.6; n = 148); and very poor (6 months; HR, 4.2; n = 187). The internal and external validations confirmed the results of the score.
In conclusion, these data should contribute to the ongoing efforts to update the IPSS by refining the cytogenetic risk categories.
To compare low-dose decitabine to best supportive care (BSC) in higher-risk patients with myelodysplastic syndrome (MDS) aged 60 years or older and ineligible for intensive chemotherapy.
Two hundred thirty-three patients (median age, 70 years; range, 60 to 90 years) were enrolled; 53% had poor-risk cytogenetics, and the median MDS duration at random assignment was 3 months. The primary end point was overall survival (OS). Decitabine (15 mg/m²) was given intravenously over 4 hours, three times a day, for 3 days, in 6-week cycles.
OS prolongation with decitabine versus BSC was not statistically significant (median OS, 10.1 v 8.5 months, respectively; hazard ratio [HR], 0.88; 95% CI, 0.66 to 1.17; two-sided log-rank P = .38). Progression-free survival (PFS), but not acute myeloid leukemia (AML)-free survival (AMLFS), was significantly prolonged with decitabine versus BSC (median PFS, 6.6 v 3.0 months, respectively; HR, 0.68; 95% CI, 0.52 to 0.88; P = .004; median AMLFS, 8.8 v 6.1 months, respectively; HR, 0.85; 95% CI, 0.64 to 1.12; P = .24). AML transformation was significantly (P = .036) reduced at 1 year (from 33% with BSC to 22% with decitabine). Multivariate analyses indicated that patients with short MDS duration had worse outcomes. Best responses with decitabine versus BSC, respectively, were as follows: complete response (13% v 0%), partial response (6% v 0%), hematologic improvement (15% v 2%), stable disease (14% v 22%), progressive disease (29% v 68%), hypoplasia (14% v 0%), and inevaluable (8% v 8%). Grade 3 to 4 febrile neutropenia occurred in 25% of patients on decitabine versus 7% of patients on BSC; grade 3 to 4 infections occurred in 57% and 52% of patients on decitabine and BSC, respectively. Decitabine treatment was associated with improvements in patient-reported quality-of-life (QOL) parameters.
Decitabine administered in 6-week cycles is active in older patients with higher-risk MDS, resulting in improvements of OS and AMLFS (nonsignificant), of PFS and AML transformation (significant), and of QOL. Short MDS duration was an independent adverse prognosticator.
Glucose and fructose are the main fermentable sugars in cocoa pulp. During fermentation, glucose is consumed within 48–72 h but fructose only after 120 h, mainly owing to the preferential use of glucose by microorganisms. In the first stage of this study, the complete genome sequence of a lactic acid bacterium with high fructose consumption capacity (Lactobacillus plantarum LPBF35) was reported. The notable genomic features of L. plantarum LPBF35 were the presence of an alcohol/acetaldehyde dehydrogenase gene and an improved PTS system, confirming its classification as a “facultatively” fructophilic bacterium. Subsequently, this bacterium was introduced into the cocoa fermentation process in single and mixed cultures with Pediococcus acidilactici LPBF66 or Pichia fermentans YC5.2. Community composition determined by Illumina-based amplicon sequencing and viable counts indicated suppression of the wild microflora in all treatments. At the beginning of the fermentation processes, cocoa pulp contained approximately 73.09 mg/g glucose and 73.64 mg/g fructose. The L. plantarum LPBF35 + P. fermentans YC5.2 process showed the lowest levels of residual sugars after 72 h of fermentation (7.89 and 4.23 mg/g for fructose and glucose, respectively), followed by L. plantarum LPBF35 + Ped. acidilactici LPBF66 (8.85 and 6.42 mg/g), the single L. plantarum LPBF35 treatment (4.15 and 10.15 mg/g), and the spontaneous process (22.25 and 14.60 mg/g). The positive interaction between L. plantarum LPBF35 and P. fermentans YC5.2 resulted in improved formation of primary (ethanol, lactic acid, and acetic acid) and secondary (2-methyl-1-butanol, isoamyl acetate, and ethyl acetate) metabolites during fermentation. The primary metabolites accumulated significantly in cocoa beans fermented by P. fermentans YC5.2 + L. plantarum LPBF35, driving important color-development reactions and the formation of key flavor molecules. The results of this study suggest that fructophilic lactic acid bacteria and yeast constitute a microbial consortium that could improve sugar metabolism and aroma formation during cocoa bean fermentation.
•Genome properties classified L. plantarum LPBF35 as a “facultatively” fructophilic bacterium.•Co-cultivation with P. fermentans YC5.2 enhanced sugar metabolism during cocoa fermentation.•Cocoa beans with a richer aroma composition were produced using the co-culture treatment.
•Novel resilience framework to support infrastructure recovery prioritisation in war-torn countries.•Resilience by assessment based on standoff observations from disparate data sources.•Reconstruction prioritisation using cost-based resilience for the benefit of society.
Apart from security issues, war-torn societies and countries face immense challenges in rebuilding damaged critical infrastructure. Existing post-conflict recovery frameworks mainly focus on social impacts and mitigation. Likewise, existing frameworks for resilience to natural hazards are mainly based on design and intervention, yet they are not fit for post-conflict infrastructure recovery for a number of reasons explained in this paper. Post-conflict peacebuilding can be enhanced when resilience by assessment (RBA) is employed, using standoff observations that include data from disparate remote-sensing sources, e.g. public satellite imagery, forensics and crowdsourcing, collected during the conflict. This paper discusses why conflicts and warfare require a new framework for achieving post-conflict infrastructure resilience. It then introduces a novel post-conflict framework that includes different scales of resilience, with a focus on asset and regional resilience. It considers different levels of knowledge, with a focus on standoff observations and data-driven assessments to facilitate prioritisation during reconstruction. The framework is then applied to the transport network of the area west of Kyiv, Ukraine, to demonstrate how resilience by assessment can support decision-makers, such as governments and multilateral financial institutions, in addressing infrastructure needs, accelerating financial and humanitarian assistance, absorbing shocks and maximising infrastructure recovery after conflict.