This study aimed to evaluate rib fracture rate and rib fracture characteristics after thoracic trauma in patients with normal versus diminished bone mineral density (BMD). A retrospective cohort study of persons aged 50 years or older presenting to the Emergency Department after sustaining blunt thoracic trauma between July 1, 2014, and December 31, 2017, was performed. Patient and trauma characteristics and DXA scan results were collected. Rib fracture rate and characteristics were evaluated on a radiograph and/or CT scan of the thorax. In total, 119 patients were included for analysis. Fifty-eight of them (49%) had a diminished BMD; in the remaining 61, the BMD was normal. The diminished BMD group experienced rib fractures more often than the normal BMD group (n = 43 (74%) versus n = 31 (51%); p = 0.014). Patients with diminished BMD suffered low-energy trauma more frequently than the normal BMD group (21 (36%) versus 11 patients (15%), respectively; p = 0.011). Rib fracture characteristics such as the median number of rib fractures, concomitant intrathoracic injury rate, and rib fracture type distribution were not different between the groups. The rate of rib fractures after blunt thoracic trauma was significantly higher in patients with diminished BMD than in patients with a normal BMD. Differences in number and location of rib fractures between groups could not be proven. When assessing patients aged 50 years or older presenting to the hospital after substantial blunt thoracic trauma, the presence of diminished BMD should be taken into account and the presence of rib fractures should be investigated with appropriate diagnostic procedures. Diminished bone mineral density (i.e., osteopenia or osteoporosis) is associated with increased fracture risk. This study evaluated whether diminished BMD increases the rib fracture risk. Patients with diminished BMD have a higher risk of sustaining rib fractures after substantial blunt thoracic trauma, which implies a lower threshold for CT imaging of the chest.
White matter hyperintensities (WMHs) are a common manifestation of cerebral small vessel disease that is increasingly studied with large, pooled multicenter datasets. This data pooling increases statistical power, but poses challenges for automated WMH segmentation. Although there is extensive literature on the evaluation of automated WMH segmentation methods, such evaluations in a multicenter setting are lacking. We performed WMH segmentations in sixty patients scanned on six different magnetic resonance imaging (MRI) scanners (10 patients per scanner) using five freely available and fully automated WMH segmentation methods (Cascade, kNN-TTP, Lesion-TOADS, LST-LGA and LST-LPA). Different MRI scanner vendors and field strengths were included. We compared these automated WMH segmentations with manual WMH segmentations as a reference. Performance of each method, both within and across scanners, was assessed using spatial and volumetric correspondence with the reference segmentations by Dice's similarity coefficient (DSC) and intra-class correlation coefficient (ICC), respectively. We found the best performance, both within and across scanners, for kNN-TTP, followed by LST-LPA and LST-LGA, with worse performance for Lesion-TOADS and Cascade. Our findings can serve as a guide for choosing a method and also highlight the importance of further improving and evaluating the consistency of methods in a multicenter setting.
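The spatial overlap metric used in this evaluation can be stated concretely. Below is a minimal generic sketch of Dice's similarity coefficient on binary masks (illustrative NumPy code with toy arrays, not the study's evaluation pipeline):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient (DSC) between two binary masks:
    twice the overlap divided by the total number of positive voxels."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 2x3 "segmentations": automated result vs. manual reference
auto = np.array([[1, 1, 0], [0, 1, 0]])
ref = np.array([[1, 0, 0], [0, 1, 1]])
print(round(dice(auto, ref), 3))  # 2 overlapping voxels out of 3+3 -> 0.667
```

A DSC of 1 means perfect spatial agreement with the reference, while 0 means no overlap; the ICC used alongside it instead compares total lesion volumes without regard to location.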
Virus‐specific T cells can recognize allogeneic HLA (allo‐HLA) through TCR cross‐reactivity. The allospecificity often differs by individual (private cross‐reactivity) but can also be shared by multiple individuals (public cross‐reactivity); however, only a few examples of the latter have been described. Because these could facilitate alloreactivity prediction in transplantation, we aimed to identify novel public cross‐reactivities of human virus‐specific CD8+ T cells directed against allo‐HLA by assessing their reactivity in mixed‐lymphocyte reactions. Further characterization was done by studying TCR usage with primer‐based DNA sequencing, cytokine production with ELISAs, and cytotoxicity with 51chromium‐release assays. We identified three novel public allo‐HLA cross‐reactivities of human virus‐specific CD8+ T cells. CMV B35/IPS CD8+ T cells cross‐reacted with HLA‐B51 and/or HLA‐B58/B57 (23% of tetramer‐positive individuals), FLU A2/GIL (influenza IMP58‐66 HLA‐A*02:01/GILGFVFTL) CD8+ T cells with HLA‐B38 (90% of tetramer‐positive individuals), and VZV A2/ALW (varicella zoster virus IE62593‐601 HLA‐A*02:01/ALWALPHAA) CD8+ T cells with HLA‐B55 (two unrelated individuals). Cross‐reactivity was tested against different cell types, including endothelial and epithelial cells. All cross‐reactive T cells expressed a memory phenotype, emphasizing their relevance for transplantation. We conclude that public allo‐HLA cross‐reactivity of virus‐specific memory T cells is not uncommon and may create novel opportunities for alloreactivity prediction and risk estimation in transplantation.
Screening for alloreactivity of virus‐specific memory T cells unmasks several public virus‐specific T cell receptors that cross‐react with the same allogeneic HLA in multiple unrelated individuals.
Abstract Background Geriatric assessment is increasingly used to assess the health status of older cancer patients. We set out to assemble all available evidence on the relevance of a geriatric assessment in the treatment of older patients with haematological malignancies. Methods A systematic Medline and Embase search was performed for studies in which a geriatric assessment was used to detect health issues or to address the association between baseline geriatric assessment and outcome. Results 18 publications from 15 studies were included. The median age of patients was 73 years (range 58–86). Despite generally good performance status, the prevalence of geriatric impairments was high. Geriatric impairments were associated with a shorter overall survival in a relevant proportion of studies (instrumental activities 55%, nutritional status 67%, cognitive capacities 83%, objectively measured physical capacity 100%). Comorbidity, physical capacity and nutritional status retained their significance even in multivariate analyses in 50%, 75%, and 67% of analyses respectively, whereas age and performance status lost their predictive value in most studies. One study found an association between comorbidity and chemotherapy-related non-haematological toxicity. Another study described a pronounced association between the summarised outcome of geriatric assessment and both chemotherapy-related toxicity and response to treatment. Conclusion This review demonstrates that a geriatric assessment can detect multiple health issues, even in patients with good performance status. Impairments in geriatric domains have predictive value for mortality, also appear to be associated with toxicity and other outcome measures, and should thus be integrated in individualised treatment algorithms.
Guidelines for developing and implementing stewardship programmes include recommendations on appropriate antibiotic use to guide the stewardship team’s choice of potential stewardship objectives. They also include recommendations on behavioural change interventions to guide the team’s choice of potential interventions to ensure that professionals actually use antibiotics appropriately in daily practice.
To summarize the evidence base of both appropriate antibiotic use recommendations (the ‘what’) and behavioural change interventions (the ‘how’) in hospital practice.
Published systematic reviews/Medline.
The literature shows low-quality evidence of the positive effects of appropriate antibiotic use in hospital patients. The literature also shows that any behavioural change intervention might work to ensure that professionals actually follow appropriate antibiotic use recommendations in daily practice. Although effects were overall positive, there were large differences in improvement between studies that tested similar change interventions.
The literature showed a clear need for studies that apply appropriate study designs, i.e. (randomized) controlled designs, to test the effectiveness of appropriate antibiotic use on achieving meaningful outcomes. Most current studies used designs prone to confounding by indication. In the process of selecting behavioural change interventions that might work best in a chosen setting, much can be learned from the behavioural sciences. The challenge for stewardship teams lies in selecting change interventions based on a careful assessment of barriers and facilitators, and on a theoretical basis that links determinants to change interventions. Future studies should apply more robust designs and evaluations when assessing behavioural change interventions.
To curb increasing resistance rates, responsible antimicrobial use (AMU) is needed in both human and veterinary medicine. In human healthcare, antimicrobial stewardship programmes (ASPs) have been implemented worldwide to improve appropriate AMU. No ASPs have yet been developed for and implemented in companion animal clinics.
The objective of the present study was to implement and evaluate the effectiveness of an ASP in 44 Dutch companion animal clinics. The objectives of the ASP were to increase awareness on AMU, to decrease total AMU whenever possible and to shift AMU towards 1st choice antimicrobials, according to Dutch guidelines on veterinary AMU.
The study was designed as a prospective, stepped-wedge, intervention study, which was performed from March 2016 until March 2018. The multifaceted intervention was developed using previous qualitative and quantitative research on current prescribing behaviour in Dutch companion animal clinics. The number of Defined Daily Doses for Animal (DDDAs) per clinic (total, 1st, 2nd and 3rd choice AMU) was used to quantify systemic AMU. Monthly AMU data were described using a mixed effect time series model with auto-regression. The effect of the ASP was modelled using a step function and a change in the (linear) time trend.
A statistically significant decrease of 15% (7%-22%) in total AMU, 15% (5%-24%) in 1st choice AMU and 26% (17%-34%) in 2nd choice AMU was attributed to participation in the ASP, on top of the already ongoing time trends. Use of 3rd choice antimicrobials did not decrease significantly with participation in the ASP. The change in total AMU became more prominent over time, with a 16% (4%-26%) decrease in the (linear) time trend per year.
This study shows that, although AMU in Dutch companion animal clinics was already decreasing and changing, AMU could be further optimised by participation in an antimicrobial stewardship programme.
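The intervention model described above (a step change at ASP entry plus a change in the linear time trend) can be sketched in simplified form. The study used a mixed-effect time series model with auto-regression; the snippet below is a plain ordinary-least-squares version on synthetic monthly log-AMU data, with all numbers made up for illustration:

```python
import numpy as np

# Synthetic monthly log-AMU series: pre-existing downward trend, then an
# intervention at month 12 adding a step drop and a steeper (linear) trend.
rng = np.random.default_rng(0)
months = np.arange(24)
step = (months >= 12).astype(float)     # 1 after the ASP starts (month 12)
post = np.maximum(months - 12, 0.0)     # months elapsed since the ASP start
log_amu = 2.0 - 0.005 * months - 0.15 * step - 0.01 * post
log_amu = log_amu + rng.normal(0, 0.01, months.size)

# Design matrix: intercept, underlying time trend, step, post-trend change
X = np.column_stack([np.ones_like(step), months.astype(float), step, post])
coef, *_ = np.linalg.lstsq(X, log_amu, rcond=None)
print(f"estimated step effect: {1 - np.exp(coef[2]):.1%} reduction in AMU")
```

Because the model is fitted on the log scale, the step coefficient converts to a percentage reduction via 1 − exp(coef), which is how effects such as the reported 15% decrease in total AMU can be expressed.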
• Dolutegravir (DTG) and darunavir/ritonavir (DRV/r) are potent and favourable options for treatment of human immunodeficiency virus infection.
• The glucuronidation metabolic ratio of DTG increased when co-administered with DRV/r, probably due to ritonavir.
• The effect of DRV/r co-administration on total and trough exposure of twice-daily DTG was investigated.
• DRV/r and DTG performance during the highly viraemic phase of acute HIV infection was studied.
To the authors’ knowledge, there is currently no literature or guidance recommendation regarding whether the dose of dolutegravir (DTG) should be increased when co-administered with darunavir/ritonavir (DRV/r) in patients with acute human immunodeficiency virus infection (AHI). This study assessed the pharmacokinetics (PK) of twice-daily (BID) DTG and once-daily (QD) DRV/r, and compared this with DTG QD without DRV/r in patients with AHI. Forty-six participants initiated antiretroviral therapy within 24 h of enrolment: DTG 50 mg BID, DRV/r 800/100 mg QD, and two nucleoside reverse transcriptase inhibitors (NRTIs) for 4 weeks (Phase I); and DTG 50 mg QD with two NRTIs thereafter (Phase II: reference). Total DTG trough concentration (Ctrough) and area under the concentration–time profile of 0–24 h (AUC0–24h) were predicted using a population PK model. DTG glucuronidation metabolic ratio (MR) and DTG free fraction were determined and compared per treatment phase using the geometric mean ratio (GMR) and 90% confidence interval (CI). Participants had a predicted geometric mean steady-state DTG Ctrough of 2.83 mg/L (coefficient of variation (CV) 30.3%) in Phase I and 1.28 mg/L (CV 52.4%) in Phase II, with a GMR of 2.20 (90% CI 1.90–2.55). Total exposure during DTG BID dosing increased but did not double (AUC0–24h GMR 1.65, 90% CI 1.50–1.81). DTG glucuronidation MR increased by approximately 29% during Phase I. DTG Ctrough was above the in-vivo EC90 (0.32 mg/L) during both phases, except in one participant during Phase I. At Week 8, 84% of participants had viral loads ≤40 copies/mL. The drug–drug interaction between DTG (BID) and DRV/r (QD) was due to induced glucuronidation, and is not clinically relevant in patients with AHI.
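The geometric mean ratio statistic reported above is computed on the log scale from paired exposures. This is a minimal sketch using a normal approximation for the 90% CI and made-up illustrative values; the study itself derived exposures from a population PK model, not from this calculation:

```python
import math
from statistics import NormalDist, mean, stdev

def gmr_90ci(test_vals, ref_vals):
    """Geometric mean ratio of paired PK exposures with a 90% CI.
    Normal approximation on the log scale; a sketch of the statistic,
    not the study's population PK analysis."""
    d = [math.log(t) - math.log(r) for t, r in zip(test_vals, ref_vals)]
    m = mean(d)
    half = NormalDist().inv_cdf(0.95) * stdev(d) / math.sqrt(len(d))
    return math.exp(m), (math.exp(m - half), math.exp(m + half))

# Made-up exposures where the test phase exactly doubles the reference
ref = [1.0, 1.3, 0.9]
test = [2.0, 2.6, 1.8]
gmr, (lo, hi) = gmr_90ci(test, ref)
print(f"GMR {gmr:.2f} (90% CI {lo:.2f}-{hi:.2f})")
```

A GMR of 1 with a CI inside a pre-specified equivalence range would indicate no meaningful exposure change; the reported Ctrough GMR of 2.20 shows trough exposure more than doubled under BID dosing with DRV/r.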
The practice of surgical stabilization of rib fractures (SSRF) for severe chest wall injury has increased exponentially over the last decade due to improved outcomes as compared to nonoperative management. However, regarding in-hospital outcomes, the ideal time from injury to SSRF remains a matter of debate. This review aims to evaluate and summarize currently available literature on the timing of SSRF. Nine studies on the effect of time to SSRF were identified. All were retrospective comparative studies with no detailed information on why patients underwent early or later SSRF. Patients underwent SSRF most often for a flail chest or ≥3 displaced rib fractures. Early SSRF (≤48–72 hours after admission) was associated with shorter hospital and intensive care unit length of stay (HLOS and ICU-LOS, respectively), shorter duration of mechanical ventilation (DMV), lower rates of pneumonia and tracheostomy, and lower hospitalization costs. No difference between early and late SSRF was demonstrated for mortality rate. As compared to nonoperative management, late SSRF (>3 days after admission) was associated with similar or worse in-hospital outcomes. The optimal time to perform SSRF in patients with severe chest wall injury is early (≤48–72 hours after admission) and is associated with improved in-hospital outcomes as compared to either late salvage or nonoperative management. These data must however be cautiously interpreted due to the retrospective nature of the studies and potential selection and attrition bias. Future research should focus on the factors and pathways that allow patients to undergo early SSRF.
Summary Background In adults with acute stroke, infections occur commonly and are associated with an unfavourable functional outcome. In the Preventive Antibiotics in Stroke Study (PASS) we aimed to establish whether or not preventive antimicrobial therapy with a third-generation cephalosporin, ceftriaxone, improves functional outcome in patients with acute stroke. Methods In this multicentre, randomised, open-label trial with masked endpoint assessment, patients with acute stroke were randomly assigned to intravenous ceftriaxone at a dose of 2 g, given every 24 h intravenously for 4 days, in addition to stroke unit care, or standard stroke unit care without preventive antimicrobial therapy; assignments were made within 24 h after symptom onset. The primary endpoint was functional outcome at 3 months, defined according to the modified Rankin Scale and analysed by intention to treat. The primary analysis was by ordinal regression of the primary outcome. Secondary outcomes included death, infection rates, antimicrobial use, and length of hospital stay. Participants and caregivers were aware of treatment allocation but assessors of outcome were masked to group assignment. This trial is registered with controlled-trials.com, number ISRCTN66140176. Findings Between July 6, 2010, and March 23, 2014, a total of 2550 patients from 30 sites in the Netherlands, including academic and non-academic medical centres, were randomly assigned to the two treatment groups: 1275 patients to ceftriaxone and 1275 patients to standard treatment (control group). 12 patients (seven in the ceftriaxone group and five in the control group) withdrew consent immediately after randomisation, leaving 2538 patients available for the intention-to-treat analysis (1268 in the ceftriaxone group and 1270 in the control group). 2514 (99%) of 2538 patients (1257 in each group) completed 3-month follow-up.
Preventive ceftriaxone did not affect the distribution of functional outcome scores on the modified Rankin Scale at 3 months (adjusted common odds ratio 0·95, 95% CI 0·82–1·09, p=0·46). Preventive ceftriaxone did not result in an increased occurrence of adverse events. Overgrowth infection with Clostridium difficile occurred in two patients (<1%) in the ceftriaxone group and none in the control group. Interpretation Preventive ceftriaxone does not improve functional outcome at 3 months in adults with acute stroke. The results of our trial do not support the use of preventive antibiotics in adults with acute stroke. Funding Netherlands Organization for Health Research and Development, Netherlands Heart Foundation, and the European Research Council.
Attribution of the causes of atmospheric trace gas and aerosol variability often requires the use of high resolution time series of anthropogenic and natural emissions inventories. Here we developed an approach for representing synoptic‐ and diurnal‐scale temporal variability in fire emissions for the Global Fire Emissions Database version 3 (GFED3). We disaggregated monthly GFED3 emissions during 2003–2009 to a daily time step using Moderate Resolution Imaging Spectroradiometer (MODIS)‐derived measurements of active fires from Terra and Aqua satellites. In parallel, mean diurnal cycles were constructed from Geostationary Operational Environmental Satellite (GOES) Wildfire Automated Biomass Burning Algorithm (WF_ABBA) active fire observations. Daily variability in fires varied considerably across different biomes, with short but intense periods of daily emissions in boreal ecosystems and lower intensity (but more continuous) periods of burning in savannas. These patterns were consistent with earlier field and modeling work characterizing fire behavior dynamics in different ecosystems. On diurnal timescales, our analysis of the GOES WF_ABBA active fires indicated that fires in savannas, grasslands, and croplands occurred earlier in the day as compared to fires in nearby forests. Comparison with Total Carbon Column Observing Network (TCCON) and Measurements of Pollution in the Troposphere (MOPITT) column CO observations provided evidence that including daily variability in emissions moderately improved atmospheric model simulations, particularly during the fire season and near regions with high levels of biomass burning. The high temporal resolution estimates of fire emissions developed here may ultimately reduce uncertainties related to fire contributions to atmospheric trace gases and aerosols.
Important future directions include reconciling top‐down and bottom‐up estimates of fire radiative power and integrating burned area and active fire time series from multiple satellite sensors to improve daily emissions estimates.
Key Points
We developed an approach to distribute daily and hourly fire emissions
Daily and hourly patterns of fire activity varied among different land types
Daily and hourly fire emissions improved CO simulations
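The core disaggregation step, distributing a monthly emissions total across days in proportion to satellite active-fire counts, can be illustrated with a simplified sketch. This is only a toy version of the idea with a uniform fallback, not the actual GFED3 algorithm, and the numbers are invented:

```python
def disaggregate_monthly(monthly_total, daily_fire_counts):
    """Distribute a monthly emissions total across days in proportion to
    daily active-fire counts; uniform fallback when no fires were observed.
    A simplified illustration of the disaggregation idea, not GFED3 code."""
    total = sum(daily_fire_counts)
    if total == 0:
        n = len(daily_fire_counts)
        return [monthly_total / n] * n
    return [monthly_total * c / total for c in daily_fire_counts]

# Toy 4-day "month" with fires detected only on days 2 and 3
daily = disaggregate_monthly(100.0, [0, 1, 3, 0])
print(daily)  # [0.0, 25.0, 75.0, 0.0]
```

By construction the daily values sum back to the monthly total, so the higher temporal resolution is added without changing the monthly emissions budget.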