ABSTRACT Background The clinical and molecular epidemiology of health care–associated Clostridium difficile infection in nonepidemic settings across Canada has evolved since the first report of the virulent North American pulsed-field gel electrophoresis type 1 (NAP1) strain more than 15 years ago. The objective of this national, multicentre study was to describe the evolving epidemiology and molecular characteristics of health care–associated C. difficile infection in Canada during a post-NAP1-epidemic period, particularly patient outcomes associated with the NAP1 strain. Methods Adult inpatients with C. difficile infection were prospectively identified, using a standard definition, between 2009 and 2015 through the Canadian Nosocomial Infection Surveillance Program (CNISP), a network of 64 acute care hospitals. Patient demographic characteristics, severity of infection and outcomes were reviewed. Molecular testing was performed on isolates, and strain types were analyzed against outcomes and epidemiologic trends. Results Over a 7-year period, 20 623 adult patients admitted to hospital with health care–associated C. difficile infection were reported to CNISP, and microbiological data were available for 2690 patients. From 2009 to 2015, the national rate of health care–associated C. difficile infection decreased from 5.9 to 4.3 per 10 000 patient-days. NAP1 remained the dominant strain type, but infection with this strain decreased significantly over time, accompanied by an increasing trend of infection with NAP4 and NAP11 strains. The NAP1 strain was significantly associated with a higher rate of death attributable to C. difficile infection compared with non-NAP1 strains (odds ratio 1.91, 95% confidence interval [CI] 1.29–2.82). Isolates were universally susceptible to metronidazole; one was nonsusceptible to vancomycin. The proportion of NAP1 strains within individual centres predicted their rates of health care–associated C.
difficile infection; for every 10% increase in the proportion of NAP1 strains, the rate of health care–associated C. difficile infection increased by 3.3% (95% CI 1.7%–4.9%). Interpretation Rates of health care–associated C. difficile infection have decreased across Canada. In nonepidemic settings, NAP4 has emerged as a common strain type, but NAP1, although decreasing, continues to be the predominant circulating strain and remains significantly associated with higher attributable mortality.
Background Variation in use of damage control (DC) surgery across trauma centers may be partially driven by surgeon uncertainty as to when it is appropriately indicated. We sought to determine the opinions of practicing surgeons on the appropriateness of published indications for trauma DC surgery. Study Design We asked 384 trauma centers in the United States, Canada, and Australasia to nominate 1 to 3 surgeons at their center to participate in a survey about DC surgery. We then asked nominated surgeons their opinions on the appropriateness (benefit-to-harm ratio) of 43 literature-derived indications for use of DC surgery in adult civilian trauma patients. Results In total, 232 (60.4%) trauma centers nominated 366 surgeons, of whom 201 (56.0%) responded. Respondents rated 15 (78.9%) preoperative and 23 (95.8%) intraoperative indications to be appropriate. Indications respondents agreed had the greatest expected benefit included a temperature <34°C, arterial pH <7.2, and laboratory-confirmed (international normalized ratio/prothrombin time and/or partial thromboplastin time >1.5 times normal) or clinically observed coagulopathy in the pre- or intraoperative setting; administration of >10 units of packed red blood cells; requirement for a resuscitative thoracotomy in the emergency department; and identification of a juxtahepatic venous injury or devascularized or destroyed pancreas, duodenum, or pancreaticoduodenal complex during operation. Ratings were consistent across subgroups of surgeons with different training, experience, and practice settings. Conclusions We identified 38 indications that practicing surgeons agreed appropriately justified the use of DC surgery. Until further studies become available, these indications constitute a consensus opinion that can be used to guide practice in the current era of changing trauma resuscitation practices.
Postinfectious cough is common in primary care, but there are no treatments of proven effectiveness. Cysteinyl leukotrienes are involved in the pathogenesis of postinfectious cough and whooping cough (pertussis). We investigated the effectiveness of montelukast, a cysteinyl leukotriene receptor antagonist, in the treatment of postinfectious cough.
In this randomised, placebo-controlled trial, non-smoking adults aged 16-49 years with postinfectious cough of 2-8 weeks' duration were recruited from 25 general practices in England. Patients were tested for pertussis (oral fluid anti-pertussis toxin IgG) and randomly assigned (1:1) to montelukast 10 mg daily or image-matched placebo for 2 weeks. Patients chose whether to continue study drug for another 2 weeks. The randomisation sequence was computer-generated and stratified by general practice. Patients, health-care professionals, and researchers were masked to treatment allocation. Effectiveness was assessed with the Leicester Cough Questionnaire to measure changes in cough-specific quality of life; the primary outcomes were changes in total score between baseline and two follow-up stages (2 weeks and 4 weeks). The primary analysis was by intention to treat with imputation by last observation carried forward. Recruitment closed on Sept 21, 2012, and follow-up has been completed. This trial is registered with EudraCT (2010-019647-19), UKCRN Portfolio (ID 8360), and ClinicalTrials.gov (NCT01279668).
From April 13, 2011, to Sept 21, 2012, we randomly assigned 276 patients to montelukast (n=137) or placebo (n=139). 70 (25%) patients had laboratory-confirmed pertussis. Improvements in cough-specific quality of life occurred in both groups after 2 weeks (montelukast: mean 2·7, 95% CI 2·2-3·3; placebo: 3·6, 2·9-4·3), but the difference between groups did not meet the minimum clinically important difference of 1·3 (mean difference -0·9, -1·7 to -0·04, p=0·04). This difference was not statistically significant in any sensitivity analyses. After 2 weeks, 192 of 259 participants from whom data were available elected to continue study drug (99 [77%] of 129 participants on montelukast; 93 [72%] of 130 on placebo). After 4 weeks, there were no significant between-group differences in cough-specific quality of life improvement (montelukast: 5·2, 4·5-5·9; placebo: 5·9, 5·1-6·7; mean difference -0·5, -1·5 to 0·6, p=0·38) or adverse event rates (21 [15%] of 137 patients on montelukast reported one or more adverse events; 31 [22%] of 139 on placebo; p=0·14). The most common adverse events reported were increased mucus production (montelukast, n=6; placebo, n=2), gastrointestinal disturbance (montelukast, n=3; placebo, n=5), and headache (montelukast, n=2; placebo, n=6). One serious adverse event was reported (placebo, n=1), which was unrelated to study drug (shortness of breath and throat tightness after severe coughing bouts).
Montelukast is not an effective treatment for postinfectious cough. However, the burden of postinfectious cough in primary care is high, making it an ideal setting for future antitussive treatment trials.
National Institute for Health Research School for Primary Care Research, UK.
Abstract Study Objectives Variation in computed tomography (CT) use between emergency medicine (EM) physicians may delineate appropriate or inappropriate use. We hypothesize that CT use of all types varies between providers, as does their use in patients with common chief concerns. We determine EM physicians' variability in CT use of all types and whether high use in one area predicts use of other CT types. Methods This was a retrospective study of EM physicians practicing at an 800-bed tertiary level 1 trauma center over a 3.5-year period. Computed tomography rates by type and by patient chief concern were modeled for providers as a function of patient acuity, disposition, age, and time of day using logistic regression. Results Of 195 801 eligible visits, 44 724 visits resulted in at least 1 CT scan. The adjusted rate of CT ordering by providers was 23.8% of patient visits, ranging from 11.5% to 32.7%. The upper quartile of providers was responsible for 78% of the CT scans ordered above the mean. There was a large variation in use of all types of CT and by chief concern. There was an 8-fold variation in use of CT of the abdomen in discharged patients. High head CT use by providers predicted high use of all other CT types. Conclusion We demonstrate a dramatic variation in CT use among EM physicians in all types of CT and common chief concerns. Greater variation was present in patients who were discharged. Large deviation from the mean by a group of providers may suggest inappropriate use.
To examine the phenotypes of 8 patients with evidence of cone dysfunction and normal color vision (characteristic features of both oligocone trichromacy and bradyopsia), and subsequently to screen RGS9 and R9AP for disease-causing mutations.
Retrospective case series.
Eight affected individuals from 7 families.
Ophthalmologic examination, color vision testing, fundus photography, and detailed electrophysiologic assessment were undertaken. Blood samples were taken for DNA extraction from affected subjects and, where possible, unaffected relatives. Mutation screening of RGS9 and R9AP was performed.
Detailed clinical, electrophysiologic, and molecular genetic findings.
All 8 patients had normal ocular examination results, with visual acuity ranging from 6/12 to 6/18. Four subjects were found to harbor mutations in RGS9 or R9AP, with 3 of the identified sequence variants being novel. Three subjects, 2 Pakistani sisters and an Afghan female, had mutations in R9AP. A novel homozygous frameshift mutation, p.G205fs, was identified in the simplex case, and a second novel homozygous in-frame deletion, p.D32_Q34del, was found in the 2 sisters. The remaining patient, a British male, had compound heterozygous mutations in RGS9 (p.R128X/p.W299R). The mutation p.R128X represents the first nonsense mutation reported in RGS9. The 4 mutation-positive subjects shared the characteristic electrophysiologic findings previously described for this disorder, which were not present in the 4 individuals in whom mutations were not identified. A novel finding was that all mutation-positive patients showed electroretinogram evidence of severe cone system dysfunction under photopic conditions but normal cone function to a red flash under scotopic conditions. Such findings seem unique to the disorder.
This is the first report describing a nonsense mutation in RGS9. We have established novel electrophysiologic observations associated with RGS9 and R9AP mutations, including those relating to dark-adapted cone function and S-cone function. Patients with either RGS9/R9AP mutations (bradyopsia) or oligocone trichromacy have very similar clinical phenotypes, characterized by stationary cone dysfunction, mild photophobia, normal color vision, lack of nystagmus, and normal fundi. The distinctive electrophysiologic features associated with RGS9 and R9AP mutations enable directed genetic screening.
The author(s) have no proprietary or commercial interest in any materials discussed in this article.
Abstract Background Juvenile onset retinal dystrophy presents at age 16 years or younger. It is genetically heterogeneous, and causal genes can be associated with distinctive clinical features. Next generation sequencing techniques, including whole-exome sequencing (WES), provide an unbiased platform for investigating patients. The aim of this study was to perform WES on patients with early onset retinal dystrophy and conduct in-depth phenotyping to determine key clinical and molecular characteristics and correlations. Methods Patients were ascertained from the inherited retinal clinics of a large tertiary referral centre. WES was performed on more than 100 probands. Likely pathogenic variants were confirmed with Sanger sequencing and family segregation. All patients underwent clinical examination, retinal imaging and electrophysiological testing, and targeted systemic investigations. Written, informed consent was obtained from all participants. The study received approval from the local ethics committee. Findings Unusual presentations were observed in 12 families with variants in genes previously reported to have systemic manifestations. These included IFT140-related retinal dystrophy in two families with no skeletal or renal manifestations of IFT140-related Mainzer-Saldino syndrome, COL18A1-related retinal dystrophy in a patient with normal neuroradiological imaging (unexpected for Knobloch syndrome), and IQCB1-associated Leber congenital amaurosis in two patients with normal renal function. In addition, identification of mutations in HPS6 in a family thought to have isolated foveal hypoplasia prompted further investigations that identified a bleeding diathesis consistent with Hermansky-Pudlak syndrome. Interpretation Early molecular diagnosis in early onset retinal dystrophy facilitates genetic counselling and directs appropriate investigations.
WES provides an unbiased method for investigating patients compared with candidate gene sequencing and can identify atypical phenotypic presentations of syndromic genes as described in this cohort of patients. Thus, patient-specific investigations for potential systemic complications can be performed. Funding National Institute for Health Research Biomedical Research Centre at Moorfields Eye Hospital and UCL Institute of Ophthalmology, Foundation Fighting Blindness, Fight For Sight, Moorfields Eye Hospital Special Trustees, Rosetrees Trust, Foundation Fighting Blindness Career Development Award (MM).
Clinical guidelines vary with respect to the optimal monitoring frequency of HIV-positive individuals. We compared dynamic monitoring strategies based on time-varying CD4 cell counts in virologically suppressed HIV-positive individuals.
In this observational study, we used data from prospective studies of HIV-positive individuals in Europe (France, Greece, the Netherlands, Spain, Switzerland, and the UK) and North and South America (Brazil, Canada, and the USA) in The HIV-CAUSAL Collaboration and The Centers for AIDS Research Network of Integrated Clinical Systems. We compared three monitoring strategies that differ in the CD4 cell count threshold at which monitoring frequency changes: CD4 cell count and HIV RNA viral load are measured every 3-6 months when the CD4 count is below the threshold and every 9-12 months when it is above. The strategies were defined by threshold CD4 counts of 200 cells per μL, 350 cells per μL, and 500 cells per μL. Using inverse probability weighting to adjust for baseline and time-varying confounders, we estimated hazard ratios (HRs) of death and of AIDS-defining illness or death, risk ratios of virological failure, and mean differences in CD4 cell count.
47 635 individuals initiated an antiretroviral therapy regimen between Jan 1, 2000, and Jan 9, 2015, and met the eligibility criteria for inclusion in our study. During follow-up, CD4 cell count was measured on average every 4·0 months and viral load every 3·8 months. 464 individuals died (107 in threshold 200 strategy, 157 in threshold 350, and 200 in threshold 500) and 1091 had AIDS-defining illnesses or died (267 in threshold 200 strategy, 365 in threshold 350, and 459 in threshold 500). Compared with threshold 500, the mortality HR was 1·05 (95% CI 0·86-1·29) for threshold 200 and 1·02 (0·91-1·14) for threshold 350. Corresponding estimates for death or AIDS-defining illness were 1·08 (0·95-1·22) for threshold 200 and 1·03 (0·96-1·12) for threshold 350. Compared with threshold 500, the 24 month risk ratios of virological failure (viral load more than 200 copies per mL) were 2·01 (1·17-3·43) for threshold 200 and 1·24 (0·89-1·73) for threshold 350, and 24 month mean CD4 cell count differences were 0·4 (-25·5 to 26·3) cells per μL for threshold 200 and -3·5 (-16·0 to 8·9) cells per μL for threshold 350.
Decreasing monitoring to annually when CD4 count is higher than 200 cells per μL compared with higher than 500 cells per μL does not worsen the short-term clinical and immunological outcomes of virally suppressed HIV-positive individuals. However, more frequent virological monitoring might be necessary to reduce the risk of virological failure. Further follow-up studies are needed to establish the long-term safety of these strategies.
National Institutes of Health.