Summary Background The control of Clostridium difficile infections is an international clinical challenge. The incidence of C difficile in England declined by roughly 80% after 2006, following the implementation of national control policies; we tested two hypotheses to investigate their role in this decline. First, if C difficile infection declines in England were driven by reductions in use of particular antibiotics, then incidence of C difficile infections caused by resistant isolates should decline faster than that caused by susceptible isolates across multiple genotypes. Second, if C difficile infection declines were driven by improvements in hospital infection control, then transmitted (secondary) cases should decline regardless of susceptibility. Methods Regional (Oxfordshire and Leeds, UK) and national data for the incidence of C difficile infections and antimicrobial prescribing data (1998–2014) were combined with whole genome sequences from 4045 national and international C difficile isolates. Genotype (multilocus sequence type) and fluoroquinolone susceptibility were determined from whole genome sequences. The incidence of C difficile infections caused by fluoroquinolone-resistant and fluoroquinolone-susceptible isolates was estimated with negative-binomial regression, overall and per genotype. Selection and transmission were investigated with phylogenetic analyses. Findings National fluoroquinolone and cephalosporin prescribing correlated highly with incidence of C difficile infections (cross-correlations >0·88), by contrast with total antibiotic prescribing (cross-correlations <0·59). Regionally, C difficile decline was driven by elimination of fluoroquinolone-resistant isolates (approximately 67% of Oxfordshire infections in September, 2006, falling to approximately 3% in February, 2013; annual incidence rate ratio 0·52, 95% CI 0·48–0·56 vs fluoroquinolone-susceptible isolates: 1·02, 0·97–1·08).
C difficile infections caused by fluoroquinolone-resistant isolates declined in four distinct genotypes (p<0·01). The regions of phylogenies containing fluoroquinolone-resistant isolates were short-branched and geographically structured, consistent with selection and rapid transmission. The importance of fluoroquinolone restriction over infection control was shown by significant declines in inferred secondary (transmitted) cases caused by fluoroquinolone-resistant isolates with or without hospital contact (p<0·0001) versus no change in either group of cases caused by fluoroquinolone-susceptible isolates (p>0·2). Interpretation Restricting fluoroquinolone prescribing appears to explain the decline in incidence of C difficile infections, above other measures, in Oxfordshire and Leeds, England. Antimicrobial stewardship should be a central component of C difficile infection control programmes. Funding UK Clinical Research Collaboration (Medical Research Council, Wellcome Trust, National Institute for Health Research); NIHR Oxford Biomedical Research Centre; NIHR Health Protection Research Unit on Healthcare Associated Infection and Antimicrobial Resistance (Oxford University in partnership with Public Health England [PHE]), and on Modelling Methodology (Imperial College, London in partnership with PHE); and the Health Innovation Challenge Fund.
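The annual incidence rate ratios quoted above came from negative-binomial regression; the underlying quantity can be illustrated with a crude two-period rate ratio and Wald confidence interval (a minimal sketch with hypothetical counts, not the study's model):

```python
import math

def incidence_rate_ratio(cases_a, persontime_a, cases_b, persontime_b, z=1.96):
    """Rate ratio of group A vs group B with a Wald 95% CI on the log scale.
    Illustrative only: the study itself used negative-binomial regression,
    which additionally handles overdispersion and covariates."""
    irr = (cases_a / persontime_a) / (cases_b / persontime_b)
    se = math.sqrt(1 / cases_a + 1 / cases_b)  # approximate SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical counts: 60 resistant CDIs in year t vs 115 the year before,
# over the same catchment person-time.
irr, lo, hi = incidence_rate_ratio(60, 1.0, 115, 1.0)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With real data the person-time denominators would be overnight stays or population-years rather than the placeholder 1.0 used here.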
Routine full characterization of Mycobacterium tuberculosis is culture based, taking many weeks. Whole-genome sequencing (WGS) can generate antibiotic susceptibility profiles to inform treatment, augmented with strain information for global surveillance; such data could be transformative if provided at or near the point of care. We demonstrate a low-cost method of DNA extraction directly from patient samples for M. tuberculosis WGS. We initially evaluated the method by using the Illumina MiSeq sequencer (40 smear-positive respiratory samples obtained after routine clinical testing and 27 matched liquid cultures). M. tuberculosis was identified in all 39 samples from which DNA was successfully extracted. Sufficient data for antibiotic susceptibility prediction were obtained from 24 (62%) samples; all results were concordant with reference laboratory phenotypes. Phylogenetic placement was concordant between direct and cultured samples. With Illumina MiSeq/MiniSeq, the workflow from patient sample to results can be completed in 44/16 h at a reagent cost of £96/£198 per sample. We then employed a nonspecific PCR-based library preparation method for sequencing on an Oxford Nanopore Technologies MinION sequencer. We applied this to cultured Mycobacterium bovis strain BCG DNA and to combined culture-negative sputum DNA and BCG DNA. For flow cell version R9.4, the estimated turnaround time from patient to identification of BCG, detection of pyrazinamide resistance, and phylogenetic placement was 7.5 h, with full susceptibility results 5 h later. Antibiotic susceptibility predictions were fully concordant. A critical advantage of MinION is the ability to continue sequencing until sufficient coverage is obtained, providing a potential solution to the problem of variable amounts of M. tuberculosis DNA in direct samples.
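The "sequence until sufficient coverage is obtained" strategy mentioned for MinION can be sketched as a simple stopping rule on cumulative sequenced bases (illustrative only; real pipelines track per-site depth and base quality, and the read lengths below are hypothetical):

```python
def bases_until_depth(read_lengths, genome_size, target_depth):
    """Return how many reads are needed before mean coverage depth
    (total sequenced bases / genome size) reaches target_depth.
    Sketch of the 'sequence until sufficient coverage' idea; real
    pipelines track per-site depth, not just the mean."""
    total = 0
    for i, length in enumerate(read_lengths, start=1):
        total += length
        if total / genome_size >= target_depth:
            return i
    return None  # run ended before the target was reached

# Hypothetical 5 kb reads against the ~4.4 Mb M. tuberculosis genome:
reads_needed = bases_until_depth([5000] * 100_000, 4_400_000, 20)
print(reads_needed)
```

Because nanopore runs stream reads in real time, a rule like this can terminate a run early for high-input samples while letting low-input samples sequence for longer.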
The rise of antibiotic-resistant bacteria has led to an urgent need for rapid detection of drug resistance in clinical samples, and improvements in global surveillance. Here we show how de Bruijn graph representation of bacterial diversity can be used to identify species and resistance profiles of clinical isolates. We implement this method for Staphylococcus aureus and Mycobacterium tuberculosis in a software package ('Mykrobe predictor') that takes raw sequence data as input, and generates a clinician-friendly report within 3 minutes on a laptop. For S. aureus, the error rates of our method are comparable to gold-standard phenotypic methods, with sensitivity/specificity of 99.1%/99.6% across 12 antibiotics (using an independent validation set, n=470). For M. tuberculosis, our method predicts resistance with sensitivity/specificity of 82.6%/98.5% (independent validation set, n=1,609); sensitivity is lower here, probably because of limited understanding of the underlying genetic mechanisms. We give evidence that minor alleles improve detection of extremely drug-resistant strains, and demonstrate feasibility of the use of emerging single-molecule nanopore sequencing techniques for these purposes.
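The core idea of matching resistance alleles against a de Bruijn graph built from raw reads can be caricatured with plain k-mer sets (a toy sketch; Mykrobe itself also models sequencing error, coverage, and minor alleles, none of which appear here):

```python
def kmers(seq, k):
    """All k-length substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def allele_present(reads, allele, k=5):
    """Call an allele present if every one of its k-mers is observed in
    the read set -- a toy stand-in for walking a resistance allele's path
    through a de Bruijn graph built from raw sequence data."""
    observed = set()
    for read in reads:
        observed |= kmers(read, k)
    return kmers(allele, k) <= observed

reads = ["ACGTAGCTAGC", "GCTAGCAATT"]
print(allele_present(reads, "GTAGCTAGCAA"))  # True: every 5-mer is covered
```

The same k-mer presence/absence logic, extended with coverage thresholds per k-mer, is what makes such tools fast enough to report from raw data in minutes.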
Clostridium difficile infection (CDI) is a leading cause of antibiotic-associated diarrhoea and is endemic in hospitals, hindering the identification of sources and routes of transmission based on shared time and space alone. This may compromise rational control despite costly prevention strategies. This study aimed to investigate ward-based transmission of C. difficile, by subdividing outbreaks into distinct lineages defined by multi-locus sequence typing (MLST).
All C. difficile toxin enzyme-immunoassay-positive and culture-positive samples over 2.5 y from a geographically defined population of ~600,000 persons underwent MLST. Sequence types (STs) were combined with admission and ward movement data from an integrated comprehensive healthcare system incorporating three hospitals (1,700 beds) providing all acute care for the defined geographical population. Networks of cases and potential transmission events were constructed for each ST. Potential infection sources for each case and transmission timescales were defined by prior ward-based contact with other cases sharing the same ST. From 1 September 2007 to 31 March 2010, there were means of 102 tests and 9.4 CDIs per 10,000 overnight stays in inpatients, and 238 tests and 15.7 CDIs per month in outpatients/primary care. In total, 1,276 C. difficile isolates of 69 STs were studied. From MLST, no more than 25% of cases could be linked to a potential ward-based inpatient source, ranging from 37% in renal/transplant, 29% in haematology/oncology, and 28% in acute/elderly medicine to 6% in specialist surgery. Most of the putative transmissions identified occurred shortly (≤ 1 wk) after the onset of symptoms (141/218, 65%), with few >8 wk (21/218, 10%). Most incubation periods were ≤ 4 wk (132/218, 61%), with few >12 wk (28/218, 13%). Allowing for persistent ward contamination following ward discharge of a CDI case did not increase the proportion of linked cases after allowing for random meeting of matched controls.
In an endemic setting with well-implemented infection control measures, ward-based contact with symptomatic enzyme-immunoassay-positive patients cannot account for most new CDI cases.
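The ward-based source definition used in this analysis (a prior case with the same sequence type and shared ward exposure within a fixed window) can be sketched as follows; the data structures are hypothetical, and the study additionally modelled incubation periods and persistent ward contamination:

```python
from datetime import date

def linked_to_ward_source(case, prior_cases, max_gap_weeks=12):
    """A new CDI case is 'linked' if an earlier case with the same sequence
    type (ST) was on one of its wards within max_gap_weeks before onset.
    Schematic version of a ward-contact definition; the real analysis
    used full admission and ward-movement records."""
    for prior in prior_cases:
        if prior["st"] != case["st"]:
            continue  # different lineage: cannot be the transmission source
        gap = (case["onset"] - prior["onset"]).days
        if 0 < gap <= max_gap_weeks * 7 and prior["ward"] in case["wards"]:
            return True
    return False

new_case = {"st": 1, "onset": date(2009, 3, 1), "wards": {"renal", "ICU"}}
earlier = [{"st": 1, "onset": date(2009, 2, 10), "ward": "renal"}]
print(linked_to_ward_source(new_case, earlier))  # True
```

Counting how many cases satisfy this predicate, per specialty, gives proportions of the kind reported above (6% to 37%).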
Background. Despite substantial interest in biomarkers, their impact on clinical outcomes and variation with bacterial strain has rarely been explored using integrated databases. Methods. From September 2006 to May 2011, strains isolated from Clostridium difficile toxin enzyme immunoassay (EIA)-positive fecal samples from Oxfordshire, United Kingdom (approximately 600 000 people) underwent multilocus sequence typing. Fourteen-day mortality and levels of 15 baseline biomarkers were compared between consecutive C. difficile infections (CDIs) from different clades/sequence types (STs) and EIA-negative controls using Cox and normal regression adjusted for demographic/clinical factors. Results. Fourteen-day mortality was 13% in 2222 adults with 2745 EIA-positive samples (median, 78 years) vs 5% in 20 722 adults with 27 550 EIA-negative samples (median, 74 years) (absolute attributable mortality, 7.7%; 95% CI, 6.4%–9.0%). Mortality was highest in clade 5 CDIs (25% [16 of 63]; polymerase chain reaction (PCR) ribotype 078/ST 11), then clade 2 (20% [111 of 560]; 99% PCR ribotype 027/ST 1) versus clade 1 (12% [137 of 1168]; adjusted P < .0001). Within clade 1, 14-day mortality was only 4% (3 of 84) in ST 44 (PCR ribotype 015) (adjusted P = .05 vs other clade 1). Mean baseline neutrophil counts also varied significantly by genotype: 12.4, 11.6, and 9.5 × 10⁹ neutrophils/L for clades 5, 2, and 1, respectively, vs 7.0 × 10⁹ neutrophils/L in EIA-negative controls (P < .0001) and 7.9 × 10⁹ neutrophils/L in ST 44 (P = .08). There were strong associations between C. difficile-type-specific effects on mortality and neutrophil/white cell counts (rho = 0.48), C-reactive protein (rho = 0.43), eosinophil counts (rho = −0.45), and serum albumin (rho = −0.47). Biomarkers predicted 30%–40% of clade-specific mortality differences. Conclusions. C. difficile genotype predicts mortality, and excess mortality correlates with genotype-specific changes in biomarkers, strongly implicating inflammatory pathways as a major influence on poor outcome after CDI. PCR ribotype 078/ST 11 (clade 5) leads to severe CDI; thus ongoing surveillance remains essential.
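The absolute attributable mortality reported above was regression-adjusted; the crude version of the same contrast is a simple risk difference with a Wald interval (the death counts below are back-calculated approximations from the reported percentages, not figures from the paper):

```python
import math

def risk_difference(d1, n1, d0, n0, z=1.96):
    """Unadjusted absolute risk difference with a Wald 95% CI.
    The paper's 7.7% figure was adjusted for demographic and clinical
    factors via regression; this shows only the crude contrast."""
    p1, p0 = d1 / n1, d0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# Approximate 14-day deaths among EIA-positive vs EIA-negative adults,
# reconstructed from the reported 13% of 2222 and 5% of 20 722:
rd, lo, hi = risk_difference(289, 2222, 1036, 20722)
print(f"{rd:.1%} ({lo:.1%} to {hi:.1%})")
```

The crude difference (about 8%) sits close to the adjusted 7.7%, which is what one would expect when case and control age distributions are broadly similar.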
Escherichia coli bloodstream infections are increasing in the UK and internationally. The evidence base to guide interventions against this major public health concern is small. We aimed to investigate possible drivers of changes in the incidence of E coli bloodstream infection and antibiotic susceptibilities in Oxfordshire, UK, over the past two decades, while stratifying for time since hospital exposure.
In this observational study, we used all available data on E coli bloodstream infections and E coli urinary tract infections (UTIs) from one UK region (Oxfordshire) using anonymised linked microbiological data and hospital electronic health records from the Infections in Oxfordshire Research Database (IORD). We estimated the incidence of infections across a two-decade period and the annual incidence rate ratio (aIRR) in 2016. We modelled the data using negative binomial regression on the basis of microbiological, clinical, and health-care-exposure risk factors. We investigated infection severity, 30-day all-cause mortality, and community and hospital amoxicillin plus clavulanic acid (co-amoxiclav) use to estimate changes in bacterial virulence and the effect of antimicrobial resistance on incidence.
From Jan 1, 1998, to Dec 31, 2016, 5706 E coli bloodstream infections occurred in 5215 patients, and 228 376 E coli UTIs occurred in 137 075 patients. 1365 (24%) E coli bloodstream infections were nosocomial (onset >48 h after hospital admission), 1132 (20%) were quasi-nosocomial (≤30 days after discharge), 1346 (24%) were quasi-community (31–365 days after discharge), and 1863 (33%) were community (>365 days after hospital discharge). The overall incidence increased year on year (aIRR 1·06, 95% CI 1·05–1·06). In 2016, 212 (41%) of 515 E coli bloodstream infections and 3921 (28%) of 13 792 E coli UTIs were co-amoxiclav resistant. Increases in E coli bloodstream infections were driven by increases in community (aIRR 1·10, 95% CI 1·07–1·13; p<0·0001) and quasi-community (aIRR 1·08, 1·07–1·10; p<0·0001) cases. 30-day mortality associated with E coli bloodstream infection decreased over time in the nosocomial (adjusted rate ratio [RR] 0·98, 95% CI 0·96–1·00; p=0·03) group, and remained stable in the quasi-nosocomial (adjusted RR 0·98, 0·95–1·00; p=0·06), quasi-community (adjusted RR 0·99, 0·96–1·01; p=0·32), and community (adjusted RR 0·99, 0·96–1·01; p=0·21) groups. Mortality was, however, substantial at 14–25% across all hospital-exposure groups. Co-amoxiclav-resistant E coli bloodstream infections increased in all groups across the study period (by 11–18% per year, significantly faster than co-amoxiclav-susceptible E coli bloodstream infections; p_heterogeneity<0·0001), as did co-amoxiclav-resistant E coli UTIs (by 14–29% per year; p_heterogeneity<0·0001). Previous-year co-amoxiclav use in primary-care facilities was associated with increased subsequent-year community co-amoxiclav-resistant E coli UTIs (p=0·003).
Increases in E coli bloodstream infections in Oxfordshire are primarily community associated, with substantial co-amoxiclav resistance; nevertheless, we found little or no change in mortality. Focusing interventions on primary care facilities, particularly those with high co-amoxiclav use, could be effective in reducing the incidence of co-amoxiclav-resistant E coli bloodstream infections, in this region and more generally.
National Institute for Health Research.
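The year-on-year aIRRs in this abstract came from negative-binomial regression; the multiplicative trend they describe can be approximated by a crude log-linear least-squares fit to annual counts (a sketch with simulated counts, ignoring denominators, covariates, and overdispersion):

```python
import math

def annual_trend_irr(year_counts):
    """Fit log(count) = a + b*year by ordinary least squares and return
    exp(b), the multiplicative year-on-year change in incidence. The study
    used negative-binomial regression, which this crude fit only
    approximates."""
    years = list(range(len(year_counts)))
    logs = [math.log(c) for c in year_counts]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    slope = sum((y - my) * (lg - ml) for y, lg in zip(years, logs)) / \
        sum((y - my) ** 2 for y in years)
    return math.exp(slope)

# Hypothetical annual case counts growing ~6% per year:
counts = [round(200 * 1.06 ** t) for t in range(10)]
print(f"annual IRR ~ {annual_trend_irr(counts):.2f}")
```

On clean exponential growth the recovered ratio matches the generating 1.06; real counts would also need an offset for catchment population to yield a true incidence trend.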
Summary Background Weekend hospital admission is associated with increased mortality, but the contributions of varying illness severity and admission time to this weekend effect remain unexplored. Methods We analysed unselected emergency admissions to four Oxford University National Health Service hospitals in the UK from Jan 1, 2006, to Dec 31, 2014. The primary outcome was death within 30 days of admission (in or out of hospital), analysed using Cox models measuring time from admission. The primary exposure was day of the week of admission. We adjusted for multiple confounders including demographics, comorbidities, and admission characteristics, incorporating non-linearity and interactions. Models then considered the effect of adjusting for 15 common haematology and biochemistry test results or proxies for hospital workload. Findings 257 596 individuals underwent 503 938 emergency admissions. 18 313 (4·7%) patients admitted as weekday emergency admissions and 6070 (5·1%) patients admitted as weekend emergency admissions died within 30 days (p<0·0001). 9347 individuals underwent 9707 emergency admissions on public holidays. 559 (5·8%) died within 30 days (p<0·0001 vs weekday). 15 routine haematology and biochemistry test results were highly prognostic for mortality. In 271 465 (53·9%) admissions with complete data, adjustment for test results explained 33% (95% CI 21 to 70) of the excess mortality associated with emergency admission on Saturdays compared with Wednesdays, 52% (lower 95% CI 34) on Sundays, and 87% (lower 95% CI 45) on public holidays after adjustment for standard patient characteristics. Excess mortality was predominantly restricted to admissions between 1100 h and 1500 h (p_interaction=0·04). No hospital workload measure was independently associated with mortality (all p values >0·06). Interpretation Adjustment for routine test results substantially reduced excess mortality associated with emergency admission at weekends and public holidays.
Adjustment for patient-level factors not available in our study might further reduce the residual excess mortality, particularly as this clustered around midday at weekends. Hospital workload was not associated with mortality. Together, these findings suggest that the weekend effect arises from patient-level differences at admission rather than reduced hospital staffing or services. Funding NIHR Oxford Biomedical Research Centre.
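One common way to define the "percentage of excess mortality explained" by an extra adjustment is the proportional reduction in the excess hazard; whether the paper used exactly this estimator is an assumption here, and the hazard ratios below are hypothetical:

```python
def excess_explained(hr_before, hr_after):
    """Proportion of the excess hazard (HR - 1) removed by an additional
    adjustment. One common definition of 'percentage explained'; the
    paper's exact estimator and bootstrap CIs may differ."""
    return (hr_before - hr_after) / (hr_before - 1)

# Hypothetical Saturday-vs-Wednesday hazard ratios before and after
# adding the 15 routine test results to the Cox model:
print(f"{excess_explained(1.12, 1.08):.0%}")
```

Under this definition, shrinking a weekend hazard ratio from 1.12 to 1.08 removes a third of the excess, which is the flavour of calculation behind figures such as "explained 33%".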
Use of whole-genome sequencing (WGS) for routine mycobacterial species identification and drug susceptibility testing (DST) is becoming a reality. We compared the performances of WGS and standard laboratory workflows prospectively, by parallel processing at a major mycobacterial reference service over the course of 1 year, for species identification, first-line Mycobacterium tuberculosis resistance prediction, and turnaround time. Among 2,039 isolates with line probe assay results for species identification, 74 (3.6%) failed sequencing or WGS species identification. Excluding these isolates, clinically important species were identified for 1,902 isolates, of which 1,825 (96.0%) were identified as the same species by WGS and the line probe assay. A total of 2,157 line probe test results for detection of resistance to the first-line drugs isoniazid and rifampin were available for 728 M. tuberculosis complex isolates. Excluding 216 (10.0%) cases where there were insufficient sequencing data for WGS to make a prediction, overall concordance was 99.3% (95% confidence interval [CI], 98.9 to 99.6%), sensitivity was 97.6% (91.7 to 99.7%), and specificity was 99.5% (99.0 to 99.7%). A total of 2,982 phenotypic DST results were available for 777 M. tuberculosis complex isolates. Of these, 356 (11.9%) had no WGS comparator due to insufficient sequencing data, and in 154 (5.2%) cases the WGS prediction was indeterminate due to discovery of novel, previously uncharacterized mutations. Excluding these data, overall concordance was 99.2% (98.7 to 99.5%), sensitivity was 94.2% (88.4 to 97.6%), and specificity was 99.4% (99.0 to 99.7%). Median processing times for the routine laboratory tests versus WGS were similar overall, i.e., 20 days (interquartile range [IQR], 15 to 31 days) and 21 days (15 to 29 days), respectively (P = 0.41). In conclusion, WGS predicts species and drug susceptibility with great accuracy, but work is needed to increase the proportion of predictions made.
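The sensitivity and specificity figures with their confidence intervals follow standard confusion-matrix arithmetic; a sketch with hypothetical counts (the Wilson score interval is a typical, assumed choice for such binomial CIs, not necessarily the one used in the paper):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical WGS-vs-phenotype confusion counts for one drug:
sens, spec = sens_spec(tp=81, fn=5, tn=1890, fp=11)
lo, hi = wilson_ci(81, 86)
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%}), specificity {spec:.1%}")
```

The wide sensitivity interval relative to the specificity interval reflects the much smaller number of phenotypically resistant isolates, which is why such studies report sensitivity with broader CIs.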
UK Biobank (UKB) is an international health resource enabling research into the genetic and lifestyle determinants of common diseases of middle and older age. It comprises 500 000 participants. Public Health England's Second Generation Surveillance System is a centralized microbiology database covering English clinical diagnostics laboratories that provides national surveillance of legally notifiable infections, bacterial isolations and antimicrobial resistance. We previously developed secure, pseudonymized, individual-level linkage of these systems. In this study, we implemented rapid dynamic linkage, which allows us to provide a regular feed of new COVID-19 (SARS-CoV-2) test results to UKB to facilitate rapid and urgent research into the epidemiological and human genetic risk factors for severe infection in the cohort. Here, we have characterized the first 1352 cases of COVID-19 in UKB participants, of whom 895 met our working definition of severe COVID-19 as inpatients hospitalized on or after 16 March 2020. We found that the incidence of severe COVID-19 among UKB cases was 27.4% lower than in the general population in England, although this difference varied significantly by age and sex. The total number of UKB cases could be estimated as 0.6% of the publicly announced number of cases in England. We considered how increasing case numbers will affect the power of genome-wide association studies. This new dynamic linkage system has further potential to facilitate the investigation of other infections and the prospective collection of microbiological cultures to create a microbiological biobank (bugbank) for studying the interaction of environment, human and microbial genetics on infection in the UKB cohort.
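Comparing a cohort's incidence against general-population expectation, as in the "27.4% lower" figure above, is in essence a standardized incidence ratio; a sketch using Byar's Poisson approximation for the CI (the expected count below is hypothetical, and the study stratified by age and sex rather than using a single crude ratio):

```python
import math

def standardized_incidence_ratio(observed, expected):
    """Observed/expected case ratio with an approximate 95% CI using
    Byar's approximation for the Poisson-distributed observed count."""
    sir = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - 1.96 / (3 * math.sqrt(observed))) ** 3 / expected
    hi = (observed + 1) * (1 - 1 / (9 * (observed + 1))
                           + 1.96 / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return sir, lo, hi

# Hypothetical: 1352 observed UKB cases against 1862 expected from
# age/sex-specific general-population rates (roughly 27% lower):
sir, lo, hi = standardized_incidence_ratio(1352, 1862)
print(f"SIR {sir:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

In practice the expected count is built by applying population rates to the cohort's age/sex strata and summing, which is where the reported age and sex variation enters.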
A number of pathogens induce immature dendritic cells (iDC) to migrate to lymphoid organs where, as mature DC (mDC), they serve as efficient APC. We hypothesized that pathogen recognition by iDC is mediated by Toll-like receptors (TLRs), and asked which TLRs are expressed during the progression of monocytes to mDC. We first measured mRNA levels for TLRs 1-5 and MD2 (a protein required for TLR4 function) by Northern analysis. For most TLRs, message expression decreased severalfold as monocytes differentiated into iDC, but opposing this trend, TLR3 and MD2 showed marked increases during iDC formation. When iDC were induced to mature with LPS or TNF-alpha, expression of most TLRs transiently increased and then nearly disappeared. Stimulation of iDC, but not mDC, with LPS resulted in the activation of IL-1 receptor-associated kinase, an early component in the TLR signaling pathway, strongly suggesting that LPS signals through a TLR. Surface expression of TLRs 1 and 4, as measured by mAb binding, was very low, corresponding to a few thousand molecules per cell in monocytes, and a few hundred or less in iDC. We conclude that TLRs are expressed in iDC and are involved in responses to at least one pathogen-derived substance, LPS. If TLR4 is solely responsible for LPS signaling in humans, as it is in mice, then its extremely low surface expression implies that it is a very efficient signal transducer in iDC.