INTRODUCTION Healthcare-associated infections (HAI) are among the most common adverse events of medical care. Surveillance of HAI is a key component of successful infection prevention programmes. Conventional surveillance - manual chart review - is resource intensive and limited by concerns about interrater reliability. This has led to the development and use of automated surveillance (AS). Many AS systems are the product of in-house development efforts and are heterogeneous in design and methods. With this roadmap, the PRAISE network aims to provide guidance on how to move AS from the research setting to large-scale implementation, and how to ensure the delivery of surveillance data that are uniform and useful for improving quality of care. METHODS The PRAISE network brings together 30 experts from ten European countries. This roadmap is based on the outcome of two workshops, teleconference meetings and review by an independent panel of international experts. RESULTS This roadmap focuses on the surveillance of HAI within networks of healthcare facilities for the purposes of comparison, prevention and quality improvement initiatives. The roadmap does the following: discusses the selection of surveillance targets and different organizational and methodological approaches, with their advantages, disadvantages and risks; defines key performance requirements of AS systems and suggestions for their design; provides guidance on successful implementation and maintenance; and discusses areas of future research and training requirements for infection prevention and related disciplines. The roadmap is supported by accompanying documents on the governance and information technology aspects of implementing AS. CONCLUSIONS Large-scale implementation of AS requires guidance and coordination within and across surveillance networks.
Transitions to large-scale AS entail redevelopment of surveillance methods and their interpretation, intensive dialogue with stakeholders and the investment of considerable resources. This roadmap can be used to guide future steps towards implementation, including designing solutions for AS and practical guidance checklists.
The appropriate use of facemasks, recommended or mandated by authorities, is critical to preventing the spread of COVID-19 in the community. We aimed to evaluate the frequency and quality of facemask use in the general population.
A multi-site observational study was carried out from June to July 2020 in the west of France. An observer was positioned at a predetermined place, facing a landmark, and all individuals passing between the observer and the landmark were included. The observer collected information on facemask use (type, quality of positioning), location and demographic characteristics.
A total of 3354 observations were recorded. A facemask was worn by 56.4% (n = 1892) of individuals; these were surgical facemasks in 56.8% (n = 1075) and cloth masks in 43.2% (n = 817) of cases. The facemask was correctly positioned in 75.2% (n = 1422) of cases. The factors independently associated with wearing a facemask were being indoors (adjusted odds ratio [aOR], 2.7; 95% confidence interval [CI], 2.28-3.19), being in a mandatory area (aOR, 6.92; 95% CI, 5-9.7), female gender (aOR, 1.75; 95% CI, 1.54-2.04), age 41-65 years (aOR, 1.7; 95% CI, 1.43-2.02) and age > 65 years (aOR, 2.28; 95% CI, 1.83-2.85). The factors independently associated with correct mask positioning were rural location (aOR, 1.38; 95% CI, 1.07-1.79), being in an indoor area (aOR, 1.85; 95% CI, 1.49-2.3), use of a cloth mask (aOR, 1.53; 95% CI, 1.23-1.91), and age > 40 years (aOR, 1.75; 95% CI, 1.37-2.23).
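Adjusted odds ratios like those above come from multivariable logistic regression, but the underlying quantity - a crude odds ratio with its Wald confidence interval - can be illustrated from a simple 2x2 table. The counts below are hypothetical, chosen only for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (e.g., mask worn indoors vs. outdoors):
print(odds_ratio_ci(420, 180, 300, 350))
```

The adjusted ORs in the abstract additionally control for the other covariates, which this crude calculation does not.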
During the initial phase of the COVID-19 pandemic, the frequency and quality of facemask wearing remained low in the community setting. Young people in general, and men in particular, represent the priority targets for information campaigns. Simplifying the rules to require universal mandatory facemasking seemed to be the best approach for health authorities.
Background The duration of gastrointestinal colonization with extended-spectrum β-lactamase-producing Enterobacteriaceae (ESBL-E) may play a major role in the spread of these organisms. We evaluated the time to, and factors associated with, ESBL-E clearance after hospital discharge. Methods We retrospectively reviewed prospective surveillance results obtained over 14 years in a 1,000-bed hospital. The surveillance collected demographic, hospital stay, microbiologic, and outcome data. An automatic alert system identified readmitted patients with prior ESBL-E carriage. ESBL-E clearance was defined as a negative rectal screening sample at readmission with no new positive clinical sample during the stay. Variables associated with ESBL-E clearance were identified using a Cox model. Results We included 1,884 patients with 2,734 admissions. The 448 patients who underwent readmission screening (1 to 16 readmissions each) formed the basis for the study; 180 (40%) of them were persistent carriers. The median time to ESBL-E clearance was 6.6 months. The variables independently associated with clearance were having the first positive culture in a screening sample only (adjusted hazard ratio, 1.31; 95% confidence interval, 1.02-1.69; P = .04) and admission during the 2005-2010 period (hazard ratio, 1.88; 95% confidence interval, 1.33-2.67; P < .01). Conclusion We found a long duration of ESBL-E carriage after hospital discharge. An automatic alert system was useful for identifying, screening, and isolating previous ESBL-E carriers.
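A median time to clearance of this kind is typically estimated with a Kaplan-Meier curve, since some patients are censored (lost to follow-up or still colonized at last screening) before clearance is observed. Below is a minimal sketch of that estimator; the follow-up times (in months) and event indicators are hypothetical, not the study's data:

```python
def km_median(times, events):
    """Kaplan-Meier median: smallest time at which the estimated probability
    of remaining colonized drops to 0.5 or below.
    events: 1 = clearance observed at that time, 0 = censored."""
    # Process events before censorings at tied times, per the usual convention.
    data = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    s, at_risk = 1.0, len(data)
    for t, e in data:
        if e:
            s *= (at_risk - 1) / at_risk  # survival curve steps down at each event
            if s <= 0.5:
                return t
        at_risk -= 1
    return None  # median not reached within follow-up

# Hypothetical follow-up times (months) and clearance indicators:
print(km_median([2, 4, 5, 7, 9, 12], [1, 1, 0, 1, 1, 0]))  # → 7
```

The Cox model mentioned in the abstract then relates covariates to the hazard of clearance; this sketch covers only the unadjusted median.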
Major catheter-related infection includes catheter-related bloodstream infection and clinical sepsis without bloodstream infection that resolves after catheter removal, with a positive quantitative tip culture. Insertion site dressings are a major means of reducing catheter infections acquired by the extraluminal route. However, the importance of dressing disruption in the occurrence of major catheter-related infection has never been studied in a large cohort of patients.
A secondary analysis of a randomized multicenter trial was performed in order to determine the importance of dressing disruption on the risk for development of catheter-related bloodstream infection.
Among the 1,419 patients (3,275 arterial or central-vein catheters) included, we identified 296 colonized catheters, 29 major catheter-related infections, and 23 catheter-related bloodstream infections. Of the 11,036 dressing changes, 7,347 (67%) were performed before the planned date because of soiling or undressing. Dressing disruption occurred more frequently in patients with higher Sequential Organ Failure Assessment scores and in patients receiving renal replacement therapy; it was less frequent in males and in patients admitted for coma. Subclavian access protected against dressing disruption. Dressing cost (especially staff cost) was inversely related to the rate of disruption. The number of dressing disruptions was related to an increased risk of colonization of the skin around the catheter at removal (p < .0001). The risk of major catheter-related infection and catheter-related bloodstream infection increased by more than three-fold after the second dressing disruption and by more than ten-fold if the final dressing was disrupted, independently of other risk factors for infection.
Disruption of catheter dressings was common and was an important risk factor for catheter-related infections. These data support the preferential use of the subclavian insertion site and enhanced efforts to reduce dressing disruption in postinsertion bundles of care.
To determine the effect of a 2-yr multifaceted program aimed at preventing ventilator-acquired pneumonia on compliance with eight targeted preventive measures.
Pre- and postintervention observational study.
A 20-bed medical intensive care unit in a teaching hospital.
A total of 1649 ventilator-days were observed.
The program involved all healthcare workers and included a multidisciplinary task force, an educational session, direct observations with performance feedback, technical improvements, and reminders. It focused on eight targeted measures that were based on well-recognized published guidelines, corresponded to easily and precisely defined acts, and directly concerned healthcare workers' bedside behavior. Compliance assessment consisted of five 4-wk periods (before the intervention and 1 month, 6 months, 12 months, and 24 months thereafter).
Compliance with hand hygiene and glove-and-gown use was initially high (68% and 80%, respectively) and remained stable over time. Compliance with all other preventive measures was initially low and increased steadily over time (before vs. 2-yr levels, p < .0001): backrest elevation (5% to 58%) and tracheal cuff pressure maintenance (40% to 89%), which improved after implementation of simple technical equipment; orogastric tube use (52% to 96%); avoidance of gastric overdistension (20% to 68%); good oral hygiene (47% to 90%); and elimination of nonessential tracheal suction (41% to 92%). To assess overall performance on the last six preventive measures, using ventilator-days as the unit of analysis, a composite score for preventive measures applied (range, 0-6) was developed. The median (interquartile range) composite scores for the five successive assessments were 2 (1-3), 4 (3-5), 4 (4-5), 5 (4-6), and 5 (4-6) points and increased significantly over time (p < .0001). The prevalence rate of ventilator-acquired pneumonia decreased by 51% after the intervention (p < .0001).
Our active, long-lasting program for preventing ventilator-acquired pneumonia successfully increased compliance with preventive measures directly dependent on healthcare workers' bedside performance. The multidimensional framework was critical for this marked, progressive, and sustained change.
Background Spread of resistant bacteria causes severe morbidity and mortality. Stringent control measures can be expensive and disrupt hospital organization. In the present study, we assessed the effectiveness and cost-effectiveness of control strategies to prevent the spread of carbapenemase-producing Enterobacterales (CPE) in a general hospital ward (GW). Methods A dynamic, stochastic model simulated the transmission of CPE via the hands of healthcare workers (HCWs) and the environment in a hypothetical 25-bed GW. Input parameters were based on published data; we assumed a prevalence at admission of 0.1%. Twelve strategies were compared to the baseline (no control), combining different prevention and control interventions: targeted or universal screening at admission (TS or US), contact precautions (CP), isolation in a single room, dedicated nursing staff (DNS) for carriers, and weekly screening of contact patients (WSC). The time horizon was one year. Outcomes were the number of CPE acquisitions, costs, and incremental cost-effectiveness ratios (ICERs). A hospital perspective was adopted to estimate costs, which included laboratory costs, single rooms, contact precautions, staff time (i.e., infection control nurse and/or dedicated nursing staff), and bed-days lost to the prolonged hospital stay of identified carriers. The model was calibrated on actual datasets. Sensitivity analyses were performed. Results The baseline scenario resulted in 0.93 CPE acquisitions/1000 admissions and cost 32,050 euro/1000 admissions. All control strategies increased costs and improved the outcome. The efficiency frontier was represented by: (1) TS with DNS, at 17,407 euro per avoided CPE case; (2) TS + DNS + WSC, at 30,700 euro per avoided CPE case; and (3) US + DNS + WSC, at 181,472 euro per avoided CPE case. The other strategies were dominated. Sensitivity analyses showed that TS + CP might be cost-effective if CPE carriers are identified upon admission or if cases have a short hospital stay.
However, CP were effective only when a high level of compliance with hand hygiene was achieved. Conclusions Targeted screening at admission combined with DNS for identified CPE carriers, with or without weekly screening, were the most cost-effective options to limit the spread of CPE. These results support current recommendations from several high-income countries. Keywords: Cross-transmission, Carbapenemase-producing Enterobacterales, Hand disinfection, Mathematical model, Cost-effectiveness, Control strategies, France
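An ICER of the kind reported above is the extra cost divided by the extra health effect when comparing a strategy with the next-best non-dominated one. A minimal sketch of the calculation, using hypothetical per-1000-admission figures rather than the model's actual outputs:

```python
def icer(ref_cost, ref_effect, alt_cost, alt_effect):
    """Incremental cost-effectiveness ratio: additional cost per additional
    unit of effect (here, per CPE acquisition avoided)."""
    return (alt_cost - ref_cost) / (alt_effect - ref_effect)

# Hypothetical figures per 1000 admissions (illustration only):
# baseline (no control):                 cost 32,050 EUR, 0 acquisitions avoided
# targeted screening + dedicated staff:  cost 45,000 EUR, 0.74 acquisitions avoided
print(round(icer(32050, 0.0, 45000, 0.74)), "EUR per avoided CPE case")
```

A strategy is (strongly) dominated when another strategy avoids at least as many cases at a lower cost; only non-dominated strategies sit on the efficiency frontier and get an ICER.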
A matched case-control study was performed to identify risk factors for acquiring multidrug-resistant Pseudomonas aeruginosa (MDRPA) in intensive care unit (ICU) patients during a 2-year period. MDRPA was defined as P. aeruginosa with combined decreased susceptibility to piperacillin, ceftazidime, imipenem, and ciprofloxacin. Thirty-seven patients who were colonized or infected with MDRPA were identified, 34 of whom were matched with 34 control patients who had cultures that showed no growth of P. aeruginosa. Matching criteria were severity of illness and length of ICU stay, with each control patient staying in the ICU for at least as long as the time period between the corresponding case patient's admission to the ICU and the acquisition of MDRPA. Baseline demographic and clinical characteristics and the use of invasive procedures were similar for case patients and control patients. Multivariate analysis identified duration of ciprofloxacin treatment as an independent risk factor for MDRPA acquisition, whereas the duration of treatment with imipenem was of borderline significance. These data support a major role for the use of antibiotics with high antipseudomonal activity, particularly ciprofloxacin, in the emergence of MDRPA.
The aims of this study were, first, to identify risk factors for microbiology-proven postoperative pneumonia after cardiac surgery and, second, to develop and validate a preoperative scoring system for the risk of postoperative pneumonia.
A single-center cohort study.
All consecutive patients undergoing cardiac surgery between January 2006 and July 2011.
None.
Multivariate analysis of risk factors for postoperative pneumonia was performed on data from patients operated on between January 2006 and December 2008 (training set). External temporal validation was performed on data from patients operated on between January 2009 and July 2011 (validation set). Preoperative variables identified in multivariate analysis of the training set were then used to develop a preoperative scoring system that was validated on the validation set. Postoperative pneumonia occurred in 174 of the 5,582 patients (3.1%; 95% CI, 2.7-3.6). Multivariate analysis identified four risk factors for postoperative pneumonia: age (odds ratio, 1.02; 95% CI, 1.01-1.03), chronic obstructive pulmonary disease (odds ratio, 2.97; 95% CI, 1.8-4.71), preoperative left ventricular ejection fraction (odds ratio, 0.98; 95% CI, 0.96-0.99), and the interaction between RBC transfusion during surgery and duration of cardiopulmonary bypass (odds ratio, 2.98; 95% CI, 1.96-4.54). A 6-point score including the three preoperative variables then defined two risk groups corresponding to postoperative pneumonia rates of 1.8% (score < 3) and 6.5% (score ≥ 3).
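The reported pneumonia rate and its confidence interval can be reproduced with a standard Wald interval for a binomial proportion; a quick sketch checking 174 events among 5,582 patients:

```python
import math

def wald_ci(events, n, z=1.96):
    """Point estimate and Wald 95% CI for a binomial proportion."""
    p = events / n
    half = z * math.sqrt(p * (1 - p) / n)  # half-width of the interval
    return p, p - half, p + half

p, lo, hi = wald_ci(174, 5582)
print(f"{p:.1%} ({lo:.1%}-{hi:.1%})")  # → 3.1% (2.7%-3.6%)
```

This matches the abstract's figure of 3.1% (95% CI, 2.7-3.6); for small event counts a Wilson or exact interval would be preferred over the Wald approximation.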
Assessing preoperative risk factors for postoperative pneumonia with the proposed scoring system could help to implement a preventive policy in high-risk patients with a risk of postoperative pneumonia greater than 4% (i.e., patients with a score ≥ 3).