In this study, 1208 Campylobacter jejuni and C. coli isolates from humans and 400 isolates from chickens, collected in two separate periods over 12 years in the Netherlands, were typed using multilocus sequence typing (MLST). Statistical evidence was found for a shift in ST frequencies in human isolates over time. The human MLST data were also compared with published data from other countries to determine geographical variation. Because, in addition to the human data, MLST-typed data were available only for chicken isolates from the same time point and spatial location, MLST datasets for other Campylobacter reservoirs from selected countries were used. The selection was based on the degree of similarity of the human isolates between countries. The main aim of this study was to better understand the consequences of using non-local or non-recent MLST data for attributing domestically acquired human Campylobacter infections to specific sources of origin when applying the asymmetric island model for source attribution. In addition, a power analysis was done to find the minimum number of source isolates needed to perform source attribution using an asymmetric island model. This study showed that using source data from other countries can substantially bias the attribution results, so it is important to select data carefully when the available local data are lacking in quality and/or quantity. Methods aimed at reducing this bias were proposed.
Quantitative microbiological risk assessment (QMRA) allows evaluating the public health impact of food safety targets to support the control of foodborne pathogens. We estimate the risk reduction of setting microbiological criteria (MCs) for Campylobacter on broiler meat in 25 European countries, applying quantitative data from the 2008 EU baseline survey. We demonstrate that risk-based MCs can be derived without explicit consideration of Food Safety Objectives or Performance Objectives. Published QMRA models for the consumer phase and dose response provide a relation between the Campylobacter concentration on skin samples and the corresponding probability of illness for the consumer. Probabilistic modelling is used to evaluate a set of potential MCs. We present the percentage of batches not complying with the potential criteria in relation to the risk reduction achieved by fully effective treatment of these batches. We find different risk estimates and different impacts of MCs in different countries, which offers a practical and flexible tool for risk managers to select the most appropriate MC by weighing the costs (i.e. non-compliant batches) against the benefits (i.e. reduction in public health risk). Our analyses show that the estimated percentage of batches not complying with the MC is better correlated with the risk estimate than surrogate risk measures such as the flock prevalence or the arithmetic mean concentration of bacteria on carcasses, and would therefore be a good measure of the risk of Campylobacter on broiler meat in a particular country. Two uncertain parameters in the model are the ratio of within- and between-flock variances in concentrations, and the factor for converting skin sample concentrations to concentrations on the meat.
Sensitivity analyses show that these parameters have a considerable effect on our results, but the impact of their uncertainty is small compared to that of the parameters defining the Microbiological Criterion and the concentration on the meat.
► We evaluate microbiological criteria (MCs) for Campylobacter by risk assessment.
► We compare the impact of MCs in 25 European countries.
► We show that risk-based MCs can be derived without consideration of FSO or PO.
► We find that MCs correlate better with risk than prevalence and mean concentration.
► Our model offers a useful practical tool for food safety risk managers.
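The cost-benefit trade-off described above can be sketched numerically. The snippet below assumes a hypothetical lognormal between-batch distribution of log10 concentrations and a crude power-law stand-in for the published dose-response relation; none of the parameter values come from the baseline survey or the QMRA models.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical between-batch distribution of mean log10 Campylobacter
# concentrations on skin samples (log10 CFU/g); mu and sigma are
# invented, not taken from the 2008 EU baseline survey.
mu, sigma = 2.0, 1.2
batch_log10 = rng.normal(mu, sigma, size=100_000)

def fraction_non_compliant(limit_log10):
    """Fraction of batches whose concentration exceeds the MC limit."""
    return float(np.mean(batch_log10 > limit_log10))

def relative_risk_reduction(limit_log10, r=0.3):
    """Relative risk reduction if all non-compliant batches receive a
    fully effective treatment. Risk per batch is taken proportional to
    10**(r * log10 C), a crude stand-in for the published QMRA
    dose-response relation."""
    risk = 10 ** (r * batch_log10)
    compliant = batch_log10 <= limit_log10
    return float(1 - risk[compliant].sum() / risk.sum())

for limit in (2.0, 3.0, 4.0):
    print(f"limit 10^{limit:.0f} CFU/g: "
          f"{fraction_non_compliant(limit):.1%} non-compliant, "
          f"{relative_risk_reduction(limit):.1%} risk reduction")
```

A stricter limit rejects more batches but also removes a larger share of the risk, which is exactly the trade-off a risk manager weighs when selecting an MC.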
The presence of Salmonella in poultry litter, when used as a biological soil amendment, presents a risk for the preharvest contamination of fresh produce. Poultry litter is rich in organic nitrogen, and previous studies have suggested that ammonia (NH3) in poultry litter may affect the survival of Salmonella. Salmonella enterica serovar Typhimurium was inoculated into buffer solutions to characterize the pH dependency, minimum antimicrobial concentration, and efficacy of NH3 production. In solutions with 0.4 M total ammonia nitrogen (TAN) at various pH levels (5, 7, 8, and 9), significant inactivation of Salmonella only occurred at pH 9. Salmonella was reduced by ∼8 log CFU/mL within 12 to 18 h at 0.09, 0.18, 0.26, and 0.35 M NH3. The minimum antimicrobial concentration tested was 0.04 M NH3, resulting in an ∼7 log CFU/mL reduction after 24 h. Solutions with urea (1% and 2%) and urease enzymes rapidly produced NH3, which significantly reduced Salmonella within 12 h. The urease-producing bacterium Corynebacterium urealyticum showed no antagonistic effects against Salmonella in solution. However, with 1% urea added, C. urealyticum rapidly produced NH3 in solution and significantly reduced Salmonella within 12 h. Salmonella inactivation data were nonlinear and fitted to Weibull models (Weibull, Weibull with tailing effects, and double Weibull) to describe their inactivation kinetics. These results suggest that high NH3 levels in poultry litter may reduce the risk of contamination in this biological soil amendment. This study will guide future research on the influence of ammonia on the survival and persistence of Salmonella in poultry litter.

IMPORTANCE: Poultry litter is a widely used biological soil amendment in the production of fresh produce. However, poultry litter may contain human pathogens, such as Salmonella, which introduces the risk of preharvest produce contamination in agricultural fields.
Ammonia in poultry litter, produced through bacterial degradation of urea, may be detrimental to the survival of Salmonella; however, these effects are not fully understood. This study utilized aqueous buffer solutions to demonstrate that the antimicrobial efficacy of ammonia against Salmonella is dependent on alkaline pH levels, where increasing concentrations of ammonia led to more rapid inactivation. Inactivation was also demonstrated in the presence of urea and urease or urease-producing Corynebacterium urealyticum. These findings suggest that high levels of ammonia in poultry litter may reduce the risk of contamination in biological soil amendments and will guide further studies on the survival and persistence of Salmonella in poultry litter.
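The pH dependency and the Weibull kinetics reported above can be illustrated with two small functions: the Henderson-Hasselbalch fraction of un-ionized NH3 (pKa ≈ 9.25 for NH4+/NH3 near room temperature) and the basic Weibull survival model. The delta and p parameters below are illustrative only, not fitted to the study's data.

```python
def nh3_fraction(pH, pKa=9.25):
    """Henderson-Hasselbalch: fraction of total ammonia nitrogen present
    as un-ionized NH3 (pKa ~9.25 for NH4+/NH3 near room temperature)."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

def weibull_log10_survival(t, delta, p):
    """Weibull inactivation model: log10(N_t / N_0) = -(t / delta)**p.
    delta scales time; p < 1 gives tailing, p > 1 a shoulder."""
    return -((t / delta) ** p)

# At 0.4 M TAN only pH 9 leaves an appreciable NH3 concentration,
# consistent with inactivation being observed only at alkaline pH.
for pH in (5, 7, 8, 9):
    print(f"pH {pH}: {0.4 * nh3_fraction(pH):.4f} M NH3")

# Illustrative (not fitted) Weibull parameters giving roughly an 8-log
# reduction within 12 to 18 h.
for t in (0, 6, 12, 18):
    print(f"{t:2d} h: {weibull_log10_survival(t, delta=2.8, p=1.2):.2f} log10")
```

At pH 7 only about half a percent of the TAN is un-ionized NH3, versus roughly a third at pH 9, which is why total ammonia alone does not predict the antimicrobial effect.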
To systematically review the methodology of general burden of disease studies. Three key questions were addressed: 1) what was the quality of the data, 2) which methodological choices were made to calculate disability-adjusted life years (DALYs), and 3) were uncertainty and risk factor analyses performed? Furthermore, DALY outcomes of the included studies were compared.
Burden of disease studies (1990 to 2011) in international peer-reviewed journals and in grey literature were identified with main inclusion criteria being multiple-cause studies that quantified the burden of disease as the sum of the burden of all distinct diseases expressed in DALYs. Electronic database searches included Medline (PubMed), EMBASE, and Web of Science. Studies were collated by study population, design, methods used to measure mortality and morbidity, risk factor analyses, and evaluation of results.
Thirty-one studies met the inclusion criteria of our review. Overall, studies followed the Global Burden of Disease (GBD) approach. However, considerable variation existed in disability weights, discounting, age-weighting, and adjustments for uncertainty. Few studies reported whether mortality data were corrected for missing data or underreporting. Comparison with the GBD DALY outcomes by country revealed that for some studies DALY estimates were of similar magnitude; others reported DALY estimates that were two times higher or lower.
Overcoming "error" variation due to the use of different methodologies and low-quality data is a critical priority for advancing burden of disease studies. Doing so would improve the detection of true variation in DALY outcomes between populations or over time.
In 2009, the European Centre for Disease Prevention and Control initiated the 'Burden of Communicable Diseases in Europe (BCoDE)' project to generate evidence-based and comparable burden-of-disease estimates of infectious diseases in Europe. The burden-of-disease metric used was the Disability-Adjusted Life Year (DALY), composed of years of life lost due to premature death (YLL) and years lived with disability (YLD). To better represent infectious diseases, a pathogen-based approach was used, linking incident cases to sequelae through outcome trees. Health outcomes were included if an evidence-based causal relationship between infection and outcome was established. Life expectancy and disability weights were taken from the Global Burden of Disease Study and alternative studies. Disease progression parameters were based on literature. Country-specific incidence was based on surveillance data corrected for underestimation. Non-typhoidal Salmonella spp. and Campylobacter spp. were used for illustration. Using the incidence- and pathogen-based DALY approach, the total annual burden in the Netherlands (average of 2005-2007) was estimated at 730 DALYs for Salmonella spp. and 1,780 DALYs for Campylobacter spp. Sequelae accounted for 56% and 82% of the total burden of Salmonella spp. and Campylobacter spp., respectively. For infectious diseases, the incidence- and pathogen-based DALY methodology allows a more comprehensive calculation of the disease burden, as subsequent sequelae are fully taken into account. Not considering subsequent sequelae would strongly underestimate the burden of infectious diseases. Estimates can be used to support prioritisation and comparison of infectious diseases and other health conditions, both within a country and between countries.
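As a minimal sketch of the incidence- and pathogen-based calculation: DALY = YLL + YLD, with YLD summed over the acute illness and its sequelae along an outcome tree. All probabilities, durations, and weights below are invented for illustration; none are BCoDE estimates.

```python
# DALY = YLL + YLD, with YLD summed over the health outcomes of the
# pathogen's outcome tree (acute illness plus sequelae). Every number
# below is invented for illustration; none are BCoDE estimates.
def yld(cases, probability, duration_years, disability_weight):
    return cases * probability * duration_years * disability_weight

def yll(deaths, residual_life_expectancy):
    return deaths * residual_life_expectancy

outcomes = [
    # (name, probability given infection, duration in years, disability weight)
    ("gastroenteritis", 1.00, 0.012, 0.07),
    ("reactive arthritis", 0.02, 0.600, 0.15),  # hypothetical sequela
]
cases, deaths, residual_le = 50_000, 40, 35.0

total_yld = sum(yld(cases, p, d, w) for _, p, d, w in outcomes)
total_daly = total_yld + yll(deaths, residual_le)
print(f"YLD = {total_yld:.1f}, DALY = {total_daly:.1f}")
```

The sequela term shows why omitting subsequent outcomes underestimates the burden: each sequela adds its own probability-weighted YLD on top of the acute illness.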
Employees in different types of work may be intentionally or accidentally exposed to biological agents. Improved risk assessment is needed to identify opportunities to prevent work-related infectious disease. The objective of the current study was to perform a systematic literature review of work-related infectious disease to assist in the identification of occupational infectious disease risks. A literature search of papers on work-related infectious disease published between 1999 and 2008 yielded 1239 papers of which 242 met the selection criteria and were included in the review. The results of the systematic literature review were arranged in a matrix of occupational groups and exposure pathways. Increased risk from infectious diseases appeared to be concentrated in specific professions. Healthcare workers, workers in contact with animals, laboratory workers and refuse workers seem to have the highest risk of infection by a variety of pathogens. However, pathogens reported to be associated with closely related professions were different, indicating qualitative under-reporting. Arranging the results of this systematic review on work-related infectious diseases in a matrix of occupational groups and exposure pathways allowed the reliable identification of exposure hazards for specific occupational groups beyond currently reported diseases.
Fresh produce that is contaminated with viruses may lead to infection and viral gastroenteritis or hepatitis when consumed raw. It is thus important to reduce virus numbers on these foods. Prevention of virus contamination in fresh produce production and processing may be more effective than treatment, as sufficient virus removal or inactivation by post-harvest treatment requires high doses that may adversely affect food quality. To date, knowledge of the contribution of various potential contamination routes is lacking. A risk assessment model was developed for human norovirus, hepatitis A virus and human adenovirus in raspberry and salad vegetable supply chains to quantify the contributions of potential contamination sources to the contamination of produce at retail. These models were used to estimate public health risks. Model parameterization was based on monitoring data from European supply chains and literature data. No human pathogenic viruses were found in the soft fruit supply chains; human adenovirus (hAdV) was detected and was additionally monitored as an indicator of fecal pollution to assess the contribution of potential contamination points. Estimated risks per serving of lettuce based on the models were 3×10−4 (6×10−6–5×10−3) for NoV infection and 3×10−8 (7×10−10–3×10−6) for hepatitis A jaundice. The contribution of hand contact to virus contamination was larger than the contributions of irrigation, the conveyor belt, or the water used for produce rinsing. In conclusion, viral contamination occurred in the lettuce and soft fruit supply chains, and the estimated health risks were generally low. Nevertheless, the 97.5% upper limit for the estimated NoV contamination of lettuce suggested that infection risks of up to 50% per serving might occur. Given the monitoring results, our study suggests that full compliance with hand hygiene will do more than action on the other examined sources to improve the virus safety of fresh produce.
This effect will be further aided by compliance with other hygiene and water quality regulations in production and processing facilities.
• Six fresh produce supply chains are modeled to assess consumer health risks.
• Exposure to human norovirus and hepatitis A virus on fresh produce is considered.
• Parameters estimated by field samples from monitoring, experiments and literature.
• Handlers' hands were found to contribute most to produce contamination.
• Estimated health risks were generally <10−3 per exposure event.
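A per-serving risk estimate of this kind is typically obtained by propagating a contamination distribution through a dose-response model. The sketch below uses an approximate Beta-Poisson form with made-up alpha and beta parameters and a hypothetical lognormal dose distribution; it reproduces the shape of the calculation, not the study's numbers.

```python
import numpy as np

rng = np.random.default_rng(1)

def beta_poisson(dose, alpha=0.1, beta=30.0):
    """Approximate Beta-Poisson dose-response: P(infection | dose).
    alpha and beta are placeholder values, not the study's parameters."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Hypothetical lognormal contamination per serving (genome copies);
# mean and sigma are invented for illustration.
dose = rng.lognormal(mean=0.0, sigma=2.0, size=100_000)
risk = beta_poisson(dose)
print(f"mean risk/serving: {risk.mean():.2e}, "
      f"97.5th percentile: {np.percentile(risk, 97.5):.2e}")
```

Reporting both the mean and an upper percentile mirrors the abstract's presentation: a low typical risk can coexist with a heavy upper tail driven by rare, highly contaminated servings.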
Microbial contamination of fresh produce (fresh fruits and vegetables) poses serious public health concerns worldwide. This study was conducted as a comprehensive analysis of biological hazards in the global fresh produce chain. Data about produce-related outbreaks and illness were collected from the annual reports and databases of foodborne outbreak surveillance systems in different regions and countries from 2010 to 2015. The global patterns of and regional differences in documented outbreaks and cases were analyzed, and produce commodities and pathogens of greatest concern were identified. Data on sporadic illnesses were also collected through a comprehensive literature review of case-control studies. We found 988 produce-related outbreaks (with known agents) and 45,723 cases in all regions and countries. The numbers of produce-related outbreaks per million person-years were approximately 0.76, 0.26, 0.25, 0.13, 0.12, and 0.05 in New Zealand, Australia, the United States, the European Union, Canada, and Japan, respectively. The top three food categories and pathogens contributing to produce-related outbreaks were vegetables and nonfruits (i.e., food other than fruits; 27.0%), unspecified vegetables (12.2%), and vegetable row crops (11.7%) and norovirus (42.4%), Salmonella enterica (19.9%), and Staphylococcus aureus (7.9%), respectively. Produce consumption was identified as a protective factor, a risk factor, and either a protective or risk factor for sporadic illnesses in 11, 5, and 5 studies, respectively, among 21 case-control studies. Risks associated with produce consumption in the United States and the European Union have been linked to various factors such as irrigation water, cross-contamination, storage time and temperature abuse, infected food handlers, and unprocessed contaminated ingredients.
The results of the current study indicate the complexity of produce products consumed across the globe and the difficulty in tracing illnesses back to specific food ingredients.
The causes of differences in Campylobacter and Escherichia coli concentrations on broiler chicken carcasses after chilling between slaughterhouses are not fully identified. Therefore, it is a challenge for slaughterhouses to comply with Process Hygiene Criteria for broiler meat.
The aim of the study was to identify which processing steps contribute to increases or decreases in Campylobacter and E. coli concentrations within and between two slaughterhouses. Identifying the processing steps with variable performance could explain the differences in bacterial concentrations after chilling between slaughterhouses.
Thermotolerant Campylobacter and E. coli concentrations on carcasses during broiler processing were measured during the summer period in 21 trials after bleeding, scalding, defeathering, evisceration and chilling.
In two slaughterhouses with comparable Campylobacter and E. coli concentrations in the incoming batches (after bleeding), the mean log10 concentrations were found to be significantly different after chilling. Campylobacter concentrations decreased by 1.40 log10 in Slaughterhouse 1 and by 1.86 log10 in Slaughterhouse 2, whereas E. coli decreased by 2.19 log10 in Slaughterhouse 1 and by 2.84 log10 in Slaughterhouse 2. Higher concentrations of Campylobacter and E. coli on carcasses after chilling were observed in Slaughterhouse 1, in which an increase in concentrations was observed after evisceration. The effect of processing on Campylobacter and E. coli concentrations in Slaughterhouse 1 did not differ between batches. In Slaughterhouse 2, the effect of processing on the concentrations of both bacteria varied between batches. Changes in E. coli concentrations during processing were similar to those of Campylobacter, except for defeathering: E. coli concentrations significantly decreased after defeathering in both slaughterhouses, whereas Campylobacter concentrations increased in Slaughterhouse 2 and showed no significant change in Slaughterhouse 1.
The patterns of increases and decreases in bacterial concentrations during processing are specific to each slaughterhouse. These inhomogeneous patterns potentially explain the differences in concentrations after chilling between slaughterhouses. Critical processing steps should be validated in each slaughterhouse by longitudinal studies, potentially based on E. coli. E. coli has potential as an indicator of processing hygiene, because the impact of most of the studied processing steps was similar to that for Campylobacter.
• Changes in concentrations during processing were specific to each slaughterhouse.
• Increase in Campylobacter after defeathering and evisceration was not always observed.
• Effect of processing between batches differed in the slaughterhouses evaluated.
• Processing impacted Campylobacter and E. coli similarly, except for defeathering.
• E. coli concentrations decreased after defeathering, in contrast to Campylobacter.
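The step-by-step changes reported above amount to differences of mean log10 concentrations between consecutive sampling points. A minimal sketch, with invented concentrations rather than the measured ones:

```python
# Hypothetical mean log10 CFU per carcass after successive processing
# steps in one slaughterhouse; the values are invented, not measured.
steps = {
    "bleeding": 5.1,
    "scalding": 4.2,
    "defeathering": 4.6,
    "evisceration": 4.9,
    "chilling": 3.7,
}
names = list(steps)
for prev, nxt in zip(names, names[1:]):
    print(f"{prev} -> {nxt}: {steps[nxt] - steps[prev]:+.2f} log10")
print(f"overall (bleeding -> chilling): "
      f"{steps['chilling'] - steps['bleeding']:+.2f} log10")
```

Positive step changes (as after defeathering or evisceration here) flag potential cross-contamination points even when the overall change from bleeding to chilling is a net reduction.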
The disability-adjusted life year (DALY) is widely used to assess the burden of different health problems and risk factors. The disability weight, a value anchored between 0 (perfect health) and 1 (equivalent to death), is necessary to estimate the disability component (years lived with disability, YLDs) of the DALY. After publication of the ground-breaking Global Burden of Disease (GBD) 1996 study, alternative sets of disability weights have been developed over the past 16 years, each using different approaches with regard to the panel, health state description, and valuation methods. The objective of this study was to review all studies that developed disability weights and to critically assess the methodological design choices (health state and time description, panel composition, and valuation method). Furthermore, disability weights of eight specific conditions were compared.
Disability weight studies (1990–2012) in international peer-reviewed journals and grey literature were identified, with the main inclusion criterion being that the study assessed DALY disability weights for several conditions or a specific group of illnesses. Studies were collated by design, methods, and evaluation of results.
Twenty-two studies met the inclusion criteria of our review. There is considerable variation in the methods used to derive disability weights, although most studies used a disease-specific description of the health state, a panel consisting of medical experts, and a non-preference-based valuation method to assess the values for the majority of the disability weights. Comparisons of disability weights across 15 specific disease and injury groups showed that the subdivision of a disease into separate health states (stages) differed markedly across studies. Additionally, weights for similar health states differed, particularly for mild diseases, for which the disability weight could differ by a factor of two or more.
In terms of comparability of the resulting YLDs, the global use of a single set of disability weights has advantages, though practical constraints and intercultural differences should be taken into account in such a set.