Toxoplasma gondii is one of the most widespread parasites in humans and can cause severe illness in immunocompromised individuals; its role in healthy people, however, is probably under-appreciated. The complex epidemiology of this protozoan involves several infection routes, but consumption of contaminated food is likely to be the predominant one. Among foods, raw or undercooked meat is a relevant vehicle of transmission, yet the role of the different meat-producing animal species, and of the meats derived from them, remains controversial.
The aim of the present work is to summarize and analyse literature data reporting prevalence estimates of T. gondii in meat animals/meats.
We searched Medline, Web of Science, and Science Direct (last update: 31/03/2015).
To be included, papers had to report data from primary studies on the prevalence of T. gondii in meat from livestock species, as obtained through direct detection methods. Meta-analysis and meta-regression were performed.
Of 1915 papers screened, 69 were included, dealing mainly with cattle, pigs and sheep. Pooled prevalences, based on random-effects models, were 2.6% (95% CI 0.5-5.8) for cattle, 12.3% (95% CI 7.6-17.8) for pigs and 14.7% (95% CI 8.9-21.5) for sheep. Given the high heterogeneity observed, univariable and multivariable meta-regression models were fitted; these showed that geographic area for cattle (p = 0.032), farming type for pigs (p = 0.0004) and sample composition for sheep (p = 0.03) had significant effects on the estimated Toxoplasma prevalences. Moreover, the relative contribution of the different animal species depended on the geographic origin of the animals.
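As a sketch of the pooling step described above, the following illustrates a DerSimonian-Laird random-effects pooled proportion with a 95% confidence interval. The study counts are invented for illustration, and published meta-analyses typically apply a variance-stabilising transformation to the proportions first; this minimal version works on the raw proportions.

```python
import math

def dl_pooled_prevalence(events, sizes):
    """DerSimonian-Laird random-effects pooled proportion with a 95% CI.

    events : positive samples per study
    sizes  : sample size per study
    Returns (pooled, lower, upper, tau2).
    """
    k = len(events)
    p = [e / n for e, n in zip(events, sizes)]
    # within-study variances (simple binomial approximation)
    v = [max(pi * (1 - pi) / n, 1e-8) for pi, n in zip(p, sizes)]
    w = [1 / vi for vi in v]
    # fixed-effect pooled estimate, used to compute heterogeneity Q
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance
    # random-effects weights incorporate tau2
    w_re = [1 / (vi + tau2) for vi in v]
    pooled = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

# toy data: 3 studies with 5/100, 20/150 and 2/80 positives
pooled, lower, upper, tau2 = dl_pooled_prevalence([5, 20, 2], [100, 150, 80])
```

With heterogeneous studies, tau2 grows and the random-effects interval widens relative to a fixed-effect analysis, which is why the abstract reports random-effects pooled prevalences.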
Limitations were mainly related to possible publication bias.
The present work confirms the role of meat, including beef, as a source of T. gondii, and highlights the need to implement a control system for this parasite along the meat production chain. Moreover, consumer awareness should be strengthened in order to reduce the impact of the disease.
The increasing world population worsens the serious problem of food security in developing countries. In industrialized countries, on the other hand, where food security is of minor concern, health problems related to food centre on two main factors: food safety and the environmental sustainability of food production. For these reasons, new ways must be found to increase yields while preserving food quality, natural habitats and biodiversity. Insects could be of great interest as a possible solution, owing to their capability to satisfy two different requirements: (i) they are an important source of protein and other nutrients; and (ii) their use as food has ecological advantages over conventional meat and, in the long run, economic benefits. However, little is known about the food safety aspects, and this can be of critical importance for gaining society's approval, especially where people are not accustomed to eating insects. This paper aims to collect information in order to evaluate how insects could be used safely as food, and discusses nutritional data to justify why insect food sources can no longer be neglected. Legislative issues are also discussed.
In the last few years, 16S rRNA gene sequencing (16S rDNA-seq) has seen a surprisingly rapid increase in adoption as a methodology for microbial community studies. Despite the considerable popularity of this technique, only a few specific tools are currently available for proper 16S rDNA-seq count data preprocessing and simulation. Indeed, the great majority of tools have been developed by adapting methodologies previously used for bulk RNA-seq data, with poor assessment of their applicability to the metagenomics field. For such tools, and for the few specifically developed for 16S rDNA-seq data, performance assessment is challenging, mainly due to the complex nature of the data and the lack of realistic simulation models. In fact, to the best of our knowledge, no software designed for data simulation is available to directly obtain synthetic 16S rDNA-seq count tables that properly model the heavy sparsity and compositionality typical of these data.
In this paper we present metaSPARSim, a sparse count matrix simulator intended for use in the development of 16S rDNA-seq metagenomic data processing pipelines. metaSPARSim implements a new generative process that models the sequencing process with a Multivariate Hypergeometric distribution in order to realistically simulate 16S rDNA-seq count tables, resembling the compositionality and sparsity of real experimental data. It provides ready-to-use count matrices and offers the possibility of reproducing different pre-coded scenarios and of estimating simulation parameters from real experimental data. The tool is made available at http://sysbiobig.dei.unipd.it/?q=Software#metaSPARSim and https://gitlab.com/sysbiobig/metasparsim.
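The generative idea described above, drawing each sample's reads without replacement from its pool of template molecules, can be sketched with NumPy's multivariate hypergeometric sampler. The template abundances below are invented toy values, not metaSPARSim's actual parameterisation; the sketch only illustrates how skewed template pools plus finite sequencing depth yield sparse, compositional count tables.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_counts(templates, depths):
    """Simulate a taxa-by-samples 16S count table.

    Each sample's reads are drawn without replacement (Multivariate
    Hypergeometric) from that sample's pool of template molecules.

    templates : (n_taxa, n_samples) int array of template-molecule counts
    depths    : sequencing depth (reads) per sample
    """
    n_taxa, n_samples = templates.shape
    counts = np.zeros((n_taxa, n_samples), dtype=np.int64)
    for j in range(n_samples):
        counts[:, j] = rng.multivariate_hypergeometric(
            templates[:, j], depths[j])
    return counts

# toy pools: 5 taxa, 2 samples, heavily skewed abundances
templates = np.array([[9000, 5000],
                      [800,  3000],
                      [150,    10],
                      [40,      5],
                      [10,      1]])
table = simulate_counts(templates, depths=[2000, 1500])
```

Because rare taxa may contribute zero reads at a given depth, the resulting matrices naturally reproduce the sparsity that tools adapted from bulk RNA-seq tend to underestimate.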
metaSPARSim is able to generate count matrices resembling real 16S rDNA-seq data. The availability of count data simulators is extremely valuable both for method developers, who need a ground truth for tool validation, and for users who want to assess state-of-the-art analysis tools in order to choose the most accurate one. Thus, we believe that metaSPARSim is a valuable tool for researchers involved in developing, testing and using robust and reliable data analysis methods in the context of 16S rRNA gene sequencing.
Obesity in dogs is an emerging issue that affects canine health and well-being. Its development is ascribed to several factors, including genetic predisposition and dietary management, and recent evidence suggests that intestinal microbiota may be involved as well. Previous works have shown obesity to be linked to significant changes in gut microbiota composition in humans and mice, but only limited information is available on the role played by canine gut microbiota. The aim of this exploratory study was to investigate whether the composition of canine faecal microbiota may be influenced by overweight condition and breed. All the enrolled companion dogs were young adults, intact, healthy, and fed commercial extruded pet food; none had received antibiotics, probiotics or immunosuppressant drugs in the previous six months. Labrador Retriever (LR) and Border Collie (BC) were chosen as reference breeds, and Body Condition Score (BCS) on a 9-point scale as the reference method for evaluating body fat. The faecal microbial communities of 15 lean (BCS 4-5/9; 7 LRs and 8 BCs) and 14 overweight (BCS > 5/9; 8 LRs and 6 BCs) family dogs were analysed using 16S rRNA gene sequencing. Moreover, for each dog, the daily intake of energy (kcal/d) and dietary macronutrients (g/d) was calculated from an accurate feeding history. Firmicutes and Bacteroidetes were the predominant phyla (51.5 ± 10.0% and 33.4 ± 8.5%, respectively) in all dogs. Bioinformatic and statistical analysis revealed that no bacterial taxon differed significantly based on body condition, except for the genus Allisonella (p < 0.05); BC gut microbiota was richer (p < 0.05) in bacteria belonging to the phyla Actinobacteria (family Coriobacteriaceae in particular) and Firmicutes (Allobaculum and Roseburia genera). No remarkable differences were recorded either for diversity indices (i.e., alpha diversity, p > 0.10) or for divergence within the sample set (i.e., beta diversity, p > 0.05).
PERMANOVA tests performed on single factors demonstrated a tendency of dietary protein to influence the beta diversity of the recruited dogs' microbiota at the amplicon sequence variant level (p = 0.08). In conclusion, the faecal microbiota of the dogs involved in this exploratory study showed no major variations based on body condition. However, our findings suggest that certain bacterial taxa previously reported in obesity-related studies may be detected in differing amounts depending on canine breed.
Commercial poultry farms (n = 523), located in all six regions of Nigeria, were sampled to generate baseline information about the distribution of Salmonella serovars in this country. Five different matrices (litter, dust, faeces, feed and water) were collected from each visited farm. Salmonella was isolated from at least one of the five matrices in 228 farms, giving a farm prevalence of 43.6% (95% CI 39.7-48.3%). Altogether, 370 of the 2615 samples collected (14.1%, 95% CI 12.8-15.5%) contained Salmonella. Considering the number of positive farms and the number of positive samples, it was evident that in the majority of the sampled farms only a few samples were positive for Salmonella. There was no difference in Salmonella prevalence among the five matrices considered. Of the 370 isolates serotyped, 82 different serotypes were identified; Salmonella Kentucky had the highest isolation rate in all the matrices sampled (16.2%), followed by S. Poona and S. Elisabethville. S. Kentucky was distributed across the country, whereas the other, less frequent serovars had a more circumscribed diffusion. This is one of the few comprehensive studies on the occurrence and distribution of Salmonella in commercial chicken layer farms from all six regions of Nigeria. The relatively high prevalence documented in this study may be attributed to the generally poor infrastructure and the low level of biosecurity measures in controlling stray animals, rodents and humans. The data collected could be valuable for instituting effective intervention strategies for Salmonella control in Nigeria and in other developing countries with a similar poultry industry structure, with the final aim of reducing Salmonella spread in animals and ultimately in humans.
European legislation has defined, as process hygiene criteria for the main livestock species (cattle, sheep, goats, horses and pigs), the monitoring of aerobic colony counts and Enterobacteriaceae. Values detected above the defined criteria require an improvement in slaughter hygiene and a review of process control. The main source of microbiological contamination of beef carcasses along the slaughterline is of faecal origin; therefore, Escherichia coli and Enterobacteriaceae seem to be the most suitable indicators for assessing the hygienic status of the slaughter process. Although microbiological criteria addressing indicator bacteria have been in place in industrialized countries for several years, information on the factors affecting their counts on beef carcasses along the slaughterline remains scattered. Therefore, a systematic literature review, covering the period 2000-2012, was conducted to gather information concerning: 1) counts of E. coli and Enterobacteriaceae on beef carcasses at different stages of the slaughterline; 2) factors influencing the presence/counts of E. coli and Enterobacteriaceae on beef carcasses; and 3) the relationship between indicator bacteria (E. coli and Enterobacteriaceae) counts and visible faecal contamination of beef carcasses. From the 41 retrieved papers, the following conclusions were drawn. A decrease in indicator bacteria counts was recorded after sequential decontamination treatments, such as pasteurization and hot water washing. Slaughterhouse characteristics influenced the bacterial load of beef carcasses, although it was difficult to assess which factors (i.e., slaughterhouse throughput, design of the plant, surveillance system in place) had the greatest effect. Finally, carcasses from faecally contaminated animals had higher bacterial loads than those from clean animals.
Therefore, the development of a visual classification system for the level of carcass dirtiness, combined with the application of effective treatments to carcasses classified as dirty along the slaughterline, can bring the contamination of these carcasses to a level comparable to or lower than that of originally clean ones at the end of the slaughterline.
• The beef slaughterline is recognized to lack stages able to reduce bacterial counts.
• Pasteurization and hot water washing lead to a reduction of beef carcass contamination.
• Carcasses from faecally contaminated animals have higher bacterial loads than those from clean animals.
• Treatments on carcasses visually classified as dirty lead to contamination levels comparable to those of clean carcasses.
Background
In Nigeria, there have been reports of widespread multiple antimicrobial resistance (AMR) amongst Salmonella isolated from poultry. To mitigate the impact of mortality associated with Salmonella on their farms, farmers resort to the use of antimicrobials without sound diagnostic advice. We conducted this study to describe the AMR patterns, mechanisms and genetic similarities within some Salmonella serovars isolated from different layer farms.
Method
We determined the AMR profiles of 200 Salmonella isolates, selected based on frequency, serovar, and geographical and sample type distribution. We also assessed the mechanisms of multi-drug resistance by testing for specific genetic determinants using PCR protocols and gene sequence analysis. Pulsed-field gel electrophoresis (PFGE) was conducted on seven selected serovars to determine their genetic relatedness.
Results
Of the 200 isolates, 97 (48.5%) revealed various AMR profiles, with multiple antibiotic resistance (MAR) indices ranging from 0.07 to 0.5. Resistance to ciprofloxacin was common to all the multi-drug resistant isolates, while all isolates were susceptible to cefotaxime, ceftazidime and meropenem. Genotypic characterization showed the presence of resistance genes as well as nucleotide mutations with subsequent amino acid substitutions. Fifteen isolates (43%) of S. Kentucky were indistinguishable, despite being isolated from four different states in Nigeria (Ogun, n = 9; Kaduna, n = 6; Plateau, n = 3; and Bauchi, n = 2). PFGE revealed 40 pulsotype patterns (Kentucky, n = 12; Larochelle, n = 9; Virchow, n = 5; Saintpaul, n = 4; Poona, n = 3; Isangi, n = 2; and Nigeria, n = 2).
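For context, the MAR index reported above is simply the fraction of tested antimicrobials to which an isolate is resistant. The sketch below assumes, purely for illustration, a 14-drug panel (under which 1/14 ≈ 0.07 and 7/14 = 0.5 would bracket the reported range); the actual panel used in the study is not restated here.

```python
def mar_index(n_resistant, n_tested):
    """Multiple antibiotic resistance (MAR) index.

    The fraction of tested antimicrobials to which an isolate is
    resistant; values above ~0.2 are often read as indicating
    exposure to high-risk sources of antimicrobial use.
    """
    if n_tested <= 0:
        raise ValueError("at least one antimicrobial must be tested")
    if not 0 <= n_resistant <= n_tested:
        raise ValueError("n_resistant must lie between 0 and n_tested")
    return n_resistant / n_tested

# hypothetical 14-drug panel: resistance to 1 and to 7 drugs
low = mar_index(1, 14)   # ~0.07
high = mar_index(7, 14)  # 0.5
```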
Conclusion
This study recorded closely related but widely distributed Salmonella serovars with high AMR rates in poultry. We recommend strict regulation of antimicrobial use and regular monitoring of AMR trends among bacteria isolated from animals and humans to inform public policy.
Food safety criteria for Listeria monocytogenes in ready-to-eat (RTE) foods have been applied from 2006 onwards (Commission Regulation (EC) 2073/2005). Still, human invasive listeriosis was reported to increase over the period 2009-2013 in the European Union and European Economic Area (EU/EEA). Time series analysis for the 2008-2015 period in the EU/EEA indicated an increasing trend in the monthly notified incidence rate of confirmed human invasive listeriosis in the age groups over 75 years and in females aged 25-44 years (the latter probably related to pregnancies). A conceptual model was used to identify factors in the food chain as potential drivers of L. monocytogenes contamination of RTE foods and listeriosis. Factors were related to the host (i. population size of the elderly and/or susceptible people; ii. underlying condition rate), the food (iii. L. monocytogenes prevalence in RTE food at retail; iv. L. monocytogenes concentration in RTE food at retail; v. storage conditions after retail; vi. consumption), the national surveillance systems (vii. improved surveillance), and/or the bacterium (viii. virulence). The factor considered most likely responsible for the increasing trend in cases is the growth of the elderly and susceptible population, except in the 25-44 female age group. For the increased incidence rates and case numbers, the likely factor is the increased proportion of susceptible persons in the age groups over 45 years for both genders. Quantitative modelling suggests that more than 90% of invasive listeriosis is caused by ingestion of RTE food containing > 2,000 colony forming units (CFU)/g, and that one-third of cases are due to growth in the consumer phase. Awareness should be increased among stakeholders, especially in relation to susceptible risk groups. Innovative methodologies, including whole genome sequencing (WGS) for strain identification and the monitoring of trends, are recommended.
This publication is linked to the following EFSA Supporting Publications article: http://onlinelibrary.wiley.com/doi/10.2903/sp.efsa.2018.EN-1352/full
Diet and lifestyle have a strong influence on gut microbiota, which in turn has important implications for a variety of health-related aspects. Despite great advances in the field, it remains unclear to what extent the composition of the gut microbiota is modulated by the intake of animal-derived products compared to a vegetable-based diet. Here, the specific impact of vegan, vegetarian and omnivore feeding types on the composition of the gut microbiota of 101 adults was investigated among groups homogeneous for variables known to play a role in modulating gut microbial composition, such as age, anthropometric variables, ethnicity and geographic area. The results showed that the three dietetic profiles could be well distinguished on the basis of the participants' dietetic regimens. Regarding the gut microbiota, vegetarians had a significantly greater richness than omnivores. Moreover, counts of Bacteroidetes-related operational taxonomic units (OTUs) were greater in vegans and vegetarians than in omnivores. Interestingly, when considering the whole bacterial community composition, the three cohorts were unexpectedly similar, probably owing to their similar intake in terms of nutrients rather than foods, e.g., high fat content and reduced protein and carbohydrate intake. This finding suggests that fundamental nutritional choices such as vegan, vegetarian or omnivore do influence the microbiota but do not allow conclusions to be drawn on gut microbial composition, and it points to a preferential impact of other variables, probably related to general lifestyle, in shaping the human gut microbial community over and above dietary influence. Consequently, research in which individuals are categorized on the basis of their claimed feeding types is of limited use for scientific studies, since such categorization appears to be oversimplified.
The Scientific Committee (SC) reconfirms that the benchmark dose (BMD) approach is a scientifically more advanced method than the NOAEL approach for deriving a Reference Point (RP). Most of the modifications made to the SC guidance of 2009 concern the section providing guidance on how to apply the BMD approach. Model averaging is recommended as the preferred method for calculating the BMD confidence interval, while acknowledging that the respective tools are still under development and may not be easily accessible to all; therefore, selecting or rejecting models is still considered a suboptimal alternative. The set of default models to be used for BMD analysis has been reviewed, and the Akaike information criterion (AIC) has been introduced instead of the log-likelihood to characterise the goodness of fit of different mathematical models to a dose-response data set. A flowchart has also been inserted in this update to guide the reader step by step when performing a BMD analysis, as well as a chapter on the distributional part of dose-response models and a template for reporting a BMD analysis in a complete and transparent manner. Finally, it is recommended to always report the BMD confidence interval rather than the value of the BMD. The lower bound (BMDL) is needed as a potential RP, and the upper bound (BMDU) is needed for establishing the BMDU/BMDL ratio reflecting the uncertainty in the BMD estimate. This updated guidance does not call for a general re-evaluation of previous assessments in which the NOAEL approach, or the BMD approach as described in the 2009 SC guidance, was used, in particular when the exposure is clearly smaller (e.g. by more than one order of magnitude) than the health-based guidance value. Finally, the SC firmly reiterates its recommendation to reconsider the test guidelines, given the expected wide application of the BMD approach.
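As background to the model-averaging recommendation above, the AIC and the Akaike weights commonly used to combine candidate dose-response models can be sketched as follows. This is a generic illustration of AIC-based weighting, not the specific procedure prescribed in the guidance; the log-likelihood values used in the example are invented.

```python
import math

def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2*lnL.

    Lower AIC indicates a better trade-off between goodness of fit
    and model complexity.
    """
    return 2 * n_params - 2 * log_likelihood

def akaike_weights(aics):
    """Per-model weights from AIC differences, as used in model
    averaging: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i = AIC_i - min(AIC)."""
    best = min(aics)
    raw = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

# three hypothetical dose-response fits with different AICs
weights = akaike_weights([100.0, 102.0, 110.0])
```

Under this scheme the averaged BMD confidence interval combines the candidate models in proportion to their weights, so a model 10 AIC units worse than the best contributes almost nothing.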
http://onlinelibrary.wiley.com/doi/10.2903/sp.efsa.2017.EN-1147/full