Enteric viruses represent a global public health threat and are implicated in numerous foodborne and waterborne disease outbreaks. Nonetheless, relatively little is known about their fate and stability in the environment. In this study we used carefully validated methods to monitor enteric viruses, namely adenovirus (AdV), JC polyomavirus (JCV), noroviruses (NoVs), sapovirus (SaV) and hepatitis A and E viruses (HAV and HEV), from wastewater source to beaches and shellfish beds. Wastewater influent and effluent, surface water, sediment and shellfish samples were collected in the Conwy catchment (North Wales, UK) once a month for one year. High concentrations of AdV and JCV were found in the majority of samples, and no seasonal patterns were observed. No HAV or HEV was detected, and no related illnesses were reported in the area during the sampling period. Noroviruses and SaV were also detected at high concentrations in wastewater and surface water, and their presence correlated with local gastroenteritis outbreaks during the spring and autumn seasons. Noroviruses were also found in estuarine sediment and in shellfish harvested for human consumption. As PCR-based methods were used for quantification, viral infectivity and degradation were estimated using a NoV capsid integrity assay. The assay revealed low levels of viral decay in wastewater effluent compared to influent, and more significant decay in environmental waters and sediment. The results suggest that AdV and JCV may be suitable markers for assessing the spatial distribution of wastewater contamination in the environment, and that pathogenic viruses can be directly monitored during and after reported outbreaks to prevent further environment-derived illnesses.
•The TFUF concentration method is suitable for quantifying viruses in water samples.
•For the first time, sapovirus was found in UK waters.
•Enteric viruses were traceable from source to beaches and shellfish beds.
•Norovirus concentrations in the environment agreed with local outbreaks.
•The PGM assay is useful for studying norovirus degradation in the environment.
The recent detection of SARS-CoV-2 RNA in feces has led to speculation that it can be transmitted via the fecal-oral/ocular route. This review aims to critically evaluate the incidence of gastrointestinal (GI) symptoms, the quantity and infectivity of SARS-CoV-2 in feces and urine, and whether these pose an infection risk in sanitary settings, sewage networks, wastewater treatment plants, and the wider environment (e.g. rivers, lakes and marine waters). A review of 48 independent studies revealed that severe GI dysfunction is only evident in a small number of COVID-19 cases, with 11 ± 2% exhibiting diarrhea and 12 ± 3% exhibiting vomiting and nausea. In addition to these cases, SARS-CoV-2 RNA can be detected in feces from some asymptomatic, mildly symptomatic and pre-symptomatic individuals. Fecal shedding of the virus peaks in the symptomatic period and can persist for several weeks, but with declining abundances in the post-symptomatic phase. SARS-CoV-2 RNA is occasionally detected in urine, but reports in fecal samples are more frequent. The abundance of the virus's genetic material in both urine (ca. 10²–10⁵ gc/ml) and feces (ca. 10²–10⁷ gc/ml) is much lower than in nasopharyngeal fluids (ca. 10⁵–10¹¹ gc/ml). There is strong evidence of multiplication of SARS-CoV-2 in the gut, and infectious virus has occasionally been recovered from both urine and stool samples. The level and infectious capability of SARS-CoV-2 in vomit remain unknown. In comparison to enteric viruses transmitted via the fecal-oral route (e.g. norovirus, adenovirus), the likelihood of SARS-CoV-2 being transmitted via feces or urine appears much lower due to the lower relative amounts of virus present in feces/urine. The biggest risk of transmission will occur in clinical and care home settings where secondary handling of people and urine/fecal matter occurs. In addition, while SARS-CoV-2 RNA can be detected in wastewater, this signal is greatly reduced by conventional treatment.
Our analysis also suggests the likelihood of infection due to contact with sewage-contaminated water (e.g. swimming, surfing, angling) or food (e.g. salads, shellfish) is extremely low or negligible based on very low predicted abundances and limited environmental survival of SARS-CoV-2. These conclusions are corroborated by the fact that tens of millions of cases of COVID-19 have occurred globally, but exposure to feces or wastewater has never been implicated as a transmission vector.
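The orders-of-magnitude gap between urine/fecal and nasopharyngeal viral loads quoted above can be sketched numerically. A minimal Python illustration using the abundance ranges from the review; taking the geometric midpoint of each range as a rough central estimate is our own simplifying assumption, not the review's method:

```python
import math

# SARS-CoV-2 RNA abundance ranges quoted in the review (gene copies per ml).
abundance = {
    "urine": (1e2, 1e5),
    "feces": (1e2, 1e7),
    "nasopharyngeal": (1e5, 1e11),
}

def geometric_midpoint(low, high):
    """Midpoint on a log scale -- a crude central estimate for a wide range."""
    return math.sqrt(low * high)

naso = geometric_midpoint(*abundance["nasopharyngeal"])  # 1e8 gc/ml
for sample in ("urine", "feces"):
    mid = geometric_midpoint(*abundance[sample])
    gap = math.log10(naso / mid)
    print(f"{sample}: ~{mid:.1e} gc/ml, ~{gap:.1f} log10 below nasopharyngeal")
```

On these assumed midpoints, urine and feces sit roughly 4.5 and 3.5 orders of magnitude below nasopharyngeal fluids, consistent with the review's conclusion that fecal/urine transmission is much less likely.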
•SARS-CoV-2 RNA can be readily detected in feces and occasionally urine.
•Severe GI dysfunction only occurs in a small number of cases (11 ± 2%).
•Likelihood of SARS-CoV-2 being transmitted via feces appears very low.
•Likelihood of infection from sewage-contaminated water or food is extremely low.
Summary Background Palliative oxygen therapy is widely used for treatment of dyspnoea in individuals with life-limiting illness who are ineligible for long-term oxygen therapy. We assessed the effectiveness of oxygen compared with room air delivered by nasal cannula for relief of breathlessness in this population of patients. Methods Adults from outpatient clinics at nine sites in Australia, the USA, and the UK were eligible for enrolment in this double-blind, randomised controlled trial if they had life-limiting illness, refractory dyspnoea, and partial pressure of oxygen in arterial blood (PaO2) more than 7·3 kPa. Participants were randomly assigned in a 1:1 ratio by a central computer-generated system to receive oxygen or room air via a concentrator through a nasal cannula at 2 L per min for 7 days. Participants were instructed to use the concentrator for at least 15 h per day. The randomisation sequence was stratified by baseline PaO2 with balanced blocks of four patients. The primary outcome measure was breathlessness (0–10 numerical rating scale [NRS]), measured twice a day (morning and evening). All randomised patients who completed an assessment were included in the primary analysis for that data point (no data were imputed). This study is registered, numbers NCT00327873 and ISRCTN67448752. Findings 239 participants were randomly assigned to treatment (oxygen, n=120; room air, n=119). 112 (93%) patients assigned to receive oxygen and 99 (83%) assigned to receive room air completed all 7 days of assessments. From baseline to day 6, mean morning breathlessness changed by −0·9 points (95% CI −1·3 to −0·5) in patients assigned to receive oxygen and by −0·7 points (−1·2 to −0·2) in patients assigned to receive room air (p=0·504). Mean evening breathlessness changed by −0·3 points (−0·7 to 0·1) in the oxygen group and by −0·5 (−0·9 to −0·1) in the room air group (p=0·554). The frequency of side-effects did not differ between groups.
Extreme drowsiness was reported by 12 (10%) of 116 patients assigned to receive oxygen compared with 14 (13%) of 108 patients assigned to receive room air. Two (2%) patients in the oxygen group reported extreme symptoms of nasal irritation compared with seven (6%) in the room air group. One patient reported an extremely troublesome nose bleed (oxygen group). Interpretation Since oxygen delivered by a nasal cannula provides no additional symptomatic benefit for relief of refractory dyspnoea in patients with life-limiting illness compared with room air, less burdensome strategies should be considered after brief assessment of the effect of oxygen therapy on the individual patient. Funding US National Institutes of Health, Australian National Health and Medical Research Council, Duke Institute for Care at the End of Life, and Doris Duke Charitable Foundation.
The formation of mammalian dendritic cells (DCs) is controlled by multiple hematopoietic transcription factors, including IRF8. Loss of IRF8 exerts a differential effect on DC subsets, including plasmacytoid DCs (pDCs) and the classical DC lineages cDC1 and cDC2. In humans, cDC2-related subsets have been described including AXL+SIGLEC6+ pre-DC, DC2 and DC3. The origin of this heterogeneity is unknown. Using high-dimensional analysis, in vitro differentiation, and an allelic series of human IRF8 deficiency, we demonstrated that cDC2 (CD1c+ DC) heterogeneity originates from two distinct pathways of development. The lymphoid-primed IRF8hi pathway, marked by CD123 and BTLA, carried pDC, cDC1, and DC2 trajectories, while the common myeloid IRF8lo pathway, expressing SIRPA, formed DC3s and monocytes. We traced distinct trajectories through the granulocyte-macrophage progenitor (GMP) compartment showing that AXL+SIGLEC6+ pre-DCs mapped exclusively to the DC2 pathway. In keeping with their lower requirement for IRF8, DC3s expand to replace DC2s in human partial IRF8 deficiency.
•Distinct development trajectories of DC2 and DC3 underpin human cDC2 heterogeneity
•pDC, cDC1, and DC2 (classical DCs) develop from LMPPs along a CD123+ IRF8high pathway
•DC3 and monocytes develop from CD33+ GMPs along an IRF8low SIRPA+ pathway
•IRF8 deficiency causes gene dose-dependent loss of IRF8high then IRF8low pathway DCs
Heterogeneity of human CD1c+ dendritic cells (cDC2s) is described, but how this arises is unknown. Cytlak and colleagues demonstrate that the cDC2 subsets, DC2 and DC3, develop along distinct hematopoietic trajectories, defined by differential IRF8 expression. DC2s develop from LMPPs along an IRF8hi pathway, while DC3 differentiation follows an IRF8low trajectory.
Waterborne enteric viruses are an emerging cause of disease outbreaks and represent a major threat to global public health. Enteric viruses may originate from human wastewater and can undergo rapid transport through aquatic environments with minimal decay. Surveillance and source apportionment of enteric viruses in environmental waters is therefore essential for accurate risk management. However, individual monitoring of the >100 enteric viral strains that have been identified as aquatic contaminants is unfeasible. Instead, viral indicators are often used for quantitative assessments of wastewater contamination, viral decay and transport in water. An ideal indicator for tracking wastewater contamination should be (i) easy to detect and quantify, (ii) source-specific, (iii) resistant to wastewater treatment processes, and (iv) persistent in the aquatic environment, with similar behaviour to viral pathogens. Here, we conducted a comprehensive review of 127 peer-reviewed publications, to critically evaluate the effectiveness of several viral indicators of wastewater pollution, including common enteric viruses (mastadenoviruses, polyomaviruses, and Aichi viruses), the pepper mild mottle virus (PMMoV), and gut-associated bacteriophages (Type II/III FRNA phages and phages infecting human Bacteroides species, including crAssphage). Our analysis suggests that overall, human mastadenoviruses have the greatest potential to indicate contamination by domestic wastewater due to their easy detection, culturability, and high prevalence in wastewater and in the polluted environment. Aichi virus, crAssphage and PMMoV are also widely detected in wastewater and in the environment, and may be used as molecular markers for human-derived contamination.
We conclude that viral indicators are suitable for the long-term monitoring of viral contamination in freshwater and marine environments and that these should be implemented within monitoring programmes to provide a holistic assessment of microbiological water quality and wastewater-based epidemiology, improve current risk management strategies and protect global human health.
•Human mastadenoviruses are robust indicators for human-associated pollution in water.
•Bacteroides-associated phages and crAssphage are promising indicators.
•Multiple indicators should be used to assess wastewater treatment efficiency.
•Survival and abundance of indicator viruses should be further assessed.
IBD confers an increased lifetime risk of developing colorectal cancer (CRC), and colitis-associated CRC (CA-CRC) is molecularly distinct from sporadic CRC (S-CRC). Here we have dissected the evolutionary history of CA-CRC using multiregion sequencing.
Exome sequencing was performed on fresh-frozen multiple regions of carcinoma, adjacent non-cancerous mucosa and blood from 12 patients with CA-CRC (n=55 exomes), and key variants were validated with orthogonal methods. Genome-wide copy number profiling was performed using single nucleotide polymorphism arrays and low-pass whole genome sequencing on archival non-dysplastic mucosa (n=9), low-grade dysplasia (LGD; n=30), high-grade dysplasia (HGD; n=13), mixed LGD/HGD (n=7) and CA-CRC (n=19). Phylogenetic trees were reconstructed, and evolutionary analysis used to reveal the temporal sequence of events leading to CA-CRC.
10/12 tumours were microsatellite stable with a median mutation burden of 3.0 single nucleotide alterations (SNA) per Mb, ~20% higher than S-CRC (2.5 SNAs/Mb), and consistent with elevated ageing-associated mutational processes. Non-dysplastic mucosa had considerable mutation burden (median 47 SNAs), including mutations shared with the neighbouring CA-CRC, indicating a precancer mutational field. CA-CRCs were often near triploid (40%) or near tetraploid (20%) and phylogenetic analysis revealed that copy number alterations (CNAs) began to accrue in non-dysplastic bowel, but the LGD/HGD transition often involved a punctuated 'catastrophic' CNA increase.
Evolutionary genomic analysis revealed precancer clones bearing extensive SNAs and CNAs, with progression to cancer involving a dramatic accrual of CNAs at HGD. Detection of the cancerised field is an encouraging prospect for surveillance, but punctuated evolution may limit the window for early detection.
Forest declines caused by climate disturbance, insect pests and microbial pathogens threaten the global landscape, and tree diseases are increasingly attributed to the emergent properties of complex ecological interactions between the host, microbiota and insects. To address this hypothesis, we combined reductionist approaches (single and polyspecies bacterial cultures) with emergentist approaches (bacterial inoculations in an oak infection model with the addition of insect larvae) to unravel the gene expression landscape and symptom severity of host–microbiota–insect interactions in the acute oak decline (AOD) pathosystem. AOD is a complex decline disease characterized by predisposing abiotic factors, inner bark lesions driven by a bacterial pathobiome, and larval galleries of the bark-boring beetle Agrilus biguttatus. We identified expression of key pathogenicity genes in Brenneria goodwinii, the dominant member of the AOD pathobiome, tissue-specific gene expression profiles, cooperation with other bacterial pathobiome members in sugar catabolism, and demonstrated amplification of pathogenic gene expression in the presence of Agrilus larvae. This study highlights the emergent properties of complex host–pathobiota–insect interactions that underlie the pathology of diseases that threaten global forest biomes.
Protected areas vary enormously in their contribution to conserving biodiversity, and the inefficiency of protected area systems is widely acknowledged. However, conservation plans focus overwhelmingly on adding new sites to current protected area estates. Here we show that the conservation performance of a protected area system can be radically improved, without extra expenditure, by replacing a small number of protected areas with new ones that achieve more for conservation. Replacing the least cost-effective 1% of Australia's 6,990 strictly protected areas could increase the number of vegetation types that have 15% or more of their original extent protected from 18 to 54, of a maximum possible of 58. Moreover, it increases markedly the area that can be protected, with no increase in overall spending. This new paradigm for protected area system expansion could yield huge improvements to global conservation at a time when competition for land is increasingly intense.
Faecal contamination of estuarine and coastal waters can pose a risk to human health, particularly in areas used for shellfish production or recreation. Routine microbiological water quality testing highlights areas of faecal indicator bacteria (FIB) contamination within the water column, but fails to consider the abundance of FIB in sediments, which under certain hydrodynamic conditions can become resuspended. Sediments can enhance the survival of FIB in estuarine environments, but the influence of sediment composition on the ecology and abundance of FIB is poorly understood. To determine the relationship between sediment composition (grain size and organic matter) and the abundance of pathogen indicator bacteria (PIB), sediments were collected from four transverse transects of the Conwy estuary, UK. The abundance of culturable Escherichia coli, total coliforms, enterococci, Campylobacter, Salmonella and Vibrio spp. in sediments was determined in relation to sediment grain size, organic matter content, salinity, depth and temperature. Sediments that contained higher proportions of silt and/or clay and associated organic matter content showed significant positive correlations with the abundance of PIB. Furthermore, the abundance of each bacterial group was positively correlated with the presence of all other groups enumerated. Campylobacter spp. were not isolated from estuarine sediments. Comparisons of the number of culturable E. coli, total coliforms and Vibrio spp. in sediments and the water column revealed that their abundance was 281-, 433- and 58-fold greater in sediments (colony-forming units (CFU) per 100 g) when compared with the water column (CFU per 100 ml), respectively. These data provide important insights into sediment compositions that promote the abundance of PIB in estuarine environments, with important implications for the modelling and prediction of public health risk based on sediment resuspension and transport.
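The fold-differences reported above are simple ratios of the reported concentrations. A minimal Python sketch of that calculation, using hypothetical paired counts chosen only to reproduce the reported folds (the study's raw counts are not given here); note the unit bases differ (CFU per 100 g for sediment vs CFU per 100 ml for water), so these are ratios of reported values, not mass-for-mass comparisons:

```python
# Hypothetical paired counts, chosen only to reproduce the reported
# fold-differences; sediment in CFU/100 g, water column in CFU/100 ml.
paired_counts = {
    "E. coli":         {"sediment": 28_100, "water": 100},
    "total coliforms": {"sediment": 43_300, "water": 100},
    "Vibrio spp.":     {"sediment": 5_800,  "water": 100},
}

def fold_enrichment(sediment_cfu: float, water_cfu: float) -> float:
    """Ratio of sediment to water-column counts (unit bases differ)."""
    return sediment_cfu / water_cfu

for organism, counts in paired_counts.items():
    fold = fold_enrichment(counts["sediment"], counts["water"])
    print(f"{organism}: {fold:.0f}-fold greater in sediment")
```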
The long-term survival of fecal indicator organisms (FIOs) and human pathogenic microorganisms in sediments is important from a water quality, human health and ecological perspective. Typically, both bacteria and viruses strongly associate with particulate matter present in freshwater, estuarine and marine environments. This association tends to be stronger in finer-textured sediments and is strongly influenced by the type and quantity of clay minerals and organic matter present. Binding to particle surfaces promotes the persistence of bacteria in the environment by offering physical and chemical protection from biotic and abiotic stresses. How bacterial and viral viability and pathogenicity are influenced by surface attachment requires further study. Typically, long-term association with surfaces, including sediments, induces bacteria to enter a viable-but-non-culturable (VBNC) state. Inherent methodological challenges in quantifying VBNC bacteria may lead to the frequent under-reporting of their abundance in sediments. The implications of this in a quantitative risk assessment context remain unclear. Similarly, sediments can harbor significant amounts of enteric viruses; however, the factors regulating their persistence remain poorly understood. Quantification of viruses in sediment remains problematic due to our poor ability to recover intact viral particles from sediment surfaces (typically <10%), our inability to distinguish between infective and damaged (non-infective) viral particles, aggregation of viral particles, and inhibition during qPCR. This suggests that the true viral titre in sediments may be vastly underestimated. In turn, this limits our ability to understand the fate and transport of viruses in sediments. Model systems (e.g., human cell culture) are also lacking for some key viruses, preventing evaluation of the infectivity of viruses recovered from sediments (e.g., norovirus).
The release of particle-bound bacteria and viruses into the water column during sediment resuspension also represents a risk to water quality. In conclusion, our poor process-level understanding of viral/bacterial–sediment interactions, combined with methodological challenges, limits accurate source apportionment and quantitative microbial risk assessment for pathogenic organisms associated with sediments in aquatic environments.
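The under-estimation implied by poor recovery can be expressed as a simple correction. A minimal sketch, assuming a single recovery-efficiency factor and hypothetical numbers; a real assay would also need to account for the qPCR inhibition and particle aggregation noted above:

```python
def recovery_corrected_titre(measured_gc: float, recovery_fraction: float) -> float:
    """Scale a measured viral titre by the assay's recovery efficiency.

    With typical recovery below 0.10 (<10%), the corrected titre is at
    least 10x the measured value.
    """
    if not 0.0 < recovery_fraction <= 1.0:
        raise ValueError("recovery_fraction must be in (0, 1]")
    return measured_gc / recovery_fraction

# Illustrative (hypothetical numbers): 1e3 gc/g measured at 5% recovery.
print(recovery_corrected_titre(1e3, 0.05))  # 20000.0 gc/g
```

This is why titres reported directly from sediment extractions are best treated as lower bounds unless recovery controls are included.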