•Systematic review and quantitative reappraisal of 10 years of experimental mixture studies.•Inventory of 1220 mixture experiments subjected to subgroup analyses.•Quantitative reappraisal of 557 claimed deviations from expected additivity, classed by their authors as synergisms, antagonisms, interactions or potentiations.•Few claims of synergistic or antagonistic effects exceeded the boundaries of acceptable between-study variability.•Results confirm the utility of default application of the dose (concentration) addition concept for predictive assessments of simultaneous exposures to multiple chemicals.•Application of dose addition must be complemented by an awareness of the synergistic potential of specific classes of chemicals.
Several reviews of synergisms and antagonisms in chemical mixtures have concluded that synergisms are relatively rare. However, these reviews focused on mixtures composed of specific groups of chemicals, such as pesticides or metals and on toxicity endpoints mostly relevant to ecotoxicology. Doubts remain whether these findings can be generalised. A systematic review not restricted to specific chemical mixtures and including mammalian and human toxicity endpoints is missing.
We conducted a systematic review and quantitative reappraisal of 10 years of experimental mixture studies to investigate the frequency and reliability of evaluations of mixture effects as synergistic or antagonistic. Unlike previous reviews, we did not limit our efforts to certain groups of chemicals or specific toxicity outcomes, and covered mixture studies relevant to ecotoxicology and human/mammalian toxicology published between 2007 and 2017.
We undertook searches for peer-reviewed articles in PubMed, Web of Science, Scopus, GreenFile, ScienceDirect and Toxline and included studies of controlled exposures of environmental chemical pollutants, defined as unintentional exposures leading to unintended effects. Studies with viruses, prions or therapeutic agents were excluded, as were records with missing details on chemicals’ identities, toxicities, doses, or concentrations.
To examine the internal validity of studies we developed a risk-of-bias tool tailored to mixture toxicology. For a subset of 388 entries that claimed synergisms or antagonisms, we conducted a quantitative reappraisal of authors’ evaluations by deriving ratios of predicted and observed effective mixture doses (concentrations).
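The reappraisal rests on the standard concentration-addition prediction: the mixture dose expected to produce a given effect is the harmonic combination of the components' equi-effective doses, weighted by their mixture fractions, and the ratio of predicted to observed dose indicates the direction of any deviation from additivity. A minimal sketch, in which the function names and the three-component numbers are illustrative rather than taken from any reviewed study:

```python
def ca_predicted_ec(fractions, ecs):
    """Concentration-addition prediction of the mixture dose producing a
    common effect level (e.g. EC50), from each component's equi-effective
    dose. fractions: mixture proportions (should sum to 1); ecs: the
    single-chemical effective doses in the same units."""
    return 1.0 / sum(p / ec for p, ec in zip(fractions, ecs))

def deviation_ratio(predicted, observed):
    """Ratio of predicted additive dose to observed mixture dose.
    Values > 1 point towards synergism (mixture more potent than
    predicted), values < 1 towards antagonism."""
    return predicted / observed

# Hypothetical three-component equimolar mixture (arbitrary dose units)
fractions = [1 / 3, 1 / 3, 1 / 3]
ec50s = [10.0, 30.0, 60.0]
predicted = ca_predicted_ec(fractions, ec50s)   # → 20.0
ratio = deviation_ratio(predicted, 8.0)         # → 2.5
```

Under a two-fold variability boundary of the kind applied in the review, a ratio of 2.5 would count as a synergism exceeding acceptable between-study variability, whereas a ratio between 0.5 and 2 would not.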
Our searches produced an inventory of 1220 mixture experiments which we subjected to subgroup analyses. Approximately two thirds of studies did not incorporate more than two components. Most experiments relied on low-cost assays with readily quantifiable endpoints. Important toxicity outcomes of relevance for human risk assessment (e.g. carcinogenicity, genotoxicity, reproductive toxicity, immunotoxicity, neurotoxicity) were rarely addressed. The proportion of studies that declared additivity, synergism or antagonism was approximately equal (one quarter each); the remaining quarter arrived at different evaluations. About half of the 1220 entries were rated as “definitely” or “probably” low risk of bias. Strikingly, relatively few claims of synergistic or antagonistic effects stood up to scrutiny in terms of deviations from expected additivity that exceeded the boundaries of acceptable between-study variability. In most cases, the observed mixture doses were not more than two-fold higher or lower than the predicted additive doses. Twenty percent of the entries (N = 78) reported synergisms in excess of that degree of deviation. Our efforts to pinpoint specific factors that predispose to synergistic interactions confirmed previous concerns about the synergistic potential of combinations of triazine, azole and pyrethroid pesticides at environmentally relevant doses. New evidence of synergisms with endocrine disrupting chemicals and metal compounds such as chromium (VI) and nickel in combination with cadmium has emerged.
These specific cases of synergisms apart, our results confirm the utility of default application of the dose (concentration) addition concept for predictive assessments of simultaneous exposures to multiple chemicals. However, this strategy must be complemented by an awareness of the synergistic potential of specific classes of chemicals. Our conclusions only apply to the chemical space captured in published mixture studies which is biased towards relatively well-researched chemicals.
The final protocol was published on the open-access repository Zenodo and assigned the following digital object identifier: https://doi.org/10.5281/zenodo.1319759 (https://zenodo.org/record/1319759#.XXIzdy7dsqM).
Currently, the identification of chemicals that have the potential to induce developmental neurotoxicity (DNT) is based on animal testing. Since systematic DNT testing is not a standard requirement of chemical safety assessment under EU or USA legislation, it is performed only in higher-tier testing, triggered by chemical structure–activity relationships or by evidence of neurotoxicity in systemic acute or repeated-dose toxicity studies. However, these triggers are rarely used and, in addition, do not always serve as reliable indicators of DNT, as they are generally based on observations in adult rodents. Therefore, there is a pressing need to develop alternative methodologies that can reliably support the identification of DNT triggers, and more rapidly and cost-effectively support the identification and characterization of chemicals with DNT potential.
We propose to incorporate mechanistic knowledge and data derived from in vitro studies to support various regulatory applications including: (a) the identification of potential DNT triggers, (b) initial chemical screening and prioritization, (c) hazard identification and characterization, (d) chemical biological grouping, and (e) assessment of exposure to chemical mixtures. Ideally, currently available cellular neuronal/glial models derived from human induced pluripotent stem cells (hiPSCs) should be used as they allow evaluation of chemical impacts on key neurodevelopmental processes, by reproducing different windows of exposure during human brain development. A battery of DNT in vitro test methods derived from hiPSCs could generate valuable mechanistic data, speeding up the evaluation of thousands of compounds present in industrial, agricultural and consumer products that lack safety data on DNT potential.
•Current in vivo developmental neurotoxicity (DNT) testing is not efficient and coverage is sparse.•In vitro mechanistic data could support various regulatory applications.•Human induced pluripotent stem cell-derived neuronal models are recommended for DNT testing.•Further development of adverse outcome pathways relevant to DNT is urgently needed.•In vitro approaches should be included in regulatory DNT testing.
Humans and wildlife are exposed to an intractably large number of different combinations of chemicals via food, water, air, consumer products, and other media and sources. This raises concerns about their impact on public and environmental health. The risk assessment of chemicals for regulatory purposes mainly relies on the assessment of individual chemicals. If exposure to multiple chemicals is considered in a legislative framework, it is usually limited to chemicals falling within this framework, and co-exposure to chemicals that are covered by a different regulatory framework is often neglected. Methodologies and guidance for assessing risks from combined exposure to multiple chemicals have been developed for different regulatory sectors; however, a harmonised, consistent approach for performing mixture risk assessments and management across different regulatory sectors is lacking. At the time of this publication, several EU research projects are running, funded by the current European Research and Innovation Programme Horizon 2020 or the Seventh Framework Programme. They aim at addressing knowledge gaps and developing methodologies to better assess chemical mixtures, by generating and making available internal and external exposure data, developing models for exposure assessment, developing tools for in silico and in vitro effect assessment to be applied in a tiered framework and for grouping of chemicals, as well as developing joint epidemiological-toxicological approaches for mixture risk assessment and for prioritising mixtures of concern. The projects EDC-MixRisk, EuroMix, EUToxRisk, HBM4EU and SOLUTIONS have started an exchange between the consortia, European Commission Services and EU Agencies, in order to identify where new methodologies have become available and where remaining gaps need to be further addressed.
This paper maps how the different projects contribute to the data needs and assessment methodologies and identifies remaining challenges to be further addressed for the assessment of chemical mixtures.
•Mapping EU funded research projects to different aspects of mixture risk assessment.•Overview of current status and methodological developments.•Need to further address data and knowledge gaps overarching different chemical sectors.
This paper reviews regulatory requirements and recent case studies to illustrate how the risk assessment (RA) of chemical mixtures is conducted, considering both the effects on human health and on the environment. A broad range of chemicals, regulations and RA methodologies are covered, in order to identify mixtures of concern, gaps in the regulatory framework, data needs, and further work to be carried out. The current and potential future use of novel tools (Adverse Outcome Pathways, in silico tools, toxicokinetic modelling, etc.) in the RA of combined effects was also reviewed.
The assumptions made in the RA, predictive model specifications and the choice of toxic reference values can greatly influence the assessment outcome, and should therefore be specifically justified. Novel tools could support mixture RA mainly by providing a better understanding of the underlying mechanisms of combined effects. Nevertheless, their use is currently limited because of a lack of guidance, data, and expertise. More guidance is needed to facilitate their application. As far as the authors are aware, no prospective RA concerning chemicals related to various regulatory sectors has been performed to date, even though numerous chemicals are registered under several regulatory frameworks.
•“Real life” exposure comprises multiple chemicals from different sources and routes.•Chemical legislation rarely considers exposure to multiple chemicals across sectors.•Mixture risk assessment (RA) often faces exposure and hazard data gaps.•Novel alternative tools have high potential for improving mixture RA.•Need for guidance that harmonises approaches across different legislative sectors.
This paper summarizes current challenges, the potential use of novel scientific methodologies, and ways forward in the risk assessment and risk management of mixtures. Generally, methodologies to address mixtures have been agreed; however, there are still several data and methodological gaps to be addressed. New approach methodologies can support the filling of knowledge gaps on the toxicity and mode(s) of action of individual chemicals. (Bio)Monitoring, modeling, and better data sharing will support the derivation of more realistic co-exposure scenarios. As knowledge and data gaps often hamper an in-depth assessment of specific chemical mixtures, the option of taking account of possible mixture effects in single substance risk assessments is briefly discussed. To allow risk managers to take informed decisions, transparent documentation of assumptions and related uncertainties is recommended indicating the potential impact on the assessment. Considering the large number of possible combinations of chemicals in mixtures, prioritization is needed, so that actions first address mixtures of highest concern and chemicals that drive the mixture risk. As chemicals with different applications and regulated separately might lead to similar toxicological effects, it is important to consider chemical mixtures across legislative sectors.
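One established screening device for the prioritisation step described above is the hazard index (the sum of per-chemical hazard quotients) combined with the maximum cumulative ratio (MCR), which indicates whether one chemical drives the mixture risk or the risk is spread across components. A sketch, with all exposure and reference values illustrative rather than drawn from the paper:

```python
def hazard_index(exposures, ref_values):
    """Hazard index: sum of hazard quotients HQ_i = exposure_i / reference
    value_i (e.g. daily intake over TDI, same units)."""
    hqs = [e / rv for e, rv in zip(exposures, ref_values)]
    return sum(hqs), hqs

def maximum_cumulative_ratio(hqs):
    """MCR = HI / max(HQ). Values near 1 mean a single chemical dominates
    the mixture risk; larger values mean several components contribute."""
    return sum(hqs) / max(hqs)

# Illustrative three-chemical co-exposure (mg/kg bw/day)
exposures = [0.02, 0.05, 0.01]
ref_values = [0.1, 0.5, 0.05]
hi, hqs = hazard_index(exposures, ref_values)       # HI = 0.5
mcr = maximum_cumulative_ratio(hqs)                  # MCR = 2.5
```

A mixture with HI above 1 would flag potential concern; the component with the largest HQ would be the candidate risk driver for targeted management.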
In light of the vulnerability of the developing brain, mixture risk assessment (MRA) for the evaluation of developmental neurotoxicity (DNT) should be implemented, since infants and children are co-exposed to more than one chemical at a time. One possible approach to tackle MRA could be to cluster DNT chemicals in a mixture on the basis of their mode of action (MoA) into 'similar' and 'dissimilar', but still contributing to the same adverse outcome, and anchor DNT assays to common key events (CKEs) identified in DNT-specific adverse outcome pathways (AOPs). Moreover, the use of human in vitro models, such as induced pluripotent stem cell (hiPSC)-derived neuronal and glial cultures would enable mechanistic understanding of chemically-induced adverse effects, avoiding species extrapolation.
HiPSC-derived neural progenitors differentiated into mixed cultures of neurons and astrocytes were used to assess the effects of acute (3 days) and repeated dose (14 days) treatments with single chemicals and with mixtures of chemicals belonging to different classes (i.e., lead(II) chloride and methylmercury chloride (heavy metals), chlorpyrifos (pesticide), bisphenol A (organic compound and endocrine disrupter), valproic acid (drug), and PCB138 (persistent organic pollutant and endocrine disrupter)), all of which are associated with cognitive deficits, including learning and memory impairment in children. Selected chemicals were grouped based on their mode of action (MoA) into 'similar' and 'dissimilar' MoA compounds, and their effects on synaptogenesis, neurite outgrowth, and brain derived neurotrophic factor (BDNF) protein levels, identified as CKEs in currently available AOPs relevant to DNT, were evaluated by immunocytochemistry and high content imaging analysis.
Chemicals working through a similar MoA (i.e., alterations of BDNF levels), at non-cytotoxic (one hundredth of the inhibitory concentration), very low toxic, or moderately toxic concentrations, induce DNT effects in mixtures, as shown by an increased number of neurons, impairment of neurite outgrowth and synaptogenesis (the most sensitive endpoint, as confirmed by mathematical modelling) and an increase of BDNF levels, to a certain extent reproducing autism-like cellular changes observed in the brain of autistic children.
Our findings suggest that the use of human iPSC-derived mixed neuronal/glial cultures applied to a battery of assays anchored to key events of an AOP network represents a valuable approach to identify mixtures of chemicals with potential to cause learning and memory impairment in children.
The EU Directive 2010/63/EU on the protection of animals used for scientific purposes and other EU regulations, such as REACH and the Cosmetic Products Regulation advocate for a change in the way ...toxicity testing is conducted. Whilst the Cosmetic Products Regulation bans animal testing altogether, REACH aims for a progressive shift from in vivo testing towards quantitative in vitro and computational approaches. Several endpoints can already be addressed using non-animal approaches including skin corrosion and irritation, serious eye damage and irritation, skin sensitisation, and mutagenicity and genotoxicity. However, for systemic effects such as acute toxicity, repeated dose toxicity and reproductive and developmental toxicity, evaluation of chemicals under REACH still heavily relies on animal tests. Here we summarise current EU regulatory requirements for the human health assessment of chemicals under REACH and the Cosmetic Products Regulation, considering the more critical endpoints and identifying the main challenges in introducing alternative methods into regulatory testing practice. This supports a recent initiative taken by the International Cooperation on Alternative Test Methods (ICATM) to summarise current regulatory requirements specific for the assessment of chemicals and cosmetic products for several human health-related endpoints, with the aim of comparing different jurisdictions and coordinating the promotion and ultimately the implementation of non-animal approaches worldwide. Recent initiatives undertaken at European level to promote the 3Rs and the use of alternative methods in current regulatory practice are also discussed.
Per- and polyfluoroalkyl substances (PFASs) are man-made chemicals that contain at least one perfluoroalkyl moiety, CnF2n+1–. To date, over 4,000 unique PFASs have been used in technical applications and consumer products, and some of them have been detected globally in human and wildlife biomonitoring studies. Because of their extraordinary persistence, human and environmental exposure to PFASs will be a long-term source of concern. Some PFASs such as perfluorooctanoic acid (PFOA) and perfluorooctanesulfonic acid (PFOS) have been investigated extensively and thus regulated, but for many other PFASs, knowledge about their current uses and hazards is still very limited or missing entirely. To address this problem and prepare an action plan for the assessment and management of PFASs in the coming years, a group of more than 50 international scientists and regulators held a two-day workshop in November 2017. The group identified both the respective needs of and common goals shared by the scientific and the policy communities, made recommendations for cooperative actions, and outlined how the science-policy interface regarding PFASs can be strengthened using new approaches for assessing and managing highly persistent chemicals such as PFASs. https://doi.org/10.1289/EHP4158.
Diatoms are unicellular, photosynthetic, eukaryotic algae with a ubiquitous distribution in water environments, and they play an important role in the carbon cycle. Molecular or morphological changes in these species under ecological stress conditions are expected to serve as early indicators of toxicity and can point to a global impact on the entire ecosystem. Thalassiosira pseudonana, a marine diatom and the first diatom with a fully sequenced genome, has been selected as an aquatic model organism for ecotoxicological studies using molecular tools. A customized DNA microarray containing probes for the available gene sequences has been developed and tested to analyze the effects of a common pollutant, benzo(a)pyrene (BaP), at a sub-lethal concentration. This approach in diatoms has helped to elucidate pathways/metabolic processes involved in the mode of action of this pollutant, including lipid metabolism, silicon metabolism and stress response. A dose–response study of BaP on diatoms was performed, and the effect of this compound on the expression of selected genes was assessed by quantitative real time-PCR. Up-regulation of the long-chain acyl-CoA synthetase and the anti-apoptotic transmembrane Bax inhibitor, as well as down-regulation of silicon transporter 1 and a heat shock factor, was confirmed at lower concentrations of BaP, but not of the heat-shock protein 20. The study has allowed the identification of molecular biomarkers of BaP exposure, to be integrated later into environmental monitoring for water quality assessment.
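Relative expression in quantitative real time-PCR experiments of this kind is conventionally computed with the 2^-ΔΔCt (Livak) method, normalising the target gene's cycle threshold (Ct) against a reference gene in treated and control samples. The sketch below assumes that method; the Ct values are hypothetical, not the study's data:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ΔΔCt (Livak) method.
    ΔCt = Ct(target) - Ct(reference gene) within each sample;
    ΔΔCt = ΔCt(treated) - ΔCt(control).
    Fold change > 1 indicates up-regulation, < 1 down-regulation."""
    delta_ct_treated = ct_target_treated - ct_ref_treated
    delta_ct_control = ct_target_control - ct_ref_control
    ddct = delta_ct_treated - delta_ct_control
    return 2 ** (-ddct)

# Hypothetical Ct values: target amplifies 2 cycles earlier (relative to
# the reference gene) in the treated sample than in the control
fc = fold_change(22.0, 18.0, 24.0, 18.0)  # → 4.0, i.e. up-regulated
```

A fold change of 4.0 here corresponds to the kind of up-regulation reported for the long-chain acyl-CoA synthetase, while values below 1 would correspond to down-regulated genes such as silicon transporter 1.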
•Human Biomonitoring (HBM) provides valuable insight into co-exposure to multiple chemicals.•HBM data can be interpreted in comparison to biomonitoring equivalents of health based guidance values.•Two generic physiologically based kinetic models were tested for deriving biomonitoring equivalents.•Uncertainties and limitations were identified and discussed.•The use of biomonitoring equivalents in assessing chemical mixtures was illustrated in a case study.
Human biomonitoring (HBM) data can provide insight into co-exposure patterns resulting from exposure to multiple chemicals from various sources and over time. Therefore, such data are particularly valuable for assessing potential risks from combined exposure to multiple chemicals.
One way to interpret HBM data is to establish safe levels in blood or urine, called Biomonitoring Equivalents (BE) or HBM health-based guidance values (HBM-HBGV). These can be derived by converting established external reference values, such as tolerable daily intake (TDI) values. HBM-HBGV or BE values have so far been agreed for only a very limited number of chemicals. They can be established using physiologically based kinetic (PBK) modelling, which usually requires substance-specific models and the collection of many input parameters that are often unavailable or difficult to find in the literature.
The aim of this study was to investigate the suitability and limitations of generic PBK models in deriving BE values for several compounds with a view to facilitating the use of HBM data in the assessment of chemical mixtures at a screening level. The focus was on testing the methodology with two generic models, the IndusChemFate tool and High-Throughput Toxicokinetics package, for two different classes of compounds, phenols and phthalates. HBM data on Danish children and on Norwegian mothers and children were used to evaluate the quality of the predictions and to illustrate, by means of a case study, the overall approach of applying PBK models to chemical classes with HBM data in the context of chemical mixture risk assessment.
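As a point of comparison for the PBK-based derivations, a screening-level BE for a urinary biomarker can be sketched from a simple steady-state mass balance: at an intake equal to the TDI, the expected urine concentration is the excreted daily mass divided by daily urine volume. All parameter names and values below are hypothetical illustrations, not outputs of the IndusChemFate or High-Throughput Toxicokinetics models discussed in the study:

```python
def biomonitoring_equivalent_urine(tdi_ug_per_kg_day, body_weight_kg,
                                   urinary_excretion_fraction,
                                   urine_volume_l_per_day,
                                   mw_ratio_metabolite_to_parent=1.0):
    """Steady-state mass-balance BE (µg/L) for a urinary biomarker:
    the urine concentration expected when daily intake equals the TDI.
    urinary_excretion_fraction: fraction of the dose excreted in urine
    as the measured biomarker; mw_ratio adjusts for measuring a
    metabolite rather than the parent compound."""
    excreted_ug_per_day = (tdi_ug_per_kg_day * body_weight_kg
                           * urinary_excretion_fraction
                           * mw_ratio_metabolite_to_parent)
    return excreted_ug_per_day / urine_volume_l_per_day

# Hypothetical phthalate-like case: TDI 4 µg/kg bw/day, 70 kg adult,
# 70% of the dose excreted as the urinary metabolite, 1.6 L urine/day
be = biomonitoring_equivalent_urine(4.0, 70.0, 0.7, 1.6)  # → 122.5 µg/L
```

Measured urinary concentrations below such a BE would indicate intakes below the TDI at a screening level; the generic PBK models described above refine this picture by accounting for kinetics and metabolite formation that this static formula ignores.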
Application of PBK models provides a better understanding and interpretation of HBM data. However, the study shows that establishing safety threshold levels in urine is a difficult and complex task. The approach might be more straightforward for more persistent chemicals that are analysed as parent compounds in blood but high uncertainties have to be considered around simulated metabolite concentrations in urine. Refining the models may reduce these uncertainties and improve predictions. Based on the experience gained with this study, the performance of the models for other chemicals could be investigated, to improve the accuracy of the simulations.