In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency's National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health Chemical Genomics Center entered into an agreement on "high throughput screening, toxicity pathway profiling, and biological interpretation of findings." In 2010, the U.S. Food and Drug Administration (FDA) joined the collaboration, known informally as Tox21.
The Tox21 partners agreed to develop a vision and devise an implementation strategy to shift the assessment of chemical hazards away from traditional experimental animal toxicology studies toward target-specific, mechanism-based biological observations obtained largely through in vitro assays.
Here we outline the efforts of the Tox21 partners up to the time the FDA joined the collaboration, describe the approaches taken to develop the science and technologies that are currently being used, assess the current status, and identify problems that could impede further progress as well as suggest approaches to address those problems.
Tox21 faces some very difficult issues. However, we are making progress in integrating data from diverse technologies and end points into what is effectively a systems-biology approach to toxicology. This can be accomplished only when comprehensive knowledge is obtained with broad coverage of chemical and biological/toxicological space. The efforts thus far reflect the initial stage of an exceedingly complicated program, one that will likely take decades to fully achieve its goals. However, even at this stage, the information obtained has attracted the attention of the international scientific community, and we believe these efforts foretell the future of toxicology.
A recent review by the International Agency for Research on Cancer (IARC) updated the assessments of the >100 agents classified as Group 1, carcinogenic to humans (IARC Monographs Volume 100, parts A-F). This exercise was complicated by the absence of a broadly accepted, systematic method for evaluating mechanistic data to support conclusions regarding human hazard from exposure to carcinogens.
IARC therefore convened two workshops in which an international Working Group of experts identified 10 key characteristics, one or more of which are commonly exhibited by established human carcinogens.
These characteristics provide the basis for an objective approach to identifying and organizing results from pertinent mechanistic studies. The 10 characteristics are the abilities of an agent to 1) act as an electrophile either directly or after metabolic activation; 2) be genotoxic; 3) alter DNA repair or cause genomic instability; 4) induce epigenetic alterations; 5) induce oxidative stress; 6) induce chronic inflammation; 7) be immunosuppressive; 8) modulate receptor-mediated effects; 9) cause immortalization; and 10) alter cell proliferation, cell death, or nutrient supply.
We describe the use of the 10 key characteristics to conduct a systematic literature search focused on relevant end points and construct a graphical representation of the identified mechanistic information. Next, we use benzene and polychlorinated biphenyls as examples to illustrate how this approach may work in practice. The approach described is similar in many respects to those currently being implemented by the U.S. EPA's Integrated Risk Information System Program and the U.S. National Toxicology Program.
Smith MT, Guyton KZ, Gibbons CF, Fritz JM, Portier CJ, Rusyn I, DeMarini DM, Caldwell JC, Kavlock RJ, Lambert P, Hecht SS, Bucher JR, Stewart BW, Baan R, Cogliano VJ, Straif K. 2016. Key characteristics of carcinogens as a basis for organizing data on mechanisms of carcinogenesis. Environ Health Perspect 124:713-721; http://dx.doi.org/10.1289/ehp.1509912.
• The Tox21 effort is to advance in vitro toxicological testing in the 21st century.
• The Tox21 chemical library contains approximately 10,000 environmental chemicals.
• A battery of in vitro assays will be validated and screened in a qHTS platform.
• The Tox21 robotic system is capable of screening the Tox21 chemical library in triplicate in a week.
Since its establishment in 2008, the US Tox21 inter-agency collaboration has made great progress in developing and evaluating cellular models for the evaluation of environmental chemicals as a proof of principle. Currently, the program has entered its production phase (Tox21 Phase II) focusing initially on the areas of modulation of nuclear receptors and stress response pathways. During Tox21 Phase II, the set of chemicals to be tested has been expanded to nearly 10,000 (10K) compounds and a fully automated screening platform has been implemented. The Tox21 robotic system combined with informatics efforts is capable of screening and profiling the collection of 10K environmental chemicals in triplicate in a week. In this article, we describe the Tox21 screening process, compound library preparation, data processing, and robotic system validation.
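The data-processing step mentioned above reduces each compound's qHTS concentration–response series to a potency estimate, conventionally by fitting a Hill model. The sketch below illustrates that idea with a deliberately crude grid-search fit; the Hill equation is standard, but the function names, grid ranges, and fixed top asymptote are illustrative assumptions, not the Tox21 production pipeline.

```python
import math

def hill(conc, top, ac50, n):
    """Hill concentration-response curve: activity at concentration `conc` (M)."""
    return top / (1.0 + (ac50 / conc) ** n)

def fit_ac50(concs, responses, top=100.0):
    """Coarse log-grid least-squares fit of AC50 and Hill slope (illustrative only)."""
    best_ac50, best_n, best_sse = None, None, float("inf")
    for i in range(-90, 11):                 # log10(AC50) from -9.0 to 1.0, 0.1 steps
        ac50 = 10.0 ** (i / 10.0)
        for j in range(5, 41):               # Hill slope from 0.5 to 4.0, 0.1 steps
            n = j / 10.0
            sse = sum((hill(c, top, ac50, n) - r) ** 2
                      for c, r in zip(concs, responses))
            if sse < best_sse:
                best_ac50, best_n, best_sse = ac50, n, sse
    return best_ac50, best_n

# Hypothetical 15-point dilution series generated from a known curve (AC50 = 1 uM).
concs = [10.0 ** (e / 2.0) for e in range(-18, -3)]     # ~1 nM to ~10 uM range
responses = [hill(c, 100.0, 1.0e-6, 1.0) for c in concs]
fitted_ac50, fitted_slope = fit_ac50(concs, responses)
```

In practice a proper nonlinear optimizer (and curve-class flags for noisy or partial curves) replaces the grid search; the grid version is only meant to make the arithmetic transparent.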
Background: Thirty years of pesticide registration toxicity data have been historically stored as hardcopy and scanned documents by the U.S. Environmental Protection Agency (EPA). A significant portion of these data have now been processed into standardized and structured toxicity data within the EPA's Toxicity Reference Database (ToxRefDB), including chronic, cancer, developmental, and reproductive studies from laboratory animals. These data are now accessible and mineable within ToxRefDB and are serving as a primary source of validation for U.S. EPA's ToxCast research program in predictive toxicology. Objectives: We profiled in vivo toxicities across 310 chemicals as a model application of ToxRefDB, meeting the need for detailed anchoring end points for development of ToxCast predictive signatures. Methods: Using query and structured data-mining approaches, we generated toxicity profiles from ToxRefDB based on long-term rodent bioassays. These chronic/cancer data were analyzed for suitability as anchoring end points based on incidence, target organ, severity, potency, and significance. Results: Under conditions of the bioassays, we observed pathologies for 273 of 310 chemicals, with greater preponderance (>90%) occurring in the liver, kidney, thyroid, lung, testis, and spleen. We observed proliferative lesions for 225 chemicals, and 167 chemicals caused progression to cancer-related pathologies. Conclusions: Based on incidence, severity, and potency, we selected 26 primarily tissue-specific pathology end points to uniformly classify the 310 chemicals. The resulting toxicity profile classifications demonstrate the utility of structuring legacy toxicity information and facilitating the computation of these data within ToxRefDB for ToxCast and other applications.
High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, clearance, and exposure. Hepatic metabolic clearance and plasma protein binding were experimentally measured for 239 ToxCast Phase I chemicals. The experimental data were used in a population-based in vitro-to-in vivo extrapolation model to estimate the daily human oral dose, called the oral equivalent dose, necessary to produce steady-state in vivo blood concentrations equivalent to in vitro AC50 (concentration at 50% of maximum activity) or lowest effective concentration values across more than 500 in vitro assays. The estimated steady-state oral equivalent doses associated with the in vitro assays were compared with chronic aggregate human oral exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. A total of 18 (9.9%) chemicals for which human oral exposure estimates were available had oral equivalent doses at levels equal to or less than the highest estimated U.S. population exposures. Ranking the chemicals by nominal assay concentrations would have resulted in different chemicals being prioritized. The in vitro assay endpoints with oral equivalent doses lower than the human exposure estimates included cell growth kinetics, cytokine and cytochrome P450 expression, and cytochrome P450 inhibition. The incorporation of dosimetry and exposure provides necessary context for the interpretation of in vitro toxicity screening data and is an important consideration in determining chemical testing priorities.
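The reverse-dosimetry arithmetic behind an oral equivalent dose can be sketched under strong simplifying assumptions (linear one-compartment kinetics with a known total clearance); the study itself used a population-based IVIVE model with measured clearance and protein binding, and all parameter values below are hypothetical.

```python
def steady_state_css(dose_mg_per_kg_day, body_weight_kg, clearance_l_per_day):
    """Steady-state blood concentration (mg/L) for a chronic oral dose,
    assuming linear one-compartment kinetics and complete absorption."""
    return dose_mg_per_kg_day * body_weight_kg / clearance_l_per_day

def oral_equivalent_dose(ac50_micromolar, mol_weight, css_at_unit_dose):
    """Daily oral dose (mg/kg/day) whose steady-state blood concentration
    equals the in vitro AC50, by linear scaling from the Css at 1 mg/kg/day."""
    ac50_mg_per_l = ac50_micromolar * mol_weight / 1000.0   # uM -> mg/L
    return ac50_mg_per_l / css_at_unit_dose

# Hypothetical chemical: AC50 = 1 uM, MW = 300 g/mol, 70-kg adult, CL = 140 L/day.
css_unit = steady_state_css(1.0, 70.0, 140.0)   # Css per 1 mg/kg/day dose
oed = oral_equivalent_dose(1.0, 300.0, css_unit)
```

Because the kinetics are assumed linear, Css scales proportionally with dose, which is why a single Css computed at 1 mg/kg/day suffices for the extrapolation.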
Background: The prioritization of chemicals for toxicity testing is a primary goal of the U.S. Environmental Protection Agency (EPA) ToxCast™ program. Phase I of ToxCast used a battery of 467 in vitro, high-throughput screening assays to assess 309 environmental chemicals. One important mode of action leading to toxicity is endocrine disruption, and the U.S. EPA's Endocrine Disruptor Screening Program (EDSP) has been charged with screening pesticide chemicals and environmental contaminants for their potential to affect the endocrine systems of humans and wildlife. Objective: The goal of this study was to develop a flexible method to facilitate the rational prioritization of chemicals for further evaluation and demonstrate its application as a candidate decision-support tool for EDSP. Methods: Focusing on estrogen, androgen, and thyroid pathways, we defined putative endocrine profiles and derived a relative rank or score for the entire ToxCast library of 309 unique chemicals. Effects on other nuclear receptors and xenobiotic metabolizing enzymes were also considered, as were pertinent chemical descriptors and pathways relevant to endocrine-mediated signaling. Results: Combining multiple data sources into an overall, weight-of-evidence Toxicological Priority Index (ToxPi) score for prioritizing further chemical testing resulted in more robust conclusions than any single data source taken alone. Conclusions: Incorporating data from in vitro assays, chemical descriptors, and biological pathways in this prioritization schema provided a flexible, comprehensive visualization and ranking of each chemical's potential endocrine activity. Importantly, ToxPi profiles provide a transparent visualization of the relative contribution of all information sources to an overall priority ranking. The method developed here is readily adaptable to diverse chemical prioritization tasks.
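At its core, a ToxPi-style score is a weighted combination of unit-normalized data slices, and chemicals are ranked by the resulting overall score. A minimal sketch, with hypothetical slice values and weights (the published method also handles within-slice aggregation and visualization, which are omitted here):

```python
def toxpi_score(slices, weights):
    """Weighted average of unit-normalized data slices (0 = inactive, 1 = maximal)."""
    return sum(s * w for s, w in zip(slices, weights)) / sum(weights)

def rank_chemicals(profiles, weights):
    """Order chemical names by descending overall score."""
    return sorted(profiles,
                  key=lambda name: toxpi_score(profiles[name], weights),
                  reverse=True)

# Hypothetical slices: [estrogen pathway, androgen pathway, thyroid pathway],
# with the estrogen slice weighted double for illustration.
weights = [2.0, 1.0, 1.0]
profiles = {
    "chem_a": [0.9, 0.1, 0.0],
    "chem_b": [0.2, 0.8, 0.5],
}
ranking = rank_chemicals(profiles, weights)
```

The transparency claim in the abstract follows directly from this structure: each slice's contribution to the overall score is just its weight times its normalized value, so the provenance of a high rank is always inspectable.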
Changes in chemical regulations worldwide have increased the demand for new data on chemical safety. New approach methodologies (NAMs) are defined broadly here as including in silico approaches and in chemico and in vitro assays, as well as the inclusion of information from the exposure of chemicals in the context of hazard (European Chemicals Agency, "New Approach Methodologies in Regulatory Science", 2016). NAMs for toxicity testing, including alternatives to animal testing approaches, have shown promise to provide a large amount of data to fill information gaps in both hazard and exposure. In order to increase experience with the new data and to advance the applications of NAM data to evaluate the safety of data-poor chemicals, demonstration case studies have to be developed to build confidence in their usability. Case studies can be used to explore the domains of applicability of the NAM data and identify areas that would benefit from further research, development, and application. To ensure that this science evolves with direct input from and engagement by risk managers and regulatory decision makers, a workshop was convened among senior leaders from international regulatory agencies to identify common barriers for using NAMs and to propose next steps to address them. Central to the workshop were a series of collaborative case studies designed to explore areas where the benefits of NAM data could be demonstrated. These included use of in vitro bioassay data in combination with exposure estimates to derive a quantitative assessment of risk, use of NAMs for updating chemical categorizations, and use of NAMs to increase understanding of exposure and human health toxicity of various chemicals. The case study approach proved effective in building collaborations and engagement with regulatory decision makers and in promoting the importance of data and knowledge sharing among international regulatory agencies.
The case studies will be continued to explore new ways of describing hazard (i.e., pathway perturbations as a measure of adversity) and new ways of describing risk (i.e., using NAMs to identify protective levels without necessarily being predictive of a specific hazard). Importantly, the case studies also highlighted the need for increased training and communication across the various communities including the risk assessors, regulators, stakeholders (e.g., industry, non-governmental organizations), and the general public. The development and application of NAMs will play an increasing role in filling important data gaps on the safety of chemicals, but confidence in NAMs will only come with learning by doing and sharing in the experience.
The U.S. Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS), and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. This chemical prioritization research program, entitled "ToxCast," is being initiated with the purpose of developing the ability to forecast toxicity based on bioactivity profiling. The proof-of-concept phase of ToxCast will focus upon chemicals with an existing, rich toxicological database in order to provide an interpretive context for the ToxCast data. This set of several hundred reference chemicals will represent numerous structural classes and phenotypic outcomes, including tumorigens, developmental and reproductive toxicants, neurotoxicants, and immunotoxicants. The ToxCast program will evaluate chemical properties and bioactivity profiles across a broad spectrum of data domains: physical-chemical, predicted biological activities based on existing structure-activity models, biochemical properties based on HTS assays, cell-based phenotypic assays, and genomic and metabolomic analyses of cells. These data will be generated through a series of external contracts, along with collaborations across EPA, with the National Toxicology Program, and with the National Institutes of Health Chemical Genomics Center. The resulting multidimensional data set provides an informatics challenge requiring appropriate computational methods for integrating various chemical, biological, and toxicological data into profiles and models predicting toxicity.
Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxicity across research and regulatory domains. Each Tox21 federal partner brought specialized resources and capabilities to the partnership, including three approximately equal-sized compound libraries. All Tox21 data generated to date have resulted from a confluence of ideas, technologies, and expertise used to design, screen, and analyze the Tox21 10K library. The different programmatic objectives of the partners led to three distinct, overlapping compound libraries that, when combined, not only covered a diversity of chemical structures, use-categories, and properties but also incorporated many types of compound replicates. The history of development of the Tox21 "10K" chemical library and the data workflows implemented to ensure quality chemical annotations and allow for various reproducibility assessments are described. Cheminformatics profiling demonstrates how the three partner libraries complement one another to expand the reach of each individual library, as reflected in coverage of regulatory lists, predicted toxicity end points, and physicochemical properties. ToxPrint chemotypes (CTs) and enrichment approaches further demonstrate how the combined partner libraries amplify structure–activity patterns that would otherwise not be detected.
Finally, CT enrichments are used to probe global patterns of activity in combined ToxCast and Tox21 activity data sets relative to test-set size and chemical versus biological end point diversity, illustrating the power of CT approaches to discern patterns in chemical–activity data sets. These results support a central premise of the Tox21 program: A collaborative merging of programmatically distinct compound libraries would yield greater rewards than could be achieved separately.
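Chemotype enrichment of the kind described can be quantified from a 2x2 contingency table of chemotype presence versus assay activity; a minimal odds-ratio sketch with illustrative counts follows (in practice a significance test such as Fisher's exact test accompanies the ratio, which is omitted here for brevity).

```python
def ct_odds_ratio(active_with_ct, inactive_with_ct,
                  active_without_ct, inactive_without_ct):
    """Odds ratio for a chemotype's association with assay activity,
    from 2x2 counts of (chemotype present/absent) x (active/inactive).
    Applies a 0.5 continuity correction when any cell is zero."""
    a, b = active_with_ct, inactive_with_ct
    c, d = active_without_ct, inactive_without_ct
    if min(a, b, c, d) == 0:
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return (a * d) / (b * c)

# Hypothetical counts: 30 of 40 chemicals bearing the chemotype are active,
# versus 20 of 60 chemicals without it.
odds = ct_odds_ratio(30, 10, 20, 40)
```

An odds ratio well above 1 for a chemotype, sustained across related assay end points, is the kind of structure–activity pattern the combined partner libraries make easier to detect than any single library alone.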
Background: The large and increasing number of chemicals released into the environment demands more efficient and cost-effective approaches for assessing environmental chemical toxicity. The U.S. Tox21 program has responded to this challenge by proposing alternative strategies for toxicity testing, among which the quantitative high-throughput screening (qHTS) paradigm has been adopted as the primary tool for generating data from screening large chemical libraries using a wide spectrum of assays. Objectives: The goal of this study was to develop methods to evaluate the data generated from these assays to guide future assay selection and prioritization for the Tox21 program. Methods: We examined the data from the Tox21 pilot-phase collection of approximately 3,000 environmental chemicals profiled in qHTS format against a panel of 10 human nuclear receptors (AR, ERα, FXR, GR, LXRβ, PPARγ, PPARδ, RXRα, TRβ, and VDR) for reproducibility, concordance of biological activity profiles with sequence homology of the receptor ligand binding domains, and structure-activity relationships. Results: We determined the assays to be appropriate in terms of biological relevance. We found better concordance for replicate compounds for the agonist-mode than for the antagonist-mode assays, likely due to interference of cytotoxicity in the latter assays. This exercise also enabled us to formulate data-driven strategies for discriminating true signals from artifacts, and to prioritize assays based on data quality. Conclusions: The results demonstrate the feasibility of qHTS to identify the potential for environmentally relevant chemicals to interact with key toxicity pathways related to human disease induction.