In 2008, the National Institute of Environmental Health Sciences/National Toxicology Program, the U.S. Environmental Protection Agency's National Center for Computational Toxicology, and the National Human Genome Research Institute/National Institutes of Health Chemical Genomics Center entered into an agreement on "high throughput screening, toxicity pathway profiling, and biological interpretation of findings." In 2010, the U.S. Food and Drug Administration (FDA) joined the collaboration, known informally as Tox21.
The Tox21 partners agreed to develop a vision and devise an implementation strategy to shift the assessment of chemical hazards away from traditional experimental animal toxicology studies and toward target-specific, mechanism-based biological observations largely obtained using in vitro assays.
Here we outline the efforts of the Tox21 partners up to the time the FDA joined the collaboration, describe the approaches taken to develop the science and technologies that are currently being used, assess the current status, and identify problems that could impede further progress as well as suggest approaches to address those problems.
Tox21 faces some very difficult issues. However, we are making progress in integrating data from diverse technologies and end points into what is effectively a systems-biology approach to toxicology. This can be accomplished only when comprehensive knowledge is obtained with broad coverage of chemical and biological/toxicological space. The efforts thus far reflect the initial stage of an exceedingly complicated program, one that will likely take decades to fully achieve its goals. However, even at this stage, the information obtained has attracted the attention of the international scientific community, and we believe these efforts foretell the future of toxicology.
A recent review by the International Agency for Research on Cancer (IARC) updated the assessments of the >100 agents classified as Group 1, carcinogenic to humans (IARC Monographs Volume 100, parts A-F). This exercise was complicated by the absence of a broadly accepted, systematic method for evaluating mechanistic data to support conclusions regarding human hazard from exposure to carcinogens.
IARC therefore convened two workshops in which an international Working Group of experts identified 10 key characteristics, one or more of which are commonly exhibited by established human carcinogens.
These characteristics provide the basis for an objective approach to identifying and organizing results from pertinent mechanistic studies. The 10 characteristics are the abilities of an agent to 1) act as an electrophile either directly or after metabolic activation; 2) be genotoxic; 3) alter DNA repair or cause genomic instability; 4) induce epigenetic alterations; 5) induce oxidative stress; 6) induce chronic inflammation; 7) be immunosuppressive; 8) modulate receptor-mediated effects; 9) cause immortalization; and 10) alter cell proliferation, cell death, or nutrient supply.
We describe the use of the 10 key characteristics to conduct a systematic literature search focused on relevant end points and construct a graphical representation of the identified mechanistic information. Next, we use benzene and polychlorinated biphenyls as examples to illustrate how this approach may work in practice. The approach described is similar in many respects to those currently being implemented by the U.S. EPA's Integrated Risk Information System Program and the U.S. National Toxicology Program.
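As a rough illustration of how mechanistic findings might be organized under this scheme, the following Python sketch tallies literature results against the 10 key characteristics. The study records and counts are invented for demonstration; this is not the Working Group's actual workflow.

```python
# Minimal sketch: organizing mechanistic literature results by the
# 10 key characteristics of carcinogens. All study records are hypothetical.
from collections import Counter

KEY_CHARACTERISTICS = [
    "electrophilicity (direct or after metabolic activation)",
    "genotoxicity",
    "altered DNA repair / genomic instability",
    "epigenetic alterations",
    "oxidative stress",
    "chronic inflammation",
    "immunosuppression",
    "receptor-mediated effects",
    "immortalization",
    "altered cell proliferation, cell death, or nutrient supply",
]

# Each hypothetical study is tagged with the zero-based indices of the
# characteristics its end points address (e.g., a micronucleus assay
# maps to index 1, genotoxicity).
studies = [
    {"id": "study-001", "agent": "benzene", "characteristics": [1, 4]},
    {"id": "study-002", "agent": "benzene", "characteristics": [4]},
    {"id": "study-003", "agent": "benzene", "characteristics": [6, 9]},
]

# Tally evidence per characteristic to support a graphical summary.
counts = Counter(c for s in studies for c in s["characteristics"])
for i, name in enumerate(KEY_CHARACTERISTICS):
    print(f"{i + 1:2d}. {name}: {counts.get(i, 0)} study(ies)")
```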
Smith MT, Guyton KZ, Gibbons CF, Fritz JM, Portier CJ, Rusyn I, DeMarini DM, Caldwell JC, Kavlock RJ, Lambert P, Hecht SS, Bucher JR, Stewart BW, Baan R, Cogliano VJ, Straif K. 2016. Key characteristics of carcinogens as a basis for organizing data on mechanisms of carcinogenesis. Environ Health Perspect 124:713-721; http://dx.doi.org/10.1289/ehp.1509912.
• The Tox21 effort aims to advance in vitro toxicological testing in the 21st century.
• The Tox21 chemical library contains approximately 10,000 environmental chemicals.
• A battery of in vitro assays will be validated and screened in a qHTS platform.
• The Tox21 robotic system is capable of screening the Tox21 chemical library in triplicate in a week.
Since its establishment in 2008, the US Tox21 inter-agency collaboration has made great progress in developing and evaluating cellular models for profiling environmental chemicals as a proof of principle. Currently, the program has entered its production phase (Tox21 Phase II), focusing initially on the modulation of nuclear receptors and stress response pathways. During Tox21 Phase II, the set of chemicals to be tested has been expanded to nearly 10,000 (10K) compounds, and a fully automated screening platform has been implemented. The Tox21 robotic system, combined with informatics efforts, is capable of screening and profiling the collection of 10K environmental chemicals in triplicate in a week. In this article, we describe the Tox21 screening process, compound library preparation, data processing, and robotic system validation.
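To make the data-processing step concrete, here is a minimal sketch of the kind of analysis a qHTS readout typically undergoes: normalization to plate controls followed by concentration-response (Hill) curve fitting. It is an illustration under stated assumptions, not the Tox21 production pipeline, and all values are synthetic.

```python
# Minimal sketch of qHTS data processing (not the Tox21 production
# pipeline): normalize raw well signals to plate controls, then fit a
# four-parameter Hill curve to a concentration-response series.
import numpy as np
from scipy.optimize import curve_fit

def normalize(raw, neg_ctrl_median, pos_ctrl_median):
    """Percent activity relative to negative (0%) and positive (100%) controls."""
    return 100.0 * (raw - neg_ctrl_median) / (pos_ctrl_median - neg_ctrl_median)

def hill(conc, bottom, top, ac50, slope):
    """Four-parameter Hill equation on a molar concentration scale."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** slope)

# Synthetic 15-point titration spanning several orders of magnitude.
conc = np.logspace(-9, -4, 15)                      # M
true = hill(conc, 0.0, 95.0, 1e-6, 1.2)
resp = true + np.random.default_rng(0).normal(0, 3, conc.size)

popt, _ = curve_fit(hill, conc, resp,
                    p0=[0.0, 100.0, 1e-6, 1.0], maxfev=10000)
print(f"fitted AC50 = {popt[2]:.2e} M, top = {popt[1]:.1f}%")
```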
Changes in chemical regulations worldwide have increased the demand for new data on chemical safety. New approach methodologies (NAMs) are defined broadly here as including in silico approaches and in chemico and in vitro assays, as well as information on chemical exposure in the context of hazard assessment (European Chemicals Agency, "New Approach Methodologies in Regulatory Science," 2016). NAMs for toxicity testing, including alternatives to animal testing approaches, have shown promise to provide a large amount of data to fill information gaps in both hazard and exposure. In order to increase experience with the new data and to advance the applications of NAM data to evaluate the safety of data-poor chemicals, demonstration case studies have to be developed to build confidence in their usability. Case studies can be used to explore the domains of applicability of the NAM data and identify areas that would benefit from further research, development, and application. To ensure that this science evolves with direct input from and engagement by risk managers and regulatory decision makers, a workshop was convened among senior leaders from international regulatory agencies to identify common barriers for using NAMs and to propose next steps to address them. Central to the workshop were a series of collaborative case studies designed to explore areas where the benefits of NAM data could be demonstrated. These included the use of in vitro bioassay data in combination with exposure estimates to derive a quantitative assessment of risk, the use of NAMs for updating chemical categorizations, and the use of NAMs to increase understanding of exposure and human health toxicity of various chemicals. The case study approach proved effective in building collaborations and engagement with regulatory decision makers and in promoting the importance of data and knowledge sharing among international regulatory agencies. The case studies will be continued to explore new ways of describing hazard (i.e., pathway perturbations as a measure of adversity) and new ways of describing risk (i.e., using NAMs to identify protective levels without necessarily being predictive of a specific hazard). Importantly, the case studies also highlighted the need for increased training and communication across the various communities, including risk assessors, regulators, stakeholders (e.g., industry, non-governmental organizations), and the general public. The development and application of NAMs will play an increasing role in filling important data gaps on the safety of chemicals, but confidence in NAMs will only come with learning by doing and sharing in the experience.
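One quantitative use described above, combining in vitro bioactivity with exposure estimates, is often expressed as a bioactivity-exposure ratio (BER). The sketch below computes such a ratio; the dose values, percentile choice, and exposure estimate are all invented for illustration.

```python
# Minimal sketch of a bioactivity-exposure ratio (BER) calculation:
# a conservative in vitro point of departure (here, the 5th percentile
# of oral-equivalent doses across assays) divided by an upper-bound
# exposure estimate. All values are hypothetical.
import numpy as np

oral_equivalent_doses = np.array([12.0, 35.0, 8.0, 50.0, 22.0])  # mg/kg/day
pod_nam = np.percentile(oral_equivalent_doses, 5)                # conservative POD
exposure_upper = 0.01                                            # mg/kg/day (e.g., 95th pct.)

ber = pod_nam / exposure_upper
print(f"POD (NAM) = {pod_nam:.2f} mg/kg/day, BER = {ber:.0f}")
# A large BER suggests a wide margin between bioactivity and exposure;
# a small BER flags a chemical for closer evaluation.
```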
Since 2009, the Tox21 project has screened ∼8500 chemicals in more than 70 high-throughput assays, generating upward of 100 million data points, with all data publicly available through partner websites at the United States Environmental Protection Agency (EPA), National Center for Advancing Translational Sciences (NCATS), and National Toxicology Program (NTP). Underpinning this public effort is the largest compound library ever constructed specifically for improving understanding of the chemical basis of toxicity across research and regulatory domains. Each Tox21 federal partner brought specialized resources and capabilities to the partnership, including three approximately equal-sized compound libraries. All Tox21 data generated to date have resulted from a confluence of ideas, technologies, and expertise used to design, screen, and analyze the Tox21 10K library. The different programmatic objectives of the partners led to three distinct, overlapping compound libraries that, when combined, not only covered a diversity of chemical structures, use-categories, and properties but also incorporated many types of compound replicates. The history of development of the Tox21 “10K” chemical library and data workflows implemented to ensure quality chemical annotations and allow for various reproducibility assessments are described. Cheminformatics profiling demonstrates how the three partner libraries complement one another to expand the reach of each individual library, as reflected in coverage of regulatory lists, predicted toxicity end points, and physicochemical properties. ToxPrint chemotypes (CTs) and enrichment approaches further demonstrate how the combined partner libraries amplify structure–activity patterns that would otherwise not be detected. Finally, CT enrichments are used to probe global patterns of activity in combined ToxCast and Tox21 activity data sets relative to test-set size and chemical versus biological end point diversity, illustrating the power of CT approaches to discern patterns in chemical–activity data sets. These results support a central premise of the Tox21 program: A collaborative merging of programmatically distinct compound libraries would yield greater rewards than could be achieved separately.
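Chemotype (CT) enrichment analyses of the kind mentioned above typically rest on 2x2 contingency statistics. The sketch below applies Fisher's exact test to hypothetical counts; it illustrates the statistical idea rather than reproducing the published ToxPrint workflow.

```python
# Minimal sketch of a chemotype enrichment test (not the published
# ToxPrint workflow): a 2x2 contingency table of chemotype presence
# versus assay activity, scored with Fisher's exact test and an odds
# ratio. Counts are hypothetical.
from scipy.stats import fisher_exact

# Hypothetical counts for one chemotype across a screened library.
ct_active, ct_inactive = 40, 60           # chemotype present
other_active, other_inactive = 200, 1700  # chemotype absent

table = [[ct_active, ct_inactive],
         [other_active, other_inactive]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.2e}")
```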
Background: The prioritization of chemicals for toxicity testing is a primary goal of the U.S. Environmental Protection Agency (EPA) ToxCast™ program. Phase I of ToxCast used a battery of 467 in vitro, high-throughput screening assays to assess 309 environmental chemicals. One important mode of action leading to toxicity is endocrine disruption, and the U.S. EPA's Endocrine Disruptor Screening Program (EDSP) has been charged with screening pesticide chemicals and environmental contaminants for their potential to affect the endocrine systems of humans and wildlife. Objective: The goal of this study was to develop a flexible method to facilitate the rational prioritization of chemicals for further evaluation and demonstrate its application as a candidate decision-support tool for EDSP. Methods: Focusing on estrogen, androgen, and thyroid pathways, we defined putative endocrine profiles and derived a relative rank or score for the entire ToxCast library of 309 unique chemicals. Effects on other nuclear receptors and xenobiotic metabolizing enzymes were also considered, as were pertinent chemical descriptors and pathways relevant to endocrine-mediated signaling. Results: Combining multiple data sources into an overall, weight-of-evidence Toxicological Priority Index (ToxPi) score for prioritizing further chemical testing resulted in more robust conclusions than any single data source taken alone. Conclusions: Incorporating data from in vitro assays, chemical descriptors, and biological pathways in this prioritization schema provided a flexible, comprehensive visualization and ranking of each chemical's potential endocrine activity. Importantly, ToxPi profiles provide a transparent visualization of the relative contribution of all information sources to an overall priority ranking. The method developed here is readily adaptable to diverse chemical prioritization tasks.
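The ToxPi score described above combines normalized data slices with user-assigned weights. The following sketch implements that weighted combination with invented slice values and weights; it is a minimal illustration, not the published ToxPi implementation.

```python
# Minimal sketch of a Toxicological Priority Index (ToxPi)-style score:
# each data "slice" is scaled to [0, 1] across the chemical library,
# then combined as a weighted average. Values and weights are invented.
import numpy as np

# Rows: chemicals; columns: slices (e.g., ER activity, AR activity,
# thyroid-pathway activity), each on its own arbitrary raw scale.
raw = np.array([[0.2, 5.0, 30.0],
                [0.9, 1.0, 10.0],
                [0.5, 8.0, 90.0]])
weights = np.array([2.0, 1.0, 1.0])  # relative slice weights

# Scale each slice to [0, 1] across chemicals (min-max normalization).
lo, hi = raw.min(axis=0), raw.max(axis=0)
scaled = (raw - lo) / (hi - lo)

toxpi = scaled @ weights / weights.sum()
for i, score in enumerate(toxpi):
    print(f"chemical {i}: ToxPi score = {score:.2f}")
```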
The U.S. Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS), and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources toward chemicals that likely represent the greatest hazard to human health and the environment. This chemical prioritization research program, entitled “ToxCast,” is being initiated with the purpose of developing the ability to forecast toxicity based on bioactivity profiling. The proof-of-concept phase of ToxCast will focus upon chemicals with an existing, rich toxicological database in order to provide an interpretive context for the ToxCast data. This set of several hundred reference chemicals will represent numerous structural classes and phenotypic outcomes, including tumorigens, developmental and reproductive toxicants, neurotoxicants, and immunotoxicants. The ToxCast program will evaluate chemical properties and bioactivity profiles across a broad spectrum of data domains: physical-chemical, predicted biological activities based on existing structure-activity models, biochemical properties based on HTS assays, cell-based phenotypic assays, and genomic and metabolomic analyses of cells. These data will be generated through a series of external contracts, along with collaborations across EPA, with the National Toxicology Program, and with the National Institutes of Health Chemical Genomics Center. The resulting multidimensional data set provides an informatics challenge requiring appropriate computational methods for integrating various chemical, biological, and toxicological data into profiles and models predicting toxicity.
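To make the idea of forecasting toxicity from bioactivity profiles concrete, here is a hedged sketch that trains a simple classifier on fully simulated assay profiles; ToxCast's actual models are far more elaborate, and every feature, label, and value below is synthetic.

```python
# Minimal sketch of forecasting a toxicity label from bioactivity
# profiles (not a ToxCast production model): logistic regression on
# synthetic assay-response features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_chem, n_assays = 300, 20
X = rng.normal(size=(n_chem, n_assays))            # synthetic assay-derived features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # synthetic "toxic" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```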
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.
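As a toy example of the kind of quantitative model described, the sketch below integrates a two-variable ODE in which an exposure drives a stress signal that in turn induces a defensive response. The network structure and all parameter values are invented for illustration only.

```python
# Toy systems-toxicology model (structure and parameters invented):
# exposure E drives a stress signal S; S induces a defense response D
# that removes S. Integrated with scipy's initial-value solver.
from scipy.integrate import solve_ivp

E = 1.0  # constant exposure level (arbitrary units)

def rhs(t, y, k_in=1.0, k_clear=0.5, k_ind=0.8, k_deg=0.3):
    s, d = y
    ds = k_in * E - k_clear * s * d   # stress produced by exposure, removed by defense
    dd = k_ind * s - k_deg * d        # defense induced by stress, degraded over time
    return [ds, dd]

sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.1])
s_end, d_end = sol.y[:, -1]
print(f"near-steady state: stress = {s_end:.2f}, defense = {d_end:.2f}")
```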
High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, clearance, and exposure. Hepatic metabolic clearance and plasma protein binding were experimentally measured for 239 ToxCast Phase I chemicals. The experimental data were used in a population-based in vitro-to-in vivo extrapolation model to estimate the daily human oral dose, called the oral equivalent dose, necessary to produce steady-state in vivo blood concentrations equivalent to in vitro AC50 (concentration at 50% of maximum activity) or lowest effective concentration values across more than 500 in vitro assays. The estimated steady-state oral equivalent doses associated with the in vitro assays were compared with chronic aggregate human oral exposure estimates to assess whether in vitro bioactivity would be expected at the dose-equivalent level of human exposure. A total of 18 (9.9%) chemicals for which human oral exposure estimates were available had oral equivalent doses at levels equal to or less than the highest estimated U.S. population exposures. Ranking the chemicals by nominal assay concentrations would have resulted in different chemicals being prioritized. The in vitro assay endpoints with oral equivalent doses lower than the human exposure estimates included cell growth kinetics, cytokine and cytochrome P450 expression, and cytochrome P450 inhibition. The incorporation of dosimetry and exposure provide necessary context for interpretation of in vitro toxicity screening data and are important considerations in determining chemical testing priorities.
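The reverse-dosimetry step described above can be sketched with a simplified steady-state model. The version below uses the standard well-stirred liver model plus renal filtration of the unbound fraction; it is a minimal illustration with placeholder parameter values, not the authors' published population-based model.

```python
# Minimal sketch of reverse dosimetry (not the published population-based
# model): estimate the steady-state plasma concentration (Css) produced by
# a unit oral dose, then scale an in vitro AC50 to an oral equivalent dose.
# Physiological constants are approximate; chemical values are hypothetical.

def hepatic_clearance(q_liver, fub, cl_int):
    """Well-stirred liver model: hepatic clearance (L/h/kg)."""
    return q_liver * fub * cl_int / (q_liver + fub * cl_int)

def css_per_unit_dose(mw, fub, cl_int,
                      q_liver=1.24,   # hepatic blood flow, L/h/kg (approx. human)
                      gfr=0.107):     # glomerular filtration rate, L/h/kg (approx.)
    """Css (uM) at a constant oral dose rate of 1 mg/kg/day, assuming
    100% oral absorption and renal clearance = GFR * fub."""
    cl_total = hepatic_clearance(q_liver, fub, cl_int) + gfr * fub  # L/h/kg
    dose_rate = 1.0 / 24.0             # mg/kg/h for a 1 mg/kg/day dose
    css_mg_per_l = dose_rate / cl_total
    return css_mg_per_l / mw * 1000.0  # mg/L -> umol/L

def oral_equivalent_dose(ac50_um, css_unit_um):
    """Dose (mg/kg/day) producing a Css equal to the in vitro AC50."""
    return ac50_um / css_unit_um

# Example with made-up values: MW 250 g/mol, 5% unbound, CLint 2 L/h/kg.
css1 = css_per_unit_dose(mw=250.0, fub=0.05, cl_int=2.0)
print(f"Css at 1 mg/kg/day: {css1:.2f} uM")
print(f"OED for AC50 = 10 uM: {oral_equivalent_dose(10.0, css1):.1f} mg/kg/day")
```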
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and the contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the Toxicity Reference Database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments and on developing complementary higher-throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.