Statistical-based and expert rule-based models built using public domain mutagenicity knowledge and data are routinely used for computational (Q)SAR assessments of pharmaceutical impurities in line with the approach recommended in the ICH M7 guideline. Knowledge from proprietary corporate mutagenicity databases could be used to increase the predictive performance for selected chemical classes as well as expand the applicability domain of these (Q)SAR models. This paper outlines a mechanism for sharing knowledge without the release of proprietary data. Primary aromatic amine mutagenicity was selected as a case study because this chemical class is often encountered in pharmaceutical impurity analysis and mutagenicity of aromatic amines is currently difficult to predict. As part of this analysis, a series of aromatic amine substructures were defined and the number of mutagenic and non-mutagenic examples for each chemical substructure calculated across a series of public and proprietary mutagenicity databases. This information was pooled across all sources to identify structural classes that activate or deactivate aromatic amine mutagenicity. This structure-activity knowledge, in combination with newly released primary aromatic amine data, was incorporated into Leadscope's expert rule-based and statistical-based (Q)SAR models, where increased predictive performance was demonstrated.
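The pooling step described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the consortium's actual workflow: each source contributes only per-substructure counts of mutagenic and non-mutagenic examples (no proprietary structures), the counts are merged, and substructures are labeled as activating or deactivating by their pooled mutagenic fraction. The substructure names, thresholds, and minimum-example cutoff below are illustrative assumptions.

```python
# Hypothetical sketch of sharing SAR knowledge via aggregate counts:
# each database releases only {substructure: (n_mutagenic, n_nonmutagenic)},
# never the underlying proprietary structures.
from collections import defaultdict


def pool_counts(sources):
    """Merge per-substructure (mutagenic, non-mutagenic) counts from several sources."""
    pooled = defaultdict(lambda: [0, 0])
    for source in sources:
        for substructure, (pos, neg) in source.items():
            pooled[substructure][0] += pos
            pooled[substructure][1] += neg
    return dict(pooled)


def classify(pooled, min_examples=10, threshold=0.7):
    """Label substructures by pooled mutagenic fraction (illustrative cutoffs)."""
    labels = {}
    for sub, (pos, neg) in pooled.items():
        total = pos + neg
        if total < min_examples:
            labels[sub] = "inconclusive"      # too few pooled examples to judge
        elif pos / total >= threshold:
            labels[sub] = "activating"        # mostly mutagenic examples
        elif pos / total <= 1 - threshold:
            labels[sub] = "deactivating"      # mostly non-mutagenic examples
        else:
            labels[sub] = "equivocal"
    return labels


# Two hypothetical sources contribute counts for illustrative substructure classes.
pooled = pool_counts([
    {"para-substituted": (8, 1)},
    {"para-substituted": (5, 0), "ortho-disubstituted": (1, 9)},
])
labels = classify(pooled)
```

The point of the design is that only the aggregate "SAR fingerprint" (counts per predefined substructure) crosses organizational boundaries, which is what allows the knowledge to be shared without releasing proprietary data.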
•Describes how proprietary databases can be used to improve public (Q)SAR models.
•Outlines a SAR fingerprint methodology for sharing SAR knowledge.
•Describes a case study to improve primary aromatic amine mutagenicity prediction.
•Lists features that activate and deactivate aromatic amine mutagenicity.
The InnoMed PredTox consortium was formed to evaluate whether conventional preclinical safety assessment can be significantly enhanced by incorporation of molecular profiling ("omics") technologies. In short-term toxicological studies in rats, transcriptomics, proteomics and metabolomics data were collected and analyzed in relation to routine clinical chemistry and histopathology. Four of the sixteen hepato- and/or nephrotoxicants given to rats for 1, 3, or 14 days at two dose levels induced similar histopathological effects. These were characterized by bile duct necrosis and hyperplasia and/or increased bilirubin and cholestasis, in addition to hepatocyte necrosis and regeneration, hepatocyte hypertrophy, and hepatic inflammation. Combined analysis of liver transcriptomics data from these studies revealed common gene expression changes which allowed the development of a potential sequence of events on a mechanistic level in accordance with classical endpoint observations. This included genes implicated in early stress responses, regenerative processes, inflammation with inflammatory cell immigration, fibrotic processes, and cholestasis encompassing deregulation of certain membrane transporters. Furthermore, a preliminary classification analysis using transcriptomics data suggested that prediction of cholestasis may be possible based on gene expression changes seen at earlier time-points. Targeted bile acid analysis, based on LC-MS metabonomics data demonstrating increased levels of conjugated or unconjugated bile acids in response to individual compounds, did not provide earlier detection of toxicity as compared to conventional parameters, but may allow distinction of different types of hepatobiliary toxicity. Overall, liver transcriptomics data delivered mechanistic and molecular details in addition to the classical endpoint observations, which were further enhanced by targeted bile acid analysis using LC-MS metabonomics.
•Summary of the biological mechanisms and processes underpinning hepatotoxicity.
•Description of experimental approaches to support the prediction of hepatotoxicity.
•Discussion of the role of in silico approaches highlighting challenges to the adoption of these methods.
•Proposed framework for the integration of in silico and experimental information.
Hepatotoxicity is one of the most frequently observed adverse effects resulting from exposure to a xenobiotic. For example, in pharmaceutical research and development it is one of the major reasons for drug withdrawals, clinical failures, and discontinuation of drug candidates. The development of faster and cheaper methods to assess hepatotoxicity that are both more sustainable and more informative is critically needed. The biological mechanisms and processes underpinning hepatotoxicity are summarized and experimental approaches to support the prediction of hepatotoxicity are described, including toxicokinetic considerations. The paper describes the increasingly important role of in silico approaches and highlights challenges to the adoption of these methods including the lack of a commonly agreed upon protocol for performing such an assessment and the need for in silico solutions that take dose into consideration. A proposed framework for the integration of in silico and experimental information is provided along with a case study describing how computational methods have been used to successfully respond to a regulatory question concerning non-genotoxic impurities in chemically synthesized pharmaceuticals.
•Key characteristics/mechanisms of heart, lung and kidney toxicity are presented.
•Computational methods used to predict such organ toxicity are discussed.
•Data gaps and challenges for the prediction of organ toxicity are reviewed.
The kidneys, heart and lungs are vital organ systems evaluated as part of acute or chronic toxicity assessments. New methodologies are being developed to predict these adverse effects based on in vitro and in silico approaches. This paper reviews the current state of the art in predicting these organ toxicities. It outlines the biological basis, processes and endpoints for kidney toxicity, pulmonary toxicity, respiratory irritation and sensitization, as well as functional and structural cardiac toxicities. The review also covers current experimental approaches, including off-target panels from secondary pharmacology batteries. Current in silico approaches for prediction of these effects and mechanisms are described, as well as obstacles to the use of in silico methods. Ultimately, a commonly accepted protocol for performing such assessments would be a valuable resource to expand the use of such approaches across different regulatory and industrial applications. However, a number of factors impede their widespread deployment, including a lack of comprehensive mechanistic understanding, limited in vitro testing approaches and in vivo databases suitable for modeling, a limited understanding of how to incorporate absorption, distribution, metabolism, and excretion (ADME) considerations into the overall process, a lack of in silico models designed to predict a safe dose, and the absence of an accepted framework for organizing the key characteristics of these organ toxicants.
•Summarizes the 10 key characteristics (KCs) of carcinogens.
•Assesses how current in silico methods address each of the KCs.
•Indicates where experimental methods need to be implemented and robust databases generated.
•Highlights interactions among the KCs with the different stages of carcinogenesis.
Historically, identifying carcinogens has relied primarily on tumor studies in rodents, which require enormous resources in both money and time. In silico models have been developed for predicting rodent carcinogens but have not yet found general regulatory acceptance, in part due to the lack of a generally accepted protocol for performing such an assessment as well as limitations in predictive performance and scope. There remains a need for additional, improved in silico carcinogenicity models, especially ones that are more human-relevant, for use in research and regulatory decision-making. As part of an international effort to develop in silico toxicological protocols, a consortium of toxicologists, computational scientists, and regulatory scientists across several industries and governmental agencies evaluated the extent to which in silico models exist for each of the recently defined 10 key characteristics (KCs) of carcinogens. This position paper summarizes the current status of in silico tools for the assessment of each KC and identifies the data gaps that need to be addressed before a comprehensive in silico carcinogenicity protocol can be developed for regulatory use.
Acute toxicity in silico models are being used to support an increasing number of application areas including (1) product research and development, (2) product approval and registration, as well as (3) the transport, storage and handling of chemicals. The adoption of such models is being hindered, in part, by a lack of guidance describing how to perform and document an in silico analysis. To address this issue, a framework for an acute toxicity hazard assessment is proposed. This framework combines results from different sources including in silico methods and in vitro or in vivo experiments. In silico methods that can assist the prediction of in vivo outcomes (i.e., LD50) are analyzed; this analysis concludes that predictions obtained using in silico approaches are now well-suited to reliably supporting assessment of LD50-based acute toxicity for the purpose of Globally Harmonized System (GHS) classification. A general overview is provided of the endpoints from in vitro studies commonly evaluated for predicting acute toxicity (e.g., cytotoxicity/cytolethality as well as assays targeting specific mechanisms). The increased understanding of pathways and key triggering mechanisms underlying toxicity and the increased availability of in vitro data allow for a shift away from assessments solely based on endpoints such as LD50, to mechanism-based endpoints that can be accurately assessed in vitro or by using in silico prediction models. This paper also highlights the importance of an expert review of all available information using weight-of-evidence considerations and illustrates, using a series of diverse practical use cases, how in silico approaches support the assessment of acute toxicity.
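The LD50-to-GHS mapping mentioned above can be sketched as a simple lookup against the published GHS acute oral toxicity cut-offs (mg/kg body weight). This is an illustrative sketch only, not regulatory guidance, and it ignores the weight-of-evidence and expert-review steps the framework emphasizes:

```python
# Illustrative mapping of a predicted oral LD50 (mg/kg bw) to a GHS acute
# oral toxicity category, using the published GHS cut-off values.
# A real assessment would combine this with in vitro data and expert review.
def ghs_acute_oral_category(ld50_mg_per_kg):
    if ld50_mg_per_kg <= 5:
        return "Category 1"
    if ld50_mg_per_kg <= 50:
        return "Category 2"
    if ld50_mg_per_kg <= 300:
        return "Category 3"
    if ld50_mg_per_kg <= 2000:
        return "Category 4"
    if ld50_mg_per_kg <= 5000:
        return "Category 5"
    return "Not classified"
```

In practice a predicted LD50 carries uncertainty, so an in silico assessment would typically report the category alongside the prediction's confidence and applicability-domain information rather than a bare label.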
•Data sharing helps to challenge and clarify the rationale for mutagenic activity.
•Reasons for conflicting in silico and in vitro mutagenicity calls are examined.
•Expert analysis should be applied equally to in silico tools and in vitro assays.
Primary aromatic amines (pAAs) are attractive building blocks in medicinal chemistry programmes, yet their potential for mutagenic activity causes real concern owing to the risk of genotoxicity-related drug attrition. In addition, despite the existence of a substantial body of experimental data, the prediction of aromatic amine mutagenicity still poses a significant challenge for in silico tools. Major contributors to this dilemma are the stability and physicochemical properties of a subset of aromatic amines that afford them capricious mutagenic properties in the Ames test. Such inconsistent mutagenic potential is further compounded by the inherent variability of the assay itself and underscores the need for a rigorous approach in executing the experimental protocol. In order to understand the utility of the in silico approach towards the prediction of pAA mutagenicity and to widen the availability of mutagenicity data, a group of pharmaceutical companies has formed a consortium with the aim of exchanging their in-house data and making them publicly available for the first time. Summary data compiled during the first phase of this effort are disclosed here and their utility in conjunction with in silico prediction is discussed. Conclusions from this analysis highlight the critical role of expert judgement in rationalizing the experimental activity seen in the Ames test with predictions from in silico models. This collaboration demonstrates the value of sharing such data pre-competitively, aiding the selection of Ames-negative building blocks for drug development while simultaneously helping to develop better in silico tools.