NMR and MS Methods for Metabonomics
Dieterle, Frank; Riefke, Björn; Schlotterbeck, Götz; ...
Drug Safety Evaluation, 2011, vol. 691
Book Chapter, Journal Article
Metabonomics, also often referred to as “metabolomics” or “metabolic profiling,” is the systematic profiling of metabolites in biofluids or tissues of organisms and of their temporal changes. In the last decade, metabonomics has become increasingly popular in drug development, molecular medicine, and other biotechnology fields, since, in contrast to other “-omics” technologies, it directly profiles the phenotype and changes thereof. The increasing popularity of metabonomics has been possible only because of enormous developments in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabonomics, i.e., NMR, LC-MS, UPLC-MS, and GC-MS, have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabonomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, from sample preparation, through the measurement details of all analytical platforms, to the corresponding specific steps of data analysis.
• Application of ICH M7 recommendations to structure-based assessments.
• Place/use of in silico methodologies (rule-based and (Q)SAR).
• Clear definition and scope of expert knowledge.
• Current industry practice in major EFPIA and PhRMA companies.
• Industry surveys comparing methodologies complementing the rule-based system.
Genotoxicity hazard identification is part of the impurity qualification process for drug substances and products, the first step of which is the prediction of their potential DNA reactivity using in silico (quantitative) structure–activity relationship (Q)SAR models/systems. This white paper provides information relevant to the development of the draft harmonized tripartite guideline ICH M7 on potentially DNA-reactive/mutagenic impurities in pharmaceuticals and its application in practice. It explains relevant (Q)SAR methodologies as well as the added value of expert knowledge. Moreover, the predictive value of the different methodologies is compared using two surveys conducted in the US and European pharmaceutical industry: most pharmaceutical companies used a rule-based expert system as their primary methodology, yielding negative predictivity values of ≥78% in all participating companies. A further increase (>90%) was often achieved by an additional expert review and/or a second (Q)SAR methodology. Also in the latter case, an expert review was mandatory, especially when conflicting results were obtained. Based on the available data, we concluded that a rule-based expert system complemented by either expert knowledge or a second (Q)SAR model is appropriate. Maximal transparency of the assessment process (e.g., methods, results, arguments of the weight-of-evidence approach), achieved for example through data-sharing initiatives and the use of reporting standards, will enable regulators to fully understand the results of the analysis. Overall, the procedures presented here for structure-based assessment are considered appropriate for regulatory submissions within the scope of ICH M7.
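The negative predictivity figures quoted above (≥78% for a rule-based system alone, >90% with an additional expert review or second (Q)SAR model) can be illustrated with a short sketch. All counts below are invented for illustration and are not taken from the surveys:

```python
def negative_predictivity(tn: int, fn: int) -> float:
    """Negative predictivity (NPV): the fraction of compounds predicted
    non-mutagenic that are truly non-mutagenic.
    tn = true negatives, fn = false negatives."""
    return tn / (tn + fn)

# Hypothetical counts for one company (illustrative only):
# rule-based expert system alone
npv_rule = negative_predictivity(tn=180, fn=20)      # 0.90

# rule-based system plus expert review: some false negatives
# are caught and reclassified, so fn drops
npv_combined = negative_predictivity(tn=180, fn=8)

assert npv_combined > npv_rule  # the combined workflow improves NPV
```

The sketch only shows how the metric behaves; the surveys computed NPV per company against Ames test outcomes, not from pooled counts like these.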
Historical data from control groups in animal toxicity studies are currently used mainly for comparative purposes, to assess the validity and robustness of study results. Because of the highly controlled environment in which the studies are performed and the homogeneity of the animal collectives, it has been proposed to use the historical data to build so-called virtual control groups, which could partly or entirely replace the concurrent control group. This would constitute a substantial contribution to the reduction of animal use in safety studies. Before the concept can be implemented, the prerequisites regarding data collection, curation, and statistical evaluation, together with a validation strategy, need to be identified to avoid any impairment of the study outcome and subsequent consequences for human risk assessment. To further assess and develop the concept of virtual control groups, the transatlantic think tank for toxicology (t⁴) sponsored a workshop with stakeholders from the pharmaceutical and chemical industry, academia, FDA, contract research organizations (CROs), and non-governmental organizations, which took place in Washington in March 2023. This report summarizes the current efforts of a European initiative to share, collect, and curate animal control data in a centralized database, the first approaches to identify optimal matching criteria between virtual controls and the treatment arms of a study, and first reflections on strategies for a qualification procedure and potential pitfalls of the concept.
Sharing legacy data from in vivo toxicity studies offers the opportunity to analyze the variability of control groups stratified by strain, age, duration of study, vehicle, and other experimental conditions. Historical animal control group data may lead to a repository that could be used to construct virtual control groups (VCGs) for toxicity studies. VCGs are an established concept in clinical trials, but the idea of replacing living beings with virtual data sets has so far not been introduced into the design of regulatory animal studies. The use of VCGs has the potential to reduce animal use by 25% by replacing the control group animals with existing randomized data sets. Prerequisites for such an approach are the availability of large and well-structured control data sets as well as thorough statistical evaluations. The foundation of data sharing has been laid within the Innovative Medicines Initiative projects eTOX and eTRANSAFE. For a proof of principle, participating companies have started to collect control group data for subacute (4-week) GLP studies with Wistar rats (the strain preferentially used in Europe) and are characterizing these data for their variability. In a second step, the control group data will be shared among the companies and cross-company variability will be investigated. In a third step, a set of studies will be analyzed to assess whether the use of VCG data would have influenced the outcome of the study compared with the real control group.
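The core operation behind a VCG — drawing a matched, randomized set of control values from a historical repository — can be sketched in a few lines. The repository contents, matching keys, and values below are all invented for illustration; a real implementation would match on the stratification criteria discussed above (strain, age, duration, vehicle, etc.) and be backed by a curated database:

```python
import random
import statistics

# Hypothetical repository: terminal body weights (g) of untreated male
# Wistar rats from 4-week studies (all values invented for illustration).
historical_controls = {
    ("Wistar", "male", "4-week"): [312, 305, 298, 320, 308, 315, 301, 297,
                                   310, 306, 318, 303, 299, 311, 307, 314],
}

def draw_virtual_control_group(strain, sex, duration, n, seed=42):
    """Draw a virtual control group of size n from historical data
    matched on strain, sex, and study duration. A fixed seed makes
    the draw reproducible for audit purposes."""
    pool = historical_controls[(strain, sex, duration)]
    rng = random.Random(seed)
    return rng.sample(pool, n)

vcg = draw_virtual_control_group("Wistar", "male", "4-week", n=10)
print(len(vcg), round(statistics.mean(vcg), 1))
```

Seeding the draw is one possible design choice for traceability; the statistical evaluation and validation strategy the abstract calls for would sit on top of such a sampling step.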
For fasiglifam (TAK875) and its metabolites, the substance-specific mechanisms of liver toxicity were studied. Metabolism studies were run to identify a putatively reactive acyl glucuronide metabolite. In vitro cytotoxicity and caspase 3/7 activation were assessed in primary human and dog hepatocytes in 2D and 3D cell culture. Involvement of the glutathione (GSH) detoxication system in mediating cytotoxicity was determined by assessing potentiation of cytotoxicity in a GSH-depleted in vitro system. In addition, potential mitochondrial liabilities of the compounds were assessed in a whole-cell mitochondrial functional assay. Fasiglifam showed moderate cytotoxicity in human primary hepatocytes in the classical 2D cytotoxicity assays and also in the complex 3D human liver microtissue (hLiMT) after short-term treatment (24 or 48 hours), with TC50 values of 56 to 68 µM (adenosine triphosphate endpoint). Long-term treatment for 14 days in the hLiMT resulted in a slight TC50 shift over time (2.7- to 3.6-fold lower vs 24-hour treatment), indicating a possibly higher risk for cytotoxicity during long-term treatment. Cellular GSH depletion and impairment of mitochondrial function by TAK875 and its metabolites, evaluated by Seahorse assay, were not found to be involved in the DILI reported for TAK875. The acyl glucuronide metabolites of TAK875 were finally identified as the dominant cause of the liver toxicity.
Abstract
Over the past decades, pharmaceutical companies have conducted a large number of high-quality in vivo repeat-dose toxicity (RDT) studies for regulatory purposes. As part of the eTOX project, a high number of these studies have been compiled and integrated into a database. This valuable resource can be queried directly, but it can be further exploited to build predictive models. As the studies were originally conducted to investigate the properties of individual compounds, the experimental conditions across the studies are highly heterogeneous. Consequently, the original data required normalization/standardization, filtering, categorization, and integration to make any data analysis (such as building predictive models) possible. Additionally, the primary objective of the RDT studies was to identify toxicological findings, most of which do not directly translate to in vivo endpoints. This article describes a method to extract datasets containing comparable toxicological properties for a series of compounds amenable to building predictive models. The proposed strategy starts with the normalization of the terms used within the original reports. Then, comparable datasets are extracted from the database by applying filters based on the experimental conditions. Finally, carefully selected profiles of toxicological findings are mapped to endpoints of interest, generating QSAR-like tables. In this work, we describe in detail the strategy and tools used for carrying out these transformations and illustrate their application in a data sample extracted from the eTOX database. The suitability of the resulting tables for developing hazard-predicting models was investigated by building proof-of-concept models for in vivo liver endpoints.
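The three-step strategy described in this abstract (normalize terms, filter by experimental conditions, map finding profiles to endpoints) can be sketched on toy data. The records, vocabulary, and endpoint mapping below are invented for illustration and are not the eTOX schema or tools:

```python
# Hypothetical records mimicking repeat-dose-study findings (values invented).
raw_findings = [
    {"compound": "CPD-1", "species": "rat", "duration_weeks": 4,
     "finding": "Hepatocellular hypertrophy"},
    {"compound": "CPD-1", "species": "rat", "duration_weeks": 4,
     "finding": "increased ALT"},
    {"compound": "CPD-2", "species": "rat", "duration_weeks": 4,
     "finding": "no abnormalities"},
    {"compound": "CPD-3", "species": "dog", "duration_weeks": 13,
     "finding": "Hepatocellular hypertrophy"},
]

# Step 1: normalize free-text terms to a controlled vocabulary.
TERM_MAP = {
    "hepatocellular hypertrophy": "liver_hypertrophy",
    "increased alt": "alt_increase",
    "no abnormalities": None,
}

# Step 3 (mapping): the profile of findings counted as a liver endpoint.
LIVER_FINDINGS = {"liver_hypertrophy", "alt_increase"}

def build_qsar_table(records, species="rat", duration_weeks=4):
    """Step 2 + 3: filter to comparable experimental conditions, then map
    each compound's finding profile to a binary endpoint (QSAR-like row)."""
    table = {}
    for rec in records:
        if rec["species"] != species or rec["duration_weeks"] != duration_weeks:
            continue  # step 2: keep only comparable study conditions
        term = TERM_MAP.get(rec["finding"].strip().lower())
        hit = term in LIVER_FINDINGS
        table[rec["compound"]] = table.get(rec["compound"], 0) | int(hit)
    return table  # compound -> 1 (liver finding) / 0 (none)

print(build_qsar_table(raw_findings))  # → {'CPD-1': 1, 'CPD-2': 0}
```

CPD-3 is dropped because its conditions (dog, 13 weeks) are not comparable to the selected stratum — the same kind of filtering the article applies before any modeling.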
Abstract
(Quantitative) structure–activity relationship or (Q)SAR predictions of DNA-reactive mutagenicity are important to support both the design of new chemicals and the assessment of impurities, degradants, metabolites, extractables and leachables, as well as existing chemicals. Aromatic N-oxides represent a class of compounds that are often considered alerting for mutagenicity, yet the scientific rationale for this structural alert is not clear and has been questioned. Because aromatic N-oxide-containing compounds may be encountered as impurities, degradants and metabolites, it is important to accurately predict the mutagenicity of this chemical class. This article analysed a series of publicly available aromatic N-oxide data in search of supporting information. The article also used a previously developed structure–activity relationship (SAR) fingerprint methodology in which a series of aromatic N-oxide substructures was generated and matched against public and proprietary databases, including pharmaceutical data. An assessment of the number of mutagenic and non-mutagenic compounds matching each substructure across all sources was used to understand whether the general class or any specific subclasses appear to lead to mutagenicity. This analysis resulted in a downgrade of the general aromatic N-oxide alert. However, it was determined that there were enough public and proprietary data to assign the quindioxin and related chemicals as well as benzo[c][1,2,5]oxadiazole 1-oxide subclasses as alerts. The overall results of this analysis were incorporated into Leadscope’s expert-rule-based model to enhance its predictive accuracy.
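The counting step of the SAR fingerprint methodology — tallying mutagenic and non-mutagenic matches per substructure to decide whether a class or only a subclass alerts — can be sketched as follows. The match flags and Ames results below are invented for illustration; real substructure matching would require a cheminformatics toolkit and the actual databases:

```python
# Hypothetical substructure-match data (all values invented): each record
# lists the N-oxide substructure classes a compound matches and its Ames result.
compounds = [
    {"id": "C1", "classes": {"aromatic_N_oxide"}, "mutagenic": False},
    {"id": "C2", "classes": {"aromatic_N_oxide"}, "mutagenic": False},
    {"id": "C3", "classes": {"aromatic_N_oxide", "quindioxin_like"}, "mutagenic": True},
    {"id": "C4", "classes": {"aromatic_N_oxide", "quindioxin_like"}, "mutagenic": True},
    {"id": "C5", "classes": {"aromatic_N_oxide"}, "mutagenic": False},
]

def alert_statistics(records):
    """Tally (mutagenic, non-mutagenic) match counts per substructure class."""
    stats = {}
    for rec in records:
        for cls in rec["classes"]:
            pos, neg = stats.get(cls, (0, 0))
            if rec["mutagenic"]:
                pos += 1
            else:
                neg += 1
            stats[cls] = (pos, neg)
    return stats

stats = alert_statistics(compounds)
# With these invented data, the general class matches mostly non-mutagens,
# while the quindioxin-like subclass matches only mutagens — the pattern
# that motivates downgrading the general alert but keeping the subclass alert.
```

This is only the bookkeeping; the article's conclusions rest on the real public and proprietary match counts, not on a toy tally like this.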