The starting point of successful hazard assessment is the generation of unbiased and trustworthy data. Conventional toxicity testing deals with extensive observations of phenotypic endpoints in vivo and complementary in vitro models. The increasing development of novel materials and chemical compounds dictates the need for a better understanding of the molecular changes occurring in exposed biological systems. Transcriptomics enables the exploration of organisms' responses to environmental, chemical, and physical agents by observing the molecular alterations in more detail. Toxicogenomics (TGx) integrates classical toxicology with omics assays, thus allowing the characterization of the mechanism of action (MOA) of chemical compounds, novel small molecules, and engineered nanomaterials (ENMs). Lack of standardization in data generation and analysis currently hampers the full exploitation of toxicogenomics-based evidence in risk assessment. To fill this gap, TGx methods need to take into account appropriate experimental design and possible pitfalls in the transcriptomic analyses, as well as data generation and sharing that adhere to the FAIR (Findable, Accessible, Interoperable, and Reusable) principles. In this review, we summarize the recent advancements in the design and analysis of DNA microarray, RNA sequencing (RNA-Seq), and single-cell RNA-Seq (scRNA-Seq) data. We provide guidelines on exposure time, dose, and complex endpoint selection, sample quality considerations, and sample randomization. Furthermore, we summarize publicly available data resources and highlight applications of TGx data to understand and predict chemical toxicity potential. Additionally, we discuss the efforts to implement TGx into regulatory decision making to promote alternative methods for risk assessment and to support the 3R (reduction, refinement, and replacement) concept. This review is the first part of a three-article series on Transcriptomics in Toxicogenomics.
These initial considerations on experimental design, technologies, publicly available data, and regulatory aspects are the starting point for the rigorous and reliable data preprocessing and modelling described in the second and third parts of the review series.
Transcriptomics data are relevant to address a number of challenges in Toxicogenomics (TGx). After careful planning of exposure conditions and data preprocessing, the TGx data can be used in predictive toxicology, where more advanced modelling techniques are applied. The large volume of molecular profiles produced by omics-based technologies allows the development and application of artificial intelligence (AI) methods in TGx. Indeed, the publicly available omics datasets are constantly increasing, together with a plethora of different methods that are made available to facilitate their analysis, interpretation, and the generation of accurate and stable predictive models. In this review, we present the state of the art of data modelling applied to transcriptomics data in TGx. We show how benchmark dose (BMD) analysis can be applied to TGx data. We review read-across and adverse outcome pathway (AOP) modelling methodologies. We discuss how network-based approaches can be successfully employed to clarify the mechanism of action (MOA) or specific biomarkers of exposure. We also describe the main AI methodologies applied to TGx data to create predictive classification and regression models, and we address current challenges. Finally, we present a short description of deep learning (DL) and data integration methodologies applied in these contexts. Modelling of TGx data represents a valuable tool for more accurate chemical safety assessment. This review is the third part of a three-article series on Transcriptomics in Toxicogenomics.
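The benchmark dose (BMD) analysis mentioned above fits a parametric dose-response model to expression data and reports the dose at which the response departs from the control level by a fixed benchmark response (BMR). A minimal sketch of the idea, using an exponential model, a 10% BMR, and entirely hypothetical fold-change data (real TGx workflows fit several candidate models per gene and select by goodness of fit):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical fold-change responses of one gene at five doses;
# the data values and the exponential model choice are illustrative only.
doses = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
response = np.array([1.00, 1.08, 1.22, 1.52, 2.35])

def exp_model(dose, a, b):
    """Simple exponential dose-response model: a * exp(b * dose)."""
    return a * np.exp(b * dose)

# Fit the model parameters to the observed responses
(a, b), _ = curve_fit(exp_model, doses, response, p0=(1.0, 0.1))

bmr = 0.10  # benchmark response: 10% change relative to the modeled control
# Solve a * exp(b * BMD) = a * (1 + bmr) analytically for the benchmark dose
bmd = np.log(1.0 + bmr) / b
```

In practice, dedicated tools also report the BMD's confidence bounds (BMDL/BMDU), which are what regulatory applications typically use as points of departure.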
This paper outlines the work for which Roland Grafström and Pekka Kohonen were awarded the 2014 Lush Science Prize. The research activities of the Grafström laboratory have, for many years, covered cancer biology studies, as well as the development and application of toxicity-predictive in vitro models to determine chemical safety. Through the integration of in silico analyses of diverse types of genomics data (transcriptomic and proteomic), their efforts have proved to fit well into the recently developed Adverse Outcome Pathway paradigm. Genomics analysis within state-of-the-art cancer biology research and Toxicology in the 21st Century concepts share many technological tools. A key category within the Three Rs paradigm is the Replacement of animals in toxicity testing with alternative methods, such as bioinformatics-driven analyses of data obtained from human cell cultures exposed to diverse toxicants. This work was recently expanded within the pan-European SEURAT-1 project (Safety Evaluation Ultimately Replacing Animal Testing), to replace repeat-dose toxicity testing with data-rich analyses of sophisticated cell culture models. The aims and objectives of the SEURAT project have been to guide the application, analysis, interpretation and storage of 'omics' technology-derived data within the service-oriented sub-project, ToxBank. Particularly addressing the Lush Science Prize focus on the relevance of toxicity pathways, a 'data warehouse' that is under continuous expansion, coupled with the development of novel data storage and management methods for toxicology, serves to address data integration across multiple 'omics' technologies. The prize winners' guiding principles and concepts for modern knowledge management of toxicological data are summarised. The translation of basic discovery results ranged from chemical-testing and material-testing data to information relevant to human health and environmental safety.
Preprocessing of transcriptomics data plays a pivotal role in the development of toxicogenomics-driven tools for chemical toxicity assessment. The generation and exploitation of large volumes of molecular profiles, following an appropriate experimental design, allows the employment of toxicogenomics (TGx) approaches for a thorough characterisation of the mechanism of action (MOA) of different compounds. To date, a plethora of data preprocessing methodologies have been suggested. However, in most cases, building the optimal analytical workflow is not straightforward. A careful selection of the right tools must be carried out, since it will affect the downstream analyses and modelling approaches. Transcriptomics data preprocessing spans multiple steps, such as quality check, filtering, normalization, and batch effect detection and correction. Currently, there is a lack of standard guidelines for data preprocessing in the TGx field. Defining the optimal tools and procedures to be employed in transcriptomics data preprocessing will lead to the generation of homogeneous and unbiased data, allowing the development of more reliable, robust and accurate predictive models. In this review, we outline methods for the preprocessing of three main transcriptomic technologies: microarray, bulk RNA-Sequencing (RNA-Seq), and single-cell RNA-Sequencing (scRNA-Seq). Moreover, we discuss the most common methods for the identification of differentially expressed genes and for functional enrichment analysis. This review is the second part of a three-article series on Transcriptomics in Toxicogenomics.
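Among the preprocessing steps listed above, normalization is the one most often illustrated concretely. A minimal sketch of quantile normalization for a genes-by-samples expression matrix, using a toy matrix with made-up values (production pipelines use dedicated packages, and this simple version breaks ties by array order rather than averaging tied ranks):

```python
import numpy as np

def quantile_normalize(expr):
    """Quantile-normalize a genes x samples matrix so that every sample
    (column) shares the same value distribution: the mean of the
    per-sample sorted values. Ties are broken by array order here."""
    # Rank of each value within its column (0 = smallest)
    ranks = np.argsort(np.argsort(expr, axis=0), axis=0)
    # Reference distribution: average of the sorted columns
    reference = np.sort(expr, axis=0).mean(axis=1)
    # Map each value's rank back to the reference distribution
    return reference[ranks]

# Toy matrix: 4 genes (rows) x 3 samples (columns); values are illustrative
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 2.0, 6.0],
                 [4.0, 3.0, 8.0]])
normed = quantile_normalize(expr)
```

After normalization, the sorted values of every column are identical, removing sample-wise distributional differences while preserving each gene's rank within its sample.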
Formaldehyde dehydrogenase, formally Class III alcohol dehydrogenase (ADH3), has recently been discovered to partially regulate nitrosothiol homeostasis by catalyzing the reduction of the endogenous nitrosylating agent S-nitrosoglutathione (GSNO). Several studies have implicated this enzyme, and in particular GSNO reduction, as playing an important role in conditions such as asthma, cardiovascular disease, and immune function. While ADH3 has received considerable attention in the biomedical literature, where it is often referred to as GSNO reductase (GSNOR), ADH3-mediated GSNO reduction has received comparatively less attention in the environmental toxicology community. Herein, evidence for a role of ADH3 in cell signaling through thiol homeostasis is highlighted, underscoring that the enzyme functions more broadly than to metabolize formaldehyde.
The minimum information requirements needed to guarantee high-quality surface analysis data of nanomaterials are described, with the aim to provide reliable and traceable information about size, shape, elemental composition and surface chemistry for risk assessment approaches. The widespread surface analysis methods scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS) and secondary ion mass spectrometry (SIMS) were considered. The complete analysis sequence, from sample preparation, through measurement, to data analysis and data formats for reporting and archiving, is outlined. All selected methods have been used in surface analysis for many years, so that many aspects of the analysis (including (meta)data formats) are already standardized. As a practical analysis use case, two coated TiO2 reference nanoparticulate samples, which are available from the Joint Research Centre (JRC) repository, were selected. The added value of the complementary analysis is highlighted based on the minimum information requirements, which are well defined for the analysis methods selected. The present paper is intended to serve primarily as a source of understanding of the high standardization level already available for high-quality data in surface analysis of nanomaterials as reliable input for the nanosafety community.
A future-proof multifaceted framework and its elements ('creating SIA awareness', 'developing SIA methodology', 'bringing the Trusted Environment (TE) concept to an operational level', and 'developing new business and governance models') needed for the implementation of SIA.
• The Safe Innovation Approach (SIA) is a framework outlining the necessary elements needed to achieve safe(r) nanomaterials and nano-enabled products.
• SIA combines the safe-by-design (SbD) and regulatory preparedness (RP) concepts, as both industry/innovators and regulators need to be proactive and vigilant.
• SbD recommends that industry integrate safety considerations as early as possible in the innovation process.
• The SIA framework is an agile and robust risk assessment system for nanomaterials and nano-enabled products that is currently being brought to practice internationally.
Nanotechnologies are characterized by a growing legacy of already marketed and novel manufactured nanomaterials (MNMs) and nano-enabled products, with a lack of a coherent risk governance system to address their safety effectively. In response to this situation, a proactive system is needed to minimize the gap between the pace of innovation and the pace of developing nano-specific risk governance. With the Safe Innovation Approach (SIA), we seek to enhance the ability of all stakeholders to address the safety assessment of innovations in a robust yet agile manner. The SIA is an approach that combines a) the Safe-by-Design (SbD) concept, which recommends that industry integrate safety considerations as early as possible into the innovation process, and b) the Regulatory Preparedness (RP) concept, which aims to improve regulators' ability to anticipate innovation so that they can facilitate the development of adaptable (safety) regulation that keeps up with the pace of knowledge generation and innovation of MNMs and MNM-enabled products. SIA promotes a safe and responsible approach for industry when developing innovative products and materials, and stimulates a proactive attitude amongst policymakers and regulators to minimize the time gap between the appearance and approval of an innovation and appropriate legislation. Here we introduce a SIA framework consisting of creating SIA awareness, developing a SIA methodology (SbD scenarios, an SbD methodology including information needs, functionality, and grouping, the SIA Toolbox and a nano-specific database), bringing the Trusted Environment and RP concepts to an operational level, and developing novel business models for industry and novel governance models for regulators. The SIA framework, once implemented, will result in a system for MNMs and nano-enabled products that is agile and robust. Current international efforts, such as those within the OECD, are now trying to bring this concept to practice.
Occupational exposure to formaldehyde has been linked to nasopharyngeal carcinoma. To date, mechanistic explanations for this association have primarily focused on formaldehyde-induced cytotoxicity, regenerative hyperplasia and DNA damage. However, recent studies broaden the potential mechanisms, as it is now well established that formaldehyde dehydrogenase, identical to S-nitrosoglutathione reductase, is an important mediator of cGMP-independent nitric oxide signaling pathways. We have previously described mechanisms by which formaldehyde can influence nitrosothiol homeostasis, thereby leading to changes in pulmonary physiology. Considering evidence that nitrosothiols govern the Epstein-Barr virus infection cycle, and that the virus is strongly implicated in the etiology of nasopharyngeal carcinoma, studies are needed to examine the potential for formaldehyde to reactivate the Epstein-Barr virus, as well as to additively or synergistically interact with the virus to potentiate epithelial cell transformation.