
    Hepatic microRNA profiles offer predictive and mechanistic insights after exposure to genotoxic and epigenetic hepatocarcinogens.

    In recent years, accumulating evidence supports the importance of microRNAs in liver physiology and disease; however, few studies have examined the involvement of these noncoding genes in chemical hepatocarcinogenesis. Here, we examined the liver microRNA profile of male Fischer rats exposed through their diet to genotoxic (2-acetylaminofluorene) and epigenetic (phenobarbital, diethylhexylphthalate, methapyrilene HCl, monuron, and chlorendic acid) chemical hepatocarcinogens, as well as to non-hepatocarcinogenic treatments (benzophenone and diethylthiourea) for 3 months. The effects of these treatments on liver pathology, plasma clinical parameters, and liver mRNAs were also determined. All hepatocarcinogens affected the expression of liver mRNAs, while the hepatic microRNA profiles were associated with the mode of action of the chemical treatments and corresponded to chemical carcinogenicity. The three nuclear receptor-activating chemicals (phenobarbital, benzophenone, and diethylhexylphthalate) were characterized by the highly correlated induction of the miR-200a/200b/429 cluster, which is involved in protecting the epithelial status of cells, and of the miR-96/182 cluster. The four non-nuclear receptor-activating hepatocarcinogens were characterized by the early, persistent induction of miR-34, which was associated with DNA damage and oxidative stress in vivo and in vitro. Repression of this microRNA in a hepatoma cell line led to increased cell growth; thus, miR-34a could act to block abnormal cell proliferation in cells exposed to DNA damage or oxidative stress. This study supports the proposal that hepatic microRNA profiles could assist in the earlier evaluation and identification of hepatocarcinogens, especially those acting by epigenetic mechanisms. © The Author 2012. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved.

    Toxicity Pathways – from concepts to application in chemical safety assessment

    Few would deny that the NRC report (NRC, 2007), "Toxicity Testing in the 21st Century: A Vision and Strategy", represented a re-orientation of thinking surrounding the risk assessment of environmental chemicals. The key take-home message was that by understanding Toxicity Pathways (TP) we could profile the potential hazard and assess risks to humans and the environment using intelligent combinations of computational and in vitro methods. In theory at least, shifting to this new paradigm promises more efficient, comprehensive and cost-effective testing strategies for every chemical in commerce while minimising the use of animals. For those of us who embrace the vision and the strategy proposed to achieve it, attention has increasingly focused on how we can actually practice what we preach. For a start, 21st century concepts described in the report have to be carefully interpreted and then translated into processes that essentially define and operationalize a TP framework for chemical risk assessment. In September 2011 the European Commission's Joint Research Centre (JRC) and the Hamner Institutes for Health Sciences co-organised a "Toxicity Pathways" workshop. It was hosted by the JRC and took place in Ispra, Italy. There were 23 invited participants with more or less equal representation from Europe and North America. The purpose of the meeting was to address three key questions surrounding a TP-based approach to chemical risk assessment, namely: What constitutes a TP? How can we use TPs to develop in vitro assays and testing strategies? And how can the results from TP testing be used in human health risk assessments? The meeting ran over two days and comprised a series of thought-starter presentations, breakout sessions and plenty of group discussion. The outcome was captured by rapporteurs and compiled as a workshop report which is available for download (without charge) from the JRC website. Here we expand on selected deliberations of the workshop to illustrate how TP thinking is still evolving and to indicate what pieces of the puzzle still need to fall into place before TP-based risk assessment can become a reality.

    Collaborative development of predictive toxicology applications

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework, including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals.
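To make the (Q)SAR models such a framework manages more concrete, here is a minimal one-descriptor sketch: an ordinary least-squares fit of an activity endpoint against a chemical descriptor. This is a generic illustration only; the function names and data are invented for this sketch and are not part of the OpenTox API, and real (Q)SAR models are multivariate and validated under the OECD principles mentioned above.

```python
def fit_qsar(descriptors, activities):
    """Ordinary least-squares fit of activity = a * descriptor + b.

    A deliberately minimal, one-descriptor toy QSAR; frameworks like
    OpenTox manage far richer, validated models.
    """
    n = len(descriptors)
    mean_x = sum(descriptors) / n
    mean_y = sum(activities) / n
    sxx = sum((x - mean_x) ** 2 for x in descriptors)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(descriptors, activities))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict(model, descriptor_value):
    """Predict an activity value from a fitted (slope, intercept) model."""
    slope, intercept = model
    return slope * descriptor_value + intercept
```

A fitted model of this shape can then be exposed behind a prediction service, which is essentially the role the Framework's model resources play.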

    Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data

    Background: Availability of chemical response-specific lists of genes (gene sets) for pharmacological and/or toxic effect prediction for compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set analysis (GSA) methods for chemical treatment identification, for pharmacological mechanism elucidation, and for comparing compound toxicity profiles. Methods: We created 30,211 chemical response-specific gene sets for human and mouse by next-gen TM, and derived 1,189 (human) and 588 (mouse) gene sets from the Comparative Toxicogenomics Database (CTD). We tested for significant differential expression (SDE) (false discovery rate-corrected p-values < 0.05) of the next-gen TM-derived gene sets and the CTD-derived gene sets in gene expression (GE) data sets of five chemicals (from experimental models). We tested for SDE of gene sets for six fibrates in a peroxisome proliferator-activated receptor alpha (PPARA) knock-out GE dataset and compared the results to those from the Connectivity Map. We tested for SDE of 319 next-gen TM-derived gene sets for environmental toxicants in three GE data sets of triazoles, and tested for SDE of 442 gene sets associated with embryonic structures. We compared the gene sets to triazole effects seen in the Whole Embryo Culture (WEC), and used principal component analysis (PCA) to discriminate triazoles from other chemicals. Results: Next-gen TM-derived gene sets matching the chemical treatment were significantly altered in three GE data sets, and the corresponding CTD-derived gene sets were significantly altered in five GE data sets. Six next-gen TM-derived and four CTD-derived fibrate gene sets were significantly altered in the PPARA knock-out GE dataset. None of the fibrate signatures in the Connectivity Map scored significantly against the PPARA GE signature. Thirty-three environmental toxicant gene sets were significantly altered in the triazole GE data sets. Twenty-one of these toxicants had a toxicity pattern similar to that of the triazoles. We confirmed embryotoxic effects, and discriminated triazoles from other chemicals. Conclusions: Gene set analysis with next-gen TM-derived chemical response-specific gene sets is a scalable method for identifying similarities in gene responses to other chemicals, from which one may infer potential mode of action and/or toxic effect.
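As a rough illustration of the gene set analysis step described above, the sketch below scores each gene set by a hypergeometric over-representation test and applies Benjamini–Hochberg FDR correction to obtain the kind of corrected p-values the study thresholds at 0.05. This is a generic GSA recipe under simple assumptions, not the study's actual pipeline, and the gene names used in examples are invented placeholders.

```python
from math import comb

def hypergeom_sf(k, universe, set_size, n_diff):
    """P(X >= k): chance that at least k of n_diff differentially
    expressed genes fall in a set of set_size genes, drawn from a
    universe-sized genome, under the null of no enrichment."""
    total = comb(universe, n_diff)
    return sum(comb(set_size, x) * comb(universe - set_size, n_diff - x)
               for x in range(k, min(set_size, n_diff) + 1)) / total

def gsa_enrichment(diff_genes, gene_sets, universe_size):
    """Score each gene set; return {name: (p_value, fdr_q_value)}."""
    diff = set(diff_genes)
    names, pvals = [], []
    for name, members in gene_sets.items():
        overlap = len(diff & set(members))
        names.append(name)
        pvals.append(hypergeom_sf(overlap, universe_size,
                                  len(set(members)), len(diff)))
    # Benjamini-Hochberg: q_i = p_i * m / rank, made monotone from the top.
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    qvals = [0.0] * m
    running_min = 1.0
    for idx_from_end, i in enumerate(reversed(order)):
        rank = m - idx_from_end
        running_min = min(running_min, pvals[i] * m / rank)
        qvals[i] = running_min
    return {n: (p, q) for n, p, q in zip(names, pvals, qvals)}
```

A set would then be called significantly altered when its corrected q-value falls below 0.05, mirroring the threshold used in the abstract.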

    A mode-of-action ontology model for safety evaluation of chemicals: outcome of a series of workshops on repeated dose toxicity

    Repeated dose toxicity evaluation aims at assessing the occurrence of adverse effects following chronic or repeated exposure to chemicals. Non-animal approaches have gained importance in the last decades because of ethical considerations as well as for scientific reasons calling for more human-based strategies. A critical aspect of this challenge is the capacity to cover a comprehensive set of interdependent mechanisms of action, link them to adverse effects, and interpret their probability of being triggered in the light of the exposure at the (sub)cellular level. Owing to its structured nature, an ontology addressing repeated dose toxicity could be a scientific and transparent way to achieve this goal. Additionally, repeated dose toxicity evaluation through the use of a harmonized ontology should be performed in a reproducible and consistent manner, while mimicking human physiology and adaptivity as accurately as possible. In this paper, the outcome of a series of workshops organized by Cosmetics Europe on this topic is reported. As such, this manuscript shows how experts defined the critical elements and ways of establishing a mode-of-action ontology model to support risk assessors aiming to perform animal-free safety evaluation of chemicals based on repeated dose toxicity data.
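A minimal sketch of what such a structured representation buys a risk assessor: mechanistic events and adverse effects become nodes in a directed graph, and "link them to adverse effects" becomes a reachability query. The event names below are invented examples for illustration, not entries from the Cosmetics Europe ontology.

```python
from collections import defaultdict

# Illustrative mode-of-action edges (upstream event -> downstream effect).
# Node names are hypothetical examples, not ontology entries.
EDGES = [
    ("CYP induction", "oxidative stress"),
    ("oxidative stress", "hepatocellular injury"),
    ("hepatocellular injury", "liver fibrosis"),
    ("PPAR-alpha activation", "peroxisome proliferation"),
    ("peroxisome proliferation", "hepatocellular injury"),
]

def downstream_effects(start, edges=EDGES):
    """All effects reachable from a mechanistic event (depth-first search)."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

In a real ontology the edges would additionally carry probability and exposure context, so the query returns not just reachable adverse effects but how plausible each is under a given exposure.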

    The usefulness of integrated strategy approaches in replacing animal experimentation

    It is currently well accepted that, in general, more than one method is necessary to allow the full replacement of an animal experiment. These so-called partial replacement methods can be used within integrated strategy approaches that combine different methods and information sources. A number of integrated strategy approaches were implemented in recent years in different areas of safety and regulatory toxicology. Moreover, the latest advances in biomedical research and bioengineering provide a major opportunity to make use of in vitro human-based and/or three-dimensional complex models that can contribute to achieving more physiologically relevant models. Examples herein describe currently existing integrated strategy frameworks aiming at full or partial replacement purposes and/or at gaining mechanistic insights. Furthermore, a general concept is provided on how 3R methods might be integrated in a strategy approach in order to ensure that animal experimentation is conducted only as a last resort.