    t4 Workshop Report: Integrated Testing Strategies (ITS) for Safety Assessment

    Integrated testing strategies (ITS), as opposed to single definitive tests or fixed batteries of tests, are expected to combine different information sources efficiently and in a quantifiable fashion to satisfy an information need, in this case for regulatory safety assessments. With increasing awareness of the limitations of each individual tool and the development of highly targeted tests and predictions, the need to combine pieces of evidence grows. The discussions at this workshop, which brought together a group of experts from different related areas, illustrate the current state of the art of ITS, as well as promising developments and identifiable challenges. The case of skin sensitization was taken as an example of how an ITS can be constructed, optimized, and validated. This will require embracing and developing new concepts such as adverse outcome pathways (AOP), advanced statistical learning algorithms and machine learning, mechanistic validation, and “Good ITS Practices”.
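    As a concrete illustration of combining information sources "in a quantifiable fashion", the sketch below applies sequential Bayesian updating to hypothetical test results. The report does not prescribe this particular scheme; the assay names, sensitivities, specificities, and prior are placeholders, and the calculation assumes the tests are conditionally independent, which a real ITS would have to verify.

```python
# Illustrative sketch (not from the workshop report): folding several
# test results into one probability of a chemical being a sensitizer
# via sequential Bayesian updating. All numbers are hypothetical.

def update_posterior(prior: float, positive: bool,
                     sensitivity: float, specificity: float) -> float:
    """Update P(toxic) after observing one test result via Bayes' rule."""
    if positive:
        likelihood_toxic = sensitivity          # P(+ | toxic)
        likelihood_clean = 1.0 - specificity    # P(+ | not toxic)
    else:
        likelihood_toxic = 1.0 - sensitivity    # P(- | toxic)
        likelihood_clean = specificity          # P(- | not toxic)
    numerator = likelihood_toxic * prior
    return numerator / (numerator + likelihood_clean * (1.0 - prior))

# Three hypothetical assays in a skin-sensitization ITS, applied in sequence.
tests = [  # (result, sensitivity, specificity)
    (True,  0.80, 0.70),   # in chemico assay: positive
    (True,  0.75, 0.80),   # in vitro assay: positive
    (False, 0.85, 0.60),   # in silico prediction: negative
]

p = 0.30  # hypothetical prior probability of sensitization
for positive, sens, spec in tests:
    p = update_posterior(p, positive, sens, spec)
    print(f"posterior after test: {p:.2f}")
```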

    Physiological maps and chemical-induced disease ontologies: tools to support NAMs development for next-generation risk assessment

    Physiological maps (PMs) can be defined as graphical representations of the cellular and molecular processes associated with specific organ functions (Vinken et al. 2021). Within the ONTOX project, we designed a total of six PMs describing physiological processes in the liver, the kidney, and the brain. These PMs are then used as a tool to assess relevant mechanistic coverage and the linkage between a specific organ function and a toxicological endpoint. Based on the Disease Maps project pipeline (Mazein et al. 2018), we developed the first version of six PMs describing the following physiological processes: bile secretion and lipid metabolism (liver), vitamin D metabolism and urine composition (kidney), and neural tube closure (updating the work of Heusinkveld et al. 2021) and brain development (brain). Our workflow included: (i) data collection from expert-curated literature, (ii) identification of the relevant biological mechanisms, (iii) screening of online databases (e.g., WikiPathways, Reactome, and KEGG) for previously described pathways, (iv) manual curation and integration of the data into a PM using CellDesigner, and (v) visualization on the MINERVA platform (Hoksza et al. 2019). These qualitative PMs represent an important tool for exploring curated literature, analyzing networks, and benchmarking the development of new adverse outcome pathways (AOPs). They also provide the basis for developing quantitative disease ontologies that integrate different layers of pathological and toxicological information, chemical information (drug-induced pathways), and kinetic data. The resulting chemical-induced disease ontologies will provide a multi-layered platform for integrating and visualizing such information. The ontologies will help to improve understanding of organ- and disease-related pathways in response to chemicals, visualize omics datasets, develop quantitative methods for computational disease modeling and toxicity prediction, set up an in vitro and in silico test battery to detect specific types of toxicity, and develop new animal-free approaches for next-generation risk assessment.
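    The abstract describes PMs as graphical networks of processes rendered in CellDesigner and browsed on MINERVA. The minimal sketch below shows the kind of directed-graph structure involved, using networkx; the node names are invented placeholders, not curated ONTOX content.

```python
# Minimal sketch, not the ONTOX pipeline itself: a slice of a
# physiological map as a directed graph of processes, the kind of
# structure that CellDesigner/MINERVA render graphically.
import networkx as nx

pm = nx.DiGraph(organ="liver", process="bile secretion")
# edges read "upstream process/entity -> downstream process/entity"
pm.add_edge("cholesterol uptake", "bile acid synthesis")
pm.add_edge("bile acid synthesis", "canalicular bile acid export")
pm.add_edge("canalicular bile acid export", "bile flow")

# Simple network analysis: which steps connect a molecular event
# to the organ-level function?
print(nx.shortest_path(pm, "cholesterol uptake", "bile flow"))
```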

    CATMoS: Collaborative Acute Toxicity Modeling Suite.

    BACKGROUND: Humans are exposed to tens of thousands of chemical substances that need to be assessed for their potential toxicity. Acute systemic toxicity testing serves as the basis for regulatory hazard classification, labeling, and risk management. However, it is cost- and time-prohibitive to evaluate all new and existing chemicals using traditional rodent acute toxicity tests. In silico models built using existing data facilitate rapid acute toxicity predictions without using animals. OBJECTIVES: The U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) Acute Toxicity Workgroup organized an international collaboration to develop in silico models for predicting acute oral toxicity based on five different end points: the Lethal Dose 50 (LD50) value, the four U.S. Environmental Protection Agency hazard categories, the five Globally Harmonized System of Classification and Labelling hazard categories, very toxic chemicals (LD50 ≤ 50 mg/kg), and nontoxic chemicals (LD50 > 2,000 mg/kg). METHODS: An acute oral toxicity data inventory for 11,992 chemicals was compiled, split into training and evaluation sets, and made available to 35 participating international research groups, which submitted a total of 139 predictive models. Predictions that fell within the applicability domains of the submitted models were evaluated using external validation sets and then combined into consensus models to leverage the strengths of the individual approaches. RESULTS: The resulting consensus predictions, which leverage the collective strengths of each individual model, form the Collaborative Acute Toxicity Modeling Suite (CATMoS). CATMoS demonstrated high accuracy and robustness when compared with in vivo results. DISCUSSION: CATMoS is being evaluated by regulatory agencies for its utility and applicability as a potential replacement for in vivo rat acute oral toxicity studies. CATMoS predictions for more than 800,000 chemicals have been made available via the National Toxicology Program's Integrated Chemical Environment tools and data sets (ice.ntp.niehs.nih.gov). The models are also implemented in a free, standalone, open-source tool, OPERA, which allows predictions to be made for new and untested chemicals. https://doi.org/10.1289/EHP8495
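    A minimal sketch of the consensus idea described above: average LD50 predictions only from models whose applicability domain covers the query chemical. The model objects and values here are toy stand-ins, not CATMoS components; the real suite is distributed via OPERA and the ICE tools.

```python
# Hedged sketch of applicability-domain-aware consensus prediction.
from statistics import mean

def consensus_ld50(chemical, models):
    """Each model is a (predict_fn, in_domain_fn) pair; combine only
    in-domain predictions into a consensus value."""
    in_domain = [predict(chemical)
                 for predict, applicable in models
                 if applicable(chemical)]
    if not in_domain:
        return None  # no model can speak to this chemical
    return mean(in_domain)

# Toy stand-ins for three submitted models (predicted LD50 in mg/kg):
models = [
    (lambda c: 510.0, lambda c: True),
    (lambda c: 450.0, lambda c: True),
    (lambda c: 900.0, lambda c: False),  # chemical outside this model's domain
]
print(consensus_ld50("example-chemical", models))  # -> 480.0
```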

    Computational Approaches to Chemical Hazard Assessment

    Ancient humans memorized which berries induced nausea and avoided them. We recorded chemical poisons in the Ebers Papyrus as far back as 1500 BC. Modern humans have moved their recordings from papyrus to databases and learned a few tricks along the way. Chemical databases govern how we transport, eat, work with, and interact with chemicals every day. The attached publications discuss some of the current challenges and opportunities facing the intersection of information science and chemistry: cheminformatics. This thesis presents seven publications grouped into four high-level categories: 1. Computational approaches to chemical hazard assessment: a discussion of the packages, statistics, features, and developments in computational toxicology. 2. Dose-response modeling: an evaluation of hidden Markov models used with random forests to address dose-response data. 3. Evaluation of regulatory data from REACH: the bulk of this thesis comprises four publications on organizing regulatory documents into a machine-readable database, which was then used to demonstrate many approaches to chemical health hazard modeling. 4. Environmental health consequences: a discussion of the promise and pitfalls of computational toxicology for risk assessment, particularly in a regulatory context. The first publication below is an introduction to and discussion of the major topics in computational toxicology.
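    The thesis pairs hidden Markov models with random forests for its dose-response work; as a simpler, generic illustration of dose-response modeling (explicitly not the thesis's method), the sketch below fits a four-parameter log-logistic (Hill) curve to synthetic data with SciPy.

```python
# Generic dose-response illustration: fit a four-parameter
# log-logistic (Hill) curve to synthetic data. Not the HMM/random-
# forest approach evaluated in the thesis.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, slope):
    """Four-parameter log-logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** slope)

doses = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
responses = np.array([2.0, 5.0, 15.0, 40.0, 70.0, 88.0, 95.0])  # synthetic

params, _ = curve_fit(hill, doses, responses,
                      p0=[0.0, 100.0, 5.0, 1.0])  # initial guesses
bottom, top, ec50, slope = params
print(f"estimated EC50 ~ {ec50:.1f}")
```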

    Computational approaches to chemical hazard assessment

    Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables the integration of chemical structure, toxicogenomics, simulated and physical data, and other toxicological information in the prediction of chemical health hazards. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration, Evaluation, Authorisation and Restriction of Chemicals). These publications explored potential use cases for regulatory data and some models for exploiting these data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses the different kinds of targets modeled in computational toxicology, and ends with a high-level perspective on the algorithms used to create computational toxicology models.
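    One topic the article covers is the derivation of descriptive features for chemicals. A minimal sketch, assuming RDKit is available (the article itself is not tied to any particular library): computing a binary Morgan fingerprint that can serve as a feature vector for downstream models.

```python
# Sketch of one common way to derive chemical features: binary Morgan
# (circular, ECFP-like) fingerprints via RDKit.
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = "CCO"  # ethanol, as a stand-in for a registered substance
mol = Chem.MolFromSmiles(smiles)

# 2048-bit fingerprint: each set bit flags the presence of a local
# substructure within `radius` bonds of some atom.
fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=2048)
features = list(fp)  # 0/1 feature vector usable by any ML model
print(sum(features), "bits set out of", len(features))
```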

    Big-Data and Machine Learning to Revamp Computational Toxicology and its Use in Risk Assessment

    The creation of large toxicological databases and advances in machine-learning techniques have empowered computational approaches in toxicology. Work with these large databases of regulatory data has allowed the reproducibility of animal models to be assessed, highlighting weaknesses in traditional in vivo methods. This should lower the bar for the introduction of new approaches, and it establishes a benchmark for what is achievable by any alternative method validated against these animal tests. Quantitative structure-activity relationship (QSAR) models for skin sensitization, eye irritation, and other human health hazards built on these big databases have, however, also made apparent some of the challenges facing computational modeling, including validation, model interpretation, and model selection. A first implementation of machine learning-based predictions, termed REACHacross, achieved unprecedented sensitivities of >80% with specificities >70% in predicting the six most common acute and topical hazards, covering about two thirds of the chemical universe. While this is awaiting formal validation, it demonstrates the new quality introduced by big data and modern data-mining technologies. The rapid increase in the diversity and number of computational models, as well as of the data they are based on, creates both challenges and opportunities for the use of computational methods.
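    The performance figures quoted above (sensitivity, specificity) and the balanced accuracies reported for related models all derive from a confusion matrix. The sketch below shows the arithmetic with made-up counts chosen only to echo the quoted ranges.

```python
# Confusion-matrix metrics with illustrative counts (not real data).

tp, fn = 82, 18   # toxic chemicals: correctly flagged / missed
tn, fp = 140, 60  # non-toxic chemicals: correctly cleared / falsely flagged

sensitivity = tp / (tp + fn)            # recall on the toxic class
specificity = tn / (tn + fp)            # recall on the non-toxic class
balanced_accuracy = (sensitivity + specificity) / 2

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"balanced accuracy {balanced_accuracy:.0%}")
# -> sensitivity 82%, specificity 70%, balanced accuracy 76%
```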

    Machine Learning of Toxicological Big Data Enables Read-Across Structure Activity Relationships (RASAR) Outperforming Animal Test Reproducibility

    Earlier we created a chemical hazard database of approximately 10,000 chemicals via natural language processing of dossiers submitted to the European Chemicals Agency. We identified repeated OECD guideline tests to establish the reproducibility of acute oral and dermal toxicity, eye and skin irritation, mutagenicity, and skin sensitization. Based on 350 to 700+ chemicals each, the probability that an OECD guideline animal test would output the same result in a repeat test was 78%–96% (sensitivity 50%–87%). An expanded database with more than 866,000 chemical properties/hazards was used as training data to model health hazards and chemical properties. The constructed models automate and extend the read-across method of chemical classification. The novel models, called RASARs (read-across structure-activity relationships), use binary fingerprints and Jaccard distance to define chemical similarity. A large chemical-similarity adjacency matrix is constructed from this similarity metric and used to derive feature vectors for supervised learning. We show results on nine health hazards from two kinds of RASAR: “Simple” and “Data Fusion”. The “Simple” RASAR seeks to duplicate the traditional read-across method, predicting hazard from chemical analogs with known hazard data. The “Data Fusion” RASAR extends this concept by creating large feature vectors from all available property data rather than only the modeled hazard. Simple RASAR models tested in cross-validation achieve 70%–80% balanced accuracy with constraints on tested compounds. Cross-validation of Data Fusion RASARs shows balanced accuracies in the 80%–95% range across nine health hazards with no constraints on tested compounds.
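    A hedged sketch of the similarity machinery the paper describes: Jaccard similarity over binary fingerprints, a pairwise similarity (adjacency) matrix, and a “Simple” RASAR-style prediction from the most similar labeled analog. The fingerprints and labels below are toy values, not the paper's data, and the full method derives richer feature vectors from the adjacency matrix than this nearest-analog shortcut.

```python
# Toy illustration of Jaccard-based chemical similarity and a
# nearest-analog ("Simple" RASAR-style) hazard prediction.
import numpy as np

def jaccard_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Jaccard = |intersection| / |union| of the set bits."""
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return intersection / union if union else 0.0

# Toy binary fingerprints for four training chemicals and their labels.
train_fps = np.array([[1, 1, 0, 1, 0],
                      [1, 0, 0, 1, 0],
                      [0, 1, 1, 0, 1],
                      [0, 0, 1, 0, 1]])
train_labels = np.array([1, 1, 0, 0])  # 1 = hazardous, 0 = not

# Pairwise similarity (adjacency) matrix over the training set; the
# paper derives supervised-learning features from such a matrix.
n = len(train_fps)
adjacency = np.array([[jaccard_similarity(train_fps[i], train_fps[j])
                       for j in range(n)] for i in range(n)])

def simple_rasar_predict(query_fp: np.ndarray) -> int:
    """Predict the label of the most similar training chemical."""
    sims = [jaccard_similarity(query_fp, fp) for fp in train_fps]
    return int(train_labels[int(np.argmax(sims))])

print(simple_rasar_predict(np.array([1, 1, 0, 0, 0])))  # -> 1
```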

    Integrated testing strategies for safety assessments

    Despite the fact that toxicology uses many stand-alone tests, a systematic combination of several information sources is very often required. Examples include: when not all possible outcomes of interest (e.g., modes of action), classes of test substances (applicability domains), or severity classes of effect are covered in a single test; when the positive test result is rare (low prevalence leading to excessive false-positive results); and when the gold-standard test is too costly or uses too many animals, creating a need for prioritization by screening. Similarly, tests are combined when the human predictivity of a single test is not satisfactory, or when existing data and evidence from various tests must be integrated. Increasingly, kinetic information will also be integrated to extrapolate from in vitro data to the in vivo situation. Integrated Testing Strategies (ITS) offer a solution to these problems. ITS have been discussed for more than a decade, and some attempts have been made in test guidance for regulations. Despite their obvious potential for revamping regulatory toxicology, however, we still have little guidance on the composition, validation, and adaptation of ITS for different purposes. Similarly, Weight of Evidence and Evidence-based Toxicology approaches require different pieces of evidence and test data to be weighed and combined. ITS also represent the logical way of combining pathway-based tests, as suggested in Toxicology for the 21st Century. This paper describes the state of the art of ITS and makes suggestions as to the definition, systematic combination, and quality assurance of ITS.
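    The low-prevalence problem named above can be made concrete with Bayes' rule: even a good stand-alone test yields mostly false positives when the hazard is rare, which is one motivation for chaining a cheap screen with a confirmatory test. The numbers below are illustrative, and the two-stage calculation assumes the tests err independently.

```python
# Why low prevalence produces excessive false positives, and how a
# screen-then-confirm strategy helps. All numbers are illustrative.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(truly hazardous | test positive), by Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A rare hazard (1% prevalence) with a decent stand-alone test:
ppv_single = positive_predictive_value(0.01, 0.90, 0.90)
print(f"single test PPV: {ppv_single:.0%}")  # ~8%: mostly false alarms

# Screen first, then confirm: the screen's positives become the
# confirmatory test's enriched input population.
ppv_after_screen = positive_predictive_value(ppv_single, 0.90, 0.90)
print(f"after confirmation: {ppv_after_screen:.0%}")  # ~45%
```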