
    An Ensemble Model of QSAR Tools for Regulatory Risk Assessment

    Quantitative structure-activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for the protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a cut-off parameter that allows selection of the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) in predicting carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the Gold carcinogenic potency database (480 chemicals). Leave-one-out cross-validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %; balanced accuracy: 80.6 % and 80.8 %) and the highest inter-rater agreement [kappa (κ): 0.63 and 0.62] for both datasets. The ROC curves demonstrate the utility of the cut-off feature in the predictive ability of the ensemble model. This feature provides regulators with an additional control for grading a chemical based on the severity of the toxic endpoint under study.
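The consensus-with-cut-off mechanism described in this abstract can be sketched in a few lines, assuming a naive Bayes combination of per-tool binary votes. The tool sensitivities, specificities and votes below are invented for illustration; this is not the authors' actual model or data:

```python
# Minimal naive Bayes ensemble sketch, assuming each QSAR tool casts a binary
# vote and has known validation statistics. All numbers below are invented for
# illustration; this is not the authors' actual model, tools, or data.

def ensemble_posterior(votes, sens, spec, prior=0.5):
    """Posterior P(carcinogen | tool votes) under a naive Bayes assumption.

    sens[i] = P(tool i votes positive | carcinogen),
    spec[i] = P(tool i votes negative | non-carcinogen).
    """
    p_pos, p_neg = prior, 1.0 - prior
    for vote, se, sp in zip(votes, sens, spec):
        if vote:  # tool predicts carcinogenic
            p_pos *= se
            p_neg *= 1.0 - sp
        else:     # tool predicts non-carcinogenic
            p_pos *= 1.0 - se
            p_neg *= sp
    return p_pos / (p_pos + p_neg)

def classify(votes, sens, spec, cutoff=0.5):
    """Flag a chemical as carcinogenic when the posterior exceeds the cut-off."""
    return ensemble_posterior(votes, sens, spec) >= cutoff

# Four hypothetical tools with assumed validation statistics
sens = [0.80, 0.75, 0.70, 0.85]
spec = [0.70, 0.80, 0.75, 0.65]
votes = [True, True, False, True]  # three of four tools predict carcinogenic

print(round(ensemble_posterior(votes, sens, spec), 3))  # → 0.907
print(classify(votes, sens, spec, cutoff=0.3))          # → True
```

Lowering the cut-off makes the ensemble more conservative: more chemicals are flagged as carcinogenic, raising sensitivity (fewer false negatives) at the cost of specificity, which matches the regulatory preference stated in the abstract.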

    Hybrid Computational Toxicology Models for Regulatory Risk Assessment

    Computational toxicology is the development of quantitative structure-activity relationship (QSAR) models that relate a quantitative measure of chemical structure to a biological effect. In silico QSAR tools are widely accepted as a faster alternative to time-consuming clinical and animal testing methods for regulatory risk assessment of xenobiotics used in consumer products. However, different QSAR tools often make contrasting predictions for a new xenobiotic and may also vary in their predictive ability for different classes of xenobiotics. This makes their use challenging, especially in regulatory applications, where transparency and interpretation of predictions play a crucial role in the development of safety assessment decisions. Recent efforts in computational toxicology involve the use of in vitro data, which enables better insight into the mode of action of xenobiotics and identification of potential mechanism(s) of toxicity. To ensure that in silico models are robust and reliable before they can be used for regulatory applications, the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) initiative and the Organisation for Economic Co-operation and Development (OECD) have established legislative guidelines for their validation. This dissertation addresses the limitations in the use of current QSAR tools for regulatory risk assessment within REACH/OECD guidelines. The first contribution is an ensemble model that combines the predictions from four QSAR tools to improve the quality of predictions. The model presents a novel mechanism to select a desired trade-off between false positive and false negative predictions. The second contribution is the introduction of quantitative biological activity relationship (QBAR) models that use mechanistically relevant in vitro data as biological descriptors for the development of computational toxicology models. Two novel applications are presented that demonstrate that QBAR models can sufficiently predict carcinogenicity where QSAR model predictions may fail. The third contribution is the development of two novel methods that explore the synergistic use of structural and biological similarity data for carcinogenicity prediction. Two applications are presented that demonstrate the feasibility of the proposed methods within REACH/OECD guidelines. These contributions lay the foundation for the development of novel mechanism-based in silico tools for mechanistically complex toxic endpoints to successfully advance the field of computational toxicology.

    Integrating in silico models and read-across methods for predicting toxicity of chemicals: A step-wise strategy

    In silico methods and models are increasingly used for predicting properties of chemicals for hazard identification and hazard characterisation in the absence of experimental toxicity data. Many in silico models are available and can be used individually or in an integrated fashion. Whilst such models offer major benefits to toxicologists, risk assessors and the global scientific community, the lack of a consistent framework for the integration of in silico results can lead to uncertainty and even contradictions across models and users, even for the same chemicals. In this context, a range of methods for integrating in silico results have been proposed on a statistical or case-specific basis. Read-across constitutes another strategy for deriving reference points or points of departure for hazard characterisation of untested chemicals, from the available experimental data for structurally similar compounds, mostly using expert judgment. Recently a number of software systems have been developed to support experts in this task, providing a formalised and structured procedure. Such a procedure could also facilitate further integration of the results generated from in silico models and read-across. This article discusses a framework on weight of evidence published by EFSA to identify the stepwise approach for systematic integration of results or values obtained from these "non-testing methods". Key criteria and best practices for selecting and evaluating individual in silico models are also described, together with the means of combining the results, taking into account any limitations, and identifying strategies that are likely to provide consistent results.
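Read-across as described in this abstract can be sketched as a similarity-weighted estimate over tested analogues. The fingerprints (given as sets of "on" bit indices), toxicity values and thresholds below are invented for illustration and are not from any real dataset or software system:

```python
# Minimal read-across sketch (illustrative): the toxicity of an untested
# chemical is estimated from its most structurally similar tested analogues,
# using Tanimoto similarity on binary structural fingerprints.

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity of two fingerprints given as sets of 'on' bit indices."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def read_across(target_fp, analogues, k=2, min_similarity=0.5):
    """Similarity-weighted average over the k nearest analogues above a threshold."""
    scored = sorted(((tanimoto(target_fp, fp), tox) for fp, tox in analogues),
                    reverse=True)
    top = [(s, t) for s, t in scored[:k] if s >= min_similarity]
    if not top:
        return None  # no sufficiently similar analogue: read-across not justified
    return sum(s * t for s, t in top) / sum(s for s, _ in top)

# Hypothetical tested analogues: (fingerprint, measured toxicity value)
analogues = [
    ({1, 2, 3, 5, 8}, 0.9),
    ({1, 2, 3, 4}, 0.7),
    ({7, 9, 11}, 0.1),
]
target = {1, 2, 3, 5}  # untested chemical

print(round(read_across(target, analogues), 3))  # → 0.814
```

The `min_similarity` guard mirrors the expert-judgment step: when no analogue is close enough, the method declines to predict rather than extrapolate.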

    Transcriptomics in Toxicogenomics, Part III : Data Modelling for Risk Assessment

    Transcriptomics data are relevant to address a number of challenges in Toxicogenomics (TGx). After careful planning of exposure conditions and data preprocessing, the TGx data can be used in predictive toxicology, where more advanced modelling techniques are applied. The large volume of molecular profiles produced by omics-based technologies allows the development and application of artificial intelligence (AI) methods in TGx. Indeed, the publicly available omics datasets are constantly increasing together with a plethora of different methods that are made available to facilitate their analysis, interpretation and the generation of accurate and stable predictive models. In this review, we present the state-of-the-art of data modelling applied to transcriptomics data in TGx. We show how the benchmark dose (BMD) analysis can be applied to TGx data. We review read across and adverse outcome pathways (AOP) modelling methodologies. We discuss how network-based approaches can be successfully employed to clarify the mechanism of action (MOA) or specific biomarkers of exposure. We also describe the main AI methodologies applied to TGx data to create predictive classification and regression models and we address current challenges. Finally, we present a short description of deep learning (DL) and data integration methodologies applied in these contexts. Modelling of TGx data represents a valuable tool for more accurate chemical safety assessment. This review is the third part of a three-article series on Transcriptomics in Toxicogenomics.
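As a minimal illustration of the benchmark dose (BMD) idea mentioned in this abstract: fit a dose-response model to one gene's expression and solve for the dose at which the fitted response departs from the modelled control by a fixed benchmark response. The sketch below assumes a linear model and invented data; real TGx BMD workflows fit and compare several models (e.g. Hill, power, polynomial):

```python
# Illustrative benchmark-dose (BMD) sketch with an assumed linear dose-response.
# Doses and responses are invented for illustration only.

def fit_line(doses, response):
    """Ordinary least-squares fit of response = a + b * dose."""
    n = len(doses)
    mx = sum(doses) / n
    my = sum(response) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(doses, response))
    sxx = sum((x - mx) ** 2 for x in doses)
    b = sxy / sxx
    a = my - b * mx
    return a, b

def benchmark_dose(doses, response, bmr_fraction=0.10):
    """Dose at which the fitted response exceeds the modelled control (a)
    by bmr_fraction, i.e. solve a + b * BMD = a * (1 + bmr_fraction)."""
    a, b = fit_line(doses, response)
    return (bmr_fraction * a) / b

doses = [0.0, 0.1, 0.5, 1.0, 2.0]            # hypothetical exposure doses
response = [1.00, 1.05, 1.30, 1.62, 2.18]    # hypothetical expression of one gene

print(round(benchmark_dose(doses, response), 3))  # → 0.168
```

In practice this per-gene calculation is repeated genome-wide and summarised (e.g. per pathway) to derive a transcriptomic point of departure.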

    Review of QSAR Models and Software Tools for Predicting Genotoxicity and Carcinogenicity

    This review of QSARs for genotoxicity and carcinogenicity was performed in a broad sense, considering both models available in software tools and models that are published in the literature. The review considered the potential applicability of diverse models to pesticides as well as to other types of regulated chemicals and pharmaceuticals. The availability of models and information on their applicability is summarised in tables, and a range of illustrative or informative examples are described in more detail in the text. In many cases, promising models were identified but they are still at the research stage. For routine application in a regulatory setting, further efforts will be needed to explore the applicability of such models for specific purposes, and to implement them in a practically useful form (i.e. user-friendly software). It is also noted that a range of software tools are research tools suitable for model development, and these require more specialised expertise than other tools that are aimed primarily at end-users such as risk assessors. It is concluded that the most useful models are those which are implemented in software tools and associated with transparent documentation on the model development and validation process. However, it is emphasised that the assessment of model predictions requires a reasonable amount of QSAR knowledge, even if it is not necessary to be a QSAR practitioner.

    Prediction of ADME-Tox properties and toxicological endpoints of triazole fungicides used for cereals protection

    In this study we considered 9 triazole fungicides approved for use in the European Union for protecting cereals: cyproconazole, epoxiconazole, flutriafol, metconazole, paclobutrazole, tebuconazole, tetraconazole, triadimenol and triticonazole. We summarized the few available data on their effects on humans and used various computational tools to obtain a broad view of their possible harmful effects on humans. The results of our predictive study indicate that all triazole fungicides considered here show good oral bioavailability and are predicted to penetrate the blood-brain barrier and to interact with P-glycoprotein and with hepatic cytochromes. The predictions concerning the toxicological endpoints for the investigated triazole fungicides indicate a potential for skin sensitization, blockage of the hERG K+ channels and endocrine disruption; they are predicted to have no mutagenic potential, and their carcinogenic potential is not clear. Epoxiconazole and triadimenol are predicted to have the highest potential for producing numerous harmful effects on humans, and their use should be avoided or limited.

    (Q)SAR Modelling of Nanomaterial Toxicity - A Critical Review

    There is increasing recognition that nanomaterials pose a risk to human health, and that the novel engineered nanomaterials (ENMs) of the nanotechnology industry, with their growing industrial usage, pose the most immediate problem for hazard assessment, as many of them remain untested. The large number of materials and their variants (different sizes and coatings, for instance) that require testing, together with ethical pressure towards non-animal testing, means that expensive animal bioassays are precluded, and the use of (quantitative) structure-activity relationship ((Q)SAR) models as an alternative source of hazard information should be explored. (Q)SAR modelling can be applied to fill critical knowledge gaps by making the best use of existing data, prioritizing the physicochemical parameters driving toxicity, and providing practical solutions to the risk assessment problems caused by the diversity of ENMs. This paper covers the core components required for successful application of (Q)SAR technologies to ENM toxicity prediction, summarizes the published nano-(Q)SAR studies, and outlines the challenges ahead for nano-(Q)SAR modelling. It provides a critical review of (1) the present status of the availability of ENM characterization/toxicity data, (2) the characterization of nanostructures that meets the needs of (Q)SAR analysis, (3) a summary of published nano-(Q)SAR studies and their limitations, (4) in silico tools for (Q)SAR screening of nanotoxicity and (5) prospective directions for the development of nano-(Q)SAR models.

    The Use of Computational Methods in the Toxicological Assessment of Chemicals in Food: Current Status and Future Prospects

    A wide range of chemicals are intentionally added to, or unintentionally found in, food products, often in very small amounts. Depending on the situation, the experimental data needed to complete a dietary risk assessment, which is the scientific basis for protecting human health, may not be available or obtainable, for reasons of cost, time and animal welfare. For example, toxicity data are often lacking for the metabolites and degradation products of pesticide active ingredients. There is therefore an interest in the development and application of efficient and effective non-animal methods for assessing chemical toxicity, including Quantitative Structure-Activity Relationship (QSAR) models and related computational methods. This report gives an overview of how computational methods are currently used in the field of food safety by national regulatory bodies, international advisory organisations and the food industry. On the basis of an international survey, a comprehensive literature review and a detailed QSAR analysis, a range of recommendations are made with the long-term aim of promoting the judicious use of suitable QSAR methods. The current status of QSAR methods is reviewed not only for toxicological endpoints relevant to dietary risk assessment, but also for Absorption, Distribution, Metabolism and Excretion (ADME) properties, which are often important in discriminating between the toxicological profiles of parent compounds and their reaction products. By referring to the concept of the Threshold of Toxicological Concern (TTC), the risk assessment context in which QSAR methods can be expected to be used is also discussed. This Joint Research Centre (JRC) Reference Report provides a summary and update of the findings obtained in a study carried out by the JRC under the terms of a contract awarded by the European Food Safety Authority (EFSA).

    A Framework for assessing in silico Toxicity Predictions: Case Studies with selected Pesticides

    In the regulatory assessment of chemicals, the use of in silico prediction methods such as (quantitative) structure-activity relationship models ([Q]SARs) is increasingly required or encouraged, in order to increase the efficiency and effectiveness of the risk assessment process and to minimise the reliance on animal testing. The main question for the assessor concerns the usefulness of the prediction approach, which can be broken down into the practical applicability of the method and the adequacy of the predictions. A framework for assessing and documenting (Q)SAR models and their predictions has been established at the European and international levels. Exactly how the framework is applied in practice will depend on the provisions of the specific legislation and the context in which the non-testing data are being used. This report describes the current framework for documenting (Q)SAR models and their predictions, and discusses how it might be built upon to provide more detailed guidance on the use of (Q)SAR predictions in regulatory decision making. The proposed framework is illustrated by using selected pesticide active compounds as examples.