
    Development of a hybrid Bayesian network model for predicting acute fish toxicity using multiple lines of evidence

    A hybrid Bayesian network (BN) was developed for predicting the acute toxicity of chemicals to fish, using data from fish embryo toxicity (FET) testing in combination with other information. This model can support the use of FET data in a Weight-of-Evidence (WOE) approach for replacing the use of juvenile fish. The BN predicted correct toxicity intervals for 69%–80% of the tested substances. The model was most sensitive to components quantified by toxicity data, and least sensitive to components quantified by expert knowledge. The model is publicly available through a web interface. Further development of this model should include additional lines of evidence, refinement of the discretisation, and training with a larger dataset for weighting of the lines of evidence. A refined version of this model can be a useful tool for predicting acute fish toxicity, and a contribution to more quantitative WOE approaches for ecotoxicology and environmental assessment more generally.
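    The abstract above notes that the BN reasons over discrete toxicity intervals and that the discretisation could be refined. The snippet below is a minimal sketch of that discretisation step only; the interval boundaries and labels are invented for illustration and are not the model's actual ones.

```python
# Minimal sketch (not the published model): discretising a continuous LC50 value
# into the kind of toxicity intervals used as node states in a Bayesian network.
# The interval boundaries below are illustrative assumptions, not the paper's.

import bisect

# Hypothetical LC50 boundaries in mg/L, from most to least toxic
BOUNDARIES = [1.0, 10.0, 100.0]                # edges between intervals
INTERVALS = ["<1", "1-10", "10-100", ">100"]   # discrete node states

def toxicity_interval(lc50_mg_per_l: float) -> str:
    """Map a continuous LC50 estimate to a discrete toxicity interval."""
    return INTERVALS[bisect.bisect_right(BOUNDARIES, lc50_mg_per_l)]

if __name__ == "__main__":
    for value in (0.3, 5.0, 42.0, 250.0):
        print(f"LC50 = {value:6.1f} mg/L -> interval {toxicity_interval(value)}")
```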

    A Decision Support System to Predict Acute Fish Toxicity

    We present a decision support system using a Bayesian network to predict acute fish toxicity from multiple lines of evidence. Fish embryo toxicity testing has been proposed as an alternative to using juvenile or adult fish in acute toxicity testing for hazard assessments of chemicals. The European Chemicals Agency has recommended the development of a so-called weight-of-evidence approach for strengthening the evidence from fish embryo toxicity testing. While weight-of-evidence approaches in the ecotoxicology and ecological risk assessment community have in the past been largely qualitative, we have developed a Bayesian network for using fish embryo toxicity data in a quantitative approach. The system enables users to efficiently predict the potential toxicity of a chemical substance based on multiple types of evidence, including physical and chemical properties, quantitative structure-activity relationships, toxicity to algae and daphnids, and fish gill cytotoxicity. The system is demonstrated on three chemical substances of different levels of toxicity. It is considered a promising step towards a probabilistic weight-of-evidence approach to predict acute fish toxicity from fish embryo toxicity.
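    To make the idea of combining multiple lines of evidence concrete, here is a small, self-contained sketch of a naive-Bayes-style update over discrete toxicity states. The state labels, evidence names and likelihood values are invented for illustration; the published decision support system uses a full Bayesian network rather than this simplified independence assumption.

```python
# Illustrative sketch only: combining several lines of evidence about a chemical
# into a posterior over discrete toxicity states with a naive Bayes update.
# The states, evidence names and likelihood tables are assumptions for demonstration.

INTERVALS = ["high", "moderate", "low"]          # discrete toxicity states

# P(evidence outcome observed | toxicity state), one entry per state
LIKELIHOODS = {
    "fish_embryo_test_positive": {"high": 0.85, "moderate": 0.50, "low": 0.10},
    "algae_daphnia_toxic":       {"high": 0.70, "moderate": 0.45, "low": 0.15},
    "qsar_predicts_toxic":       {"high": 0.75, "moderate": 0.40, "low": 0.20},
}

def posterior(prior, observed_evidence):
    """Multiply the prior by each evidence likelihood, then renormalise."""
    post = dict(prior)
    for evidence in observed_evidence:
        table = LIKELIHOODS[evidence]
        for state in INTERVALS:
            post[state] *= table[state]
    total = sum(post.values())
    return {state: p / total for state, p in post.items()}

if __name__ == "__main__":
    prior = {state: 1 / len(INTERVALS) for state in INTERVALS}
    print(posterior(prior, ["fish_embryo_test_positive", "qsar_predicts_toxic"]))
```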

    Integrating in silico models and read-across methods for predicting toxicity of chemicals: A step-wise strategy

    In silico methods and models are increasingly used for predicting properties of chemicals for hazard identification and hazard characterisation in the absence of experimental toxicity data. Many in silico models are available and can be used individually or in an integrated fashion. Whilst such models offer major benefits to toxicologists, risk assessors and the global scientific community, the lack of a consistent framework for the integration of in silico results can lead to uncertainty and even contradictions across models and users, even for the same chemicals. In this context, a range of methods for integrating in silico results have been proposed on a statistical or case-specific basis. Read-across constitutes another strategy for deriving reference points or points of departure for hazard characterisation of untested chemicals from the available experimental data for structurally similar compounds, mostly using expert judgment. Recently, a number of software systems have been developed to support experts in this task, providing a formalised and structured procedure. Such a procedure could also facilitate further integration of the results generated from in silico models and read-across. This article discusses a framework on weight of evidence published by EFSA to identify the stepwise approach for systematic integration of results or values obtained from these "non-testing methods". Key criteria and best practices for selecting and evaluating individual in silico models are also described, together with the means of combining the results, taking into account any limitations, and identifying strategies that are likely to provide consistent results.
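    Read-across, as described above, estimates the toxicity of an untested chemical from experimental data for structurally similar compounds. The sketch below shows one simple, hypothetical way to formalise that idea (similarity-weighted averaging over the most similar tested analogues); the fingerprints, similarity metric and toxicity values are placeholders and are not part of the EFSA framework or any specific software system.

```python
# A minimal read-across sketch: the toxicity of an untested chemical is estimated
# from experimentally tested analogues, weighted by structural similarity.
# Fingerprints and toxicity values below are invented placeholders.

def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto similarity between two fingerprints given as sets of feature bits."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def read_across(target_fp, analogues, k=3):
    """Similarity-weighted mean toxicity of the k most similar tested analogues."""
    scored = sorted(
        ((tanimoto(target_fp, fp), tox) for fp, tox in analogues),
        reverse=True,
    )[:k]
    weight_sum = sum(sim for sim, _ in scored)
    if weight_sum == 0:
        raise ValueError("No structural overlap with any tested analogue")
    return sum(sim * tox for sim, tox in scored) / weight_sum

if __name__ == "__main__":
    # Hypothetical fingerprints (sets of feature indices) and log LC50 values
    tested = [({1, 2, 3, 5}, 1.2), ({1, 2, 4}, 0.8), ({6, 7, 8}, 2.5)]
    print(read_across({1, 2, 3, 4}, tested))
```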

    From Molecular Descriptors to Intrinsic Fish Toxicity of Chemicals:An Alternative Approach to Chemical Prioritization

    The European and U.S. chemical agencies have listed approximately 800k chemicals about which knowledge of potential risks to human health and the environment is lacking. Filling these data gaps experimentally is impossible, so in silico approaches and prediction are essential. Many existing models are, however, limited by assumptions (e.g., linearity and continuity) and small training sets. In this study, we present a supervised direct classification model that connects molecular descriptors to toxicity. Categories can be either driven by data (using k-means clustering) or defined by regulation. This was tested via 907 experimentally defined 96 h LC50 values for acute fish toxicity. Our classification model explained ≈90% of the variance in our data for the training set and ≈80% for the test set. This strategy gave a 5-fold decrease in the frequency of incorrect categorization compared to a quantitative structure-activity relationship (QSAR) regression model. Our model was subsequently employed to predict the toxicity categories of ≈32k chemicals. A comparison between the model-based applicability domain (AD) and the training set AD was performed, suggesting that the training set-based AD is a more adequate way to avoid extrapolation when using such models. The better performance of our direct classification model compared to that of QSAR methods makes this approach a viable tool for assessing the hazards and risks of chemicals.
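    The abstract describes deriving toxicity categories from measured LC50 values (e.g. with k-means) and then training a classifier that maps molecular descriptors directly to those categories. The following sketch reproduces that general workflow on synthetic data; the descriptor set, cluster count and choice of a random forest classifier are assumptions for illustration, not the paper's exact pipeline.

```python
# Sketch of the general idea: derive toxicity categories from 96 h LC50 values
# with k-means, then train a classifier mapping molecular descriptors to those
# categories. Data are random placeholders standing in for real descriptors.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_chemicals = 500
descriptors = rng.normal(size=(n_chemicals, 8))   # stand-in molecular descriptors
log_lc50 = descriptors @ rng.normal(size=8) + rng.normal(scale=0.5, size=n_chemicals)

# Data-driven categories: cluster the 1-D log LC50 values into toxicity classes
categories = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    log_lc50.reshape(-1, 1)
)

X_train, X_test, y_train, y_test = train_test_split(
    descriptors, categories, test_size=0.2, random_state=0
)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"held-out categorisation accuracy: {clf.score(X_test, y_test):.2f}")
```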

    Alternative methods for regulatory toxicology – a state-of-the-art review

    This state-of-the-art review is based on the final report of a project carried out by the European Commission’s Joint Research Centre (JRC) for the European Chemicals Agency (ECHA). The aim of the project was to review the state of the science of non-standard methods that are available for assessing the toxicological and ecotoxicological properties of chemicals. Non-standard methods refer to alternatives to animal experiments, such as in vitro tests and computational models, as well as animal methods that are not covered by current regulatory guidelines. This report therefore reviews the current scientific status of non-standard methods for a range of human health and ecotoxicological endpoints, and provides a commentary on the mechanistic basis and regulatory applicability of these methods. For completeness, and to provide context, currently accepted (standard) methods are also summarised. In particular, the following human health endpoints are covered: a) skin irritation and corrosion; b) serious eye damage and eye irritation; c) skin sensitisation; d) acute systemic toxicity; e) repeat dose toxicity; f) genotoxicity and mutagenicity; g) carcinogenicity; h) reproductive toxicity (including effects on development and fertility); i) endocrine disruption relevant to human health; and j) toxicokinetics. In relation to ecotoxicological endpoints, the report focuses on non-standard methods for acute and chronic fish toxicity. While specific reference is made to the information needs of REACH, the Biocidal Products Regulation and the Classification, Labelling and Packaging Regulation, this review is also expected to be informative in relation to the possible use of alternative and non-standard methods in other sectors, such as cosmetics and plant protection products.

    Evolutionary Computation and QSAR Research

    The successful high throughput screening of molecule libraries for a specific biological property is one of the main improvements in drug discovery. Virtual molecular filtering and screening relies greatly on quantitative structure-activity relationship (QSAR) analysis, a mathematical model that correlates the activity of a molecule with molecular descriptors. QSAR models have the potential to reduce the costly failure of drug candidates in advanced (clinical) stages by filtering combinatorial libraries, eliminating candidates with a predicted toxic effect and poor pharmacokinetic profiles, and reducing the number of experiments. To obtain a predictive and reliable QSAR model, scientists use methods from various fields such as molecular modeling, pattern recognition, machine learning or artificial intelligence. QSAR modeling relies on three main steps: codification of the molecular structure into molecular descriptors, selection of the variables relevant to the analyzed activity, and search for the optimal mathematical model that correlates the molecular descriptors with a specific activity. Since a variety of techniques from statistics and artificial intelligence can aid the variable selection and model building steps, this review focuses on the evolutionary computation methods supporting these tasks. It explains the basics of genetic algorithms and genetic programming as evolutionary computation approaches, selection methods for high-dimensional data in QSAR, methods to build QSAR models, current evolutionary feature selection methods and applications in QSAR, and future trends in joint or multi-task feature selection methods.
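    As a concrete illustration of evolutionary feature selection in QSAR, the self-contained sketch below runs a basic genetic algorithm over binary descriptor masks on synthetic data, using the R² of an ordinary least-squares fit as the fitness function. The data, genetic operators and parameter values are illustrative assumptions rather than any specific published method.

```python
# Compact genetic-algorithm descriptor selection for a QSAR-style model on
# synthetic data; fitness is the R^2 of a least-squares fit on the selected
# descriptors, standing in for any real QSAR scoring function.

import numpy as np

rng = np.random.default_rng(1)
n_samples, n_descriptors = 200, 20
X = rng.normal(size=(n_samples, n_descriptors))
true_coef = np.zeros(n_descriptors)
true_coef[[0, 3, 7]] = [1.5, -2.0, 1.0]                    # only 3 descriptors matter
y = X @ true_coef + rng.normal(scale=0.3, size=n_samples)

def fitness(mask):
    """R^2 of an ordinary least-squares fit restricted to the selected descriptors."""
    if not mask.any():
        return -1.0
    Xs = X[:, mask]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ coef
    return 1.0 - resid.var() / y.var()

def evolve(pop_size=30, generations=40, mutation_rate=0.05):
    pop = rng.random((pop_size, n_descriptors)) < 0.5        # random bit masks
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: pop_size // 2]]                 # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_descriptors)              # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_descriptors) < mutation_rate  # bit-flip mutation
            children.append(child ^ flip)
        pop = np.vstack([parents, np.array(children)])
    best = max(pop, key=fitness)
    return np.flatnonzero(best), fitness(best)

if __name__ == "__main__":
    selected, score = evolve()
    print("selected descriptor indices:", selected, "R^2:", round(score, 3))
```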

    The Use of Computational Methods in the Toxicological Assessment of Chemicals in Food: Current Status and Future Prospects

    A wide range of chemicals are intentionally added to, or unintentionally found in, food products, often in very small amounts. Depending on the situation, the experimental data needed to complete a dietary risk assessment, which is the scientific basis for protecting human health, may not be available or obtainable, for reasons of cost, time and animal welfare. For example, toxicity data are often lacking for the metabolites and degradation products of pesticide active ingredients. There is therefore an interest in the development and application of efficient and effective non-animal methods for assessing chemical toxicity, including Quantitative Structure-Activity Relationship (QSAR) models and related computational methods. This report gives an overview of how computational methods are currently used in the field of food safety by national regulatory bodies, international advisory organisations and the food industry. On the basis of an international survey, a comprehensive literature review and a detailed QSAR analysis, a range of recommendations are made with the long-term aim of promoting the judicious use of suitable QSAR methods. The current status of QSAR methods is reviewed not only for toxicological endpoints relevant to dietary risk assessment, but also for Absorption, Distribution, Metabolism and Excretion (ADME) properties, which are often important in discriminating between the toxicological profiles of parent compounds and their reaction products. By referring to the concept of the Threshold of Toxicological Concern (TTC), the risk assessment context in which QSAR methods can be expected to be used is also discussed. This Joint Research Centre (JRC) Reference Report provides a summary and update of the findings obtained in a study carried out by the JRC under the terms of a contract awarded by the European Food Safety Authority (EFSA).
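    As a concrete illustration of the TTC concept mentioned above, the sketch below compares an estimated dietary exposure against class-specific human exposure thresholds. The threshold values are those commonly cited in the TTC literature for the Cramer structural classes, and the example exposure is hypothetical; both should be checked against current EFSA guidance before any real use.

```python
# Illustrative Threshold of Toxicological Concern (TTC) check: an estimated dietary
# exposure is compared with a class-specific threshold in µg/person/day. The Cramer
# class values below are the commonly cited ones; verify against current guidance.

TTC_THRESHOLDS_UG_PER_DAY = {
    "Cramer I": 1800.0,    # low structural concern
    "Cramer II": 540.0,    # intermediate concern
    "Cramer III": 90.0,    # high concern
}

def below_ttc(exposure_ug_per_day: float, cramer_class: str) -> bool:
    """True if the estimated exposure is below the TTC for the given Cramer class."""
    return exposure_ug_per_day < TTC_THRESHOLDS_UG_PER_DAY[cramer_class]

if __name__ == "__main__":
    # Hypothetical metabolite with an estimated intake of 12 µg/person/day
    print(below_ttc(12.0, "Cramer III"))   # True: below the 90 µg/day threshold
```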

    Systems level analysis of non-model organisms: a tool for understanding environmental stress

    Omics techniques are changing the focus of ecotoxicology. In addition to the challenges posed by large amounts of data, non-model species present further difficulties, from a lack of annotation to a limited number of databases for molecular interactions and functions. In this thesis, I demonstrate the use of systems biology to relate molecular measurements to physiological parameters in non-model species in the context of environmental stress. Firstly, I build a dynamic, data-driven model of how gene expression changes in relation to nerve conductance in the earthworm Eisenia fetida exposed to single chemicals in the laboratory. The model reveals that gene expression changes might reflect the recovery from nerve damage. Using a similar approach, I model the annual cycle of the blue mussel Mytilus edulis sampled from its natural environment, integrating 1H-NMR metabolite levels with physiological and environmental parameters. I then challenge this model, built from reference-site data, with mussels sampled from an industrial harbour to identify site effects. Finally, I use systems biology to relate changing chemical concentrations and traditional toxicity assays in an effluent remediation system to stickleback gene expression and morphology. I demonstrate that data-driven systems biology can help the interpretation of complex problems. Supplementary data for this thesis can be found on the University of Birmingham eData repository at: https://doi.org/10.25500/edata.bham.0000065