
    The Use of Computational Methods in the Toxicological Assessment of Chemicals in Food: Current Status and Future Prospects

    A wide range of chemicals are intentionally added to, or unintentionally found in, food products, often in very small amounts. Depending on the situation, the experimental data needed to complete a dietary risk assessment, which is the scientific basis for protecting human health, may not be available or obtainable, for reasons of cost, time and animal welfare. For example, toxicity data are often lacking for the metabolites and degradation products of pesticide active ingredients. There is therefore an interest in the development and application of efficient and effective non-animal methods for assessing chemical toxicity, including Quantitative Structure-Activity Relationship (QSAR) models and related computational methods. This report gives an overview of how computational methods are currently used in the field of food safety by national regulatory bodies, international advisory organisations and the food industry. On the basis of an international survey, a comprehensive literature review and a detailed QSAR analysis, a range of recommendations are made with the long-term aim of promoting the judicious use of suitable QSAR methods. The current status of QSAR methods is reviewed not only for toxicological endpoints relevant to dietary risk assessment, but also for Absorption, Distribution, Metabolism and Excretion (ADME) properties, which are often important in discriminating between the toxicological profiles of parent compounds and their reaction products. By referring to the concept of the Threshold of Toxicological Concern (TTC), the risk assessment context in which QSAR methods can be expected to be used is also discussed. This Joint Research Centre (JRC) Reference Report provides a summary and update of the findings obtained in a study carried out by the JRC under the terms of a contract awarded by the European Food Safety Authority (EFSA).
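    The QSAR models referred to above relate calculated molecular descriptors to a toxicological endpoint. The sketch below is a minimal illustration of that idea, assuming RDKit and scikit-learn are available; the descriptors, toy training data and endpoint are placeholders for illustration only, not the models assessed in the report.

```python
# Minimal QSAR sketch: predict a binary toxicity label from simple RDKit
# descriptors. Assumes RDKit and scikit-learn are installed; the SMILES,
# labels, and descriptor choice are illustrative placeholders.
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestClassifier

def featurise(smiles):
    """Return a small descriptor vector for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    return [
        Descriptors.MolWt(mol),        # molecular weight
        Descriptors.MolLogP(mol),      # octanol-water partition coefficient
        Descriptors.TPSA(mol),         # topological polar surface area
        Descriptors.NumHDonors(mol),   # hydrogen-bond donors
    ]

# Toy training set: (SMILES, toxic?) pairs standing in for curated data.
train = [("CCO", 0), ("c1ccccc1", 1), ("CC(=O)O", 0), ("c1ccc2ccccc2c1", 1)]
X = [featurise(s) for s, _ in train]
y = [label for _, label in train]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([featurise("CCN")]))  # predict for a query structure
```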

    Perspectives for integrating human and environmental risk assessment and synergies with socio-economic analysis

    For more than a decade, the integration of human and environmental risk assessment (RA) has been an attractive vision. At the same time, existing European regulations of chemical substances such as REACH (EC Regulation No. 1907/2006), the Plant Protection Products Regulation (EC Regulation 1107/2009) and the Biocide Regulation (EC Regulation 528/2012) continue to ask for sector-specific RAs, each of which has its individual information requirements regarding exposure and hazard data, and also uses different methodologies for the ultimate risk quantification. In response to this difference between the vision for integration and the current scientific and regulatory practice, the present paper outlines five medium-term opportunities for integrating human and environmental RA, followed by detailed discussions of the associated major components and their state of the art. Current hazard assessment approaches are analyzed in terms of data availability and quality, covering non-test tools, the integrated testing strategy (ITS) approach, the adverse outcome pathway (AOP) concept, methods for assessing uncertainty, and the issue of explicitly treating mixture toxicity. With respect to exposure, opportunities for integrating exposure assessment are discussed, taking into account the uncertainty, standardization and validation of exposure modeling as well as the availability of exposure data. A further focus is on ways to complement RA with a socio-economic assessment (SEA) in order to better inform decisions on risk management options. In this way, the present analysis, developed as part of the EU FP7 project HEROIC, may contribute to paving the way for integrating, where useful and possible, human and environmental RA in a manner suitable for its coupling with SEA.

    Collaborative development of predictive toxicology applications

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals.
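    OpenTox exposes its data, algorithm and model resources as web services behind APIs. The sketch below shows, under stated assumptions, how a client might query such a service over HTTP; the base URL, resource paths and response schema are hypothetical placeholders, not the documented OpenTox interface.

```python
# Hedged sketch of a REST client for a predictive-toxicology web service in
# the style of the OpenTox framework. The base URL, resource paths, and
# response fields are hypothetical placeholders; consult the actual OpenTox
# API documentation for the real interface.
import requests

BASE = "https://example.org/opentox"   # hypothetical service root

# List available (Q)SAR models (assumed to be exposed as a collection).
models = requests.get(f"{BASE}/model", headers={"Accept": "application/json"})
models.raise_for_status()

# Ask one model to predict a dataset, identified here by a made-up URI.
resp = requests.post(
    f"{BASE}/model/mutagenicity-1",
    data={"dataset_uri": f"{BASE}/dataset/42"},
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
print(resp.json())  # predictions, in whatever schema the service defines
```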

    Integration of Toxicity Data from Experiments and Non-Testing Methods within a Weight of Evidence Procedure

    Assessment of human health and environmental risk is based on multiple sources of information, requiring the integration of the lines of evidence in order to reach a conclusion. There is an increasing need for data to fill the gaps and for new methods of data integration. From a regulatory point of view, risk assessors take advantage of all the available data by means of weight of evidence (WOE) and expert judgement approaches to develop conclusions about the risk posed by chemicals and also nanoparticles. The integration of physico-chemical properties and toxicological effects sheds light on relationships between molecular properties and biological effects, leading to non-testing methods. (Quantitative) structure-activity relationship ((Q)SAR) and read-across are examples of non-testing methods. In this dissertation, (i) two new structure-based carcinogenicity models, (ii) ToxDelta, a new read-across model for the mutagenicity endpoint, and (iii) a genotoxicity model for metal oxide nanoparticles are introduced. For the latter, a best professional judgement method is employed to select reliable data from scientific publications and develop a database of nanomaterials with their genotoxicity effects, and a decision tree model is developed for the classification of these nanomaterials. The (Q)SAR models used in qualitative WOE approaches often lack transparency, resulting in risk estimates whose uncertainties need to be quantified. Our two structure-based carcinogenicity models provide transparent reasoning in their predictions. Additionally, ToxDelta provides better-supported read-across reasoning based on the analysis of differences between molecular structures. We propose a basic qualitative WOE framework that couples the predictions of in silico models with the inspection of similar compounds. We demonstrate the application of this framework to two realistic case studies, and discuss how to deal with different and sometimes conflicting data obtained from various in silico models in qualitative WOE terms, so as to facilitate the structured and transparent development of answers to scientific questions.
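    The nanomaterial part of the work describes a decision tree trained on curated data to classify genotoxicity. The sketch below illustrates that kind of classifier with scikit-learn; the features (particle size, surface area, zeta potential) and the toy labels are assumptions for illustration, not the descriptors or dataset used in the dissertation.

```python
# Minimal sketch of a decision-tree classifier for nanomaterial genotoxicity,
# assuming scikit-learn. The features and the toy data are illustrative
# placeholders, not the curated database described above.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [particle size (nm), specific surface area (m2/g), zeta potential (mV)]
X = [
    [20, 150, -30],
    [90,  40, -10],
    [15, 200, -45],
    [60,  60,   5],
]
y = [1, 0, 1, 0]  # 1 = genotoxic, 0 = non-genotoxic (toy labels)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["size_nm", "ssa_m2_g", "zeta_mV"]))
print(tree.predict([[25, 120, -35]]))  # classify a new nanomaterial
```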

    Integrating computational methods to predict mutagenicity of aromatic azo compounds

    Azo dyes have several industrial uses. However, these dyes and their degradation products have shown mutagenicity, inducing damage in environmental and human systems. Computational methods are proposed as cheap and rapid alternatives for predicting the toxicity of azo dyes. A benchmark dataset of Ames data for 354 azo dyes was employed to develop three classification strategies using knowledge-based methods and docking simulations. Results were compared and integrated with three models from the literature, developing a series of consensus strategies. The good results confirm the usefulness of in silico methods as a support for experimental methods in predicting the mutagenicity of azo compounds.
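    The consensus strategies mentioned above combine the calls of several independent mutagenicity models into a single assessment. The sketch below shows one simple way to do this, a majority vote with ties flagged for expert review; the three model calls are hypothetical stand-ins, not outputs of the actual knowledge-based, docking or literature models.

```python
# Sketch of a simple consensus strategy over several mutagenicity classifiers.
# The individual predictions are assumed inputs; real tools would supply them.
from collections import Counter

def consensus(predictions):
    """Majority vote over per-model Ames calls ('mutagenic' / 'non-mutagenic').

    Ties are returned as 'inconclusive' so that conflicting models are
    flagged for expert review rather than silently resolved.
    """
    counts = Counter(predictions)
    top_two = counts.most_common(2)
    if len(top_two) > 1 and top_two[0][1] == top_two[1][1]:
        return "inconclusive"
    return top_two[0][0]

# Example: three model calls for one azo compound.
calls = ["mutagenic", "mutagenic", "non-mutagenic"]
print(consensus(calls))  # -> 'mutagenic'
```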

    Chemical Similarity and Threshold of Toxicological Concern (TTC) Approaches: Report of an ECB Workshop held in Ispra, November 2005

    There are many national, regional and international programmes – either regulatory or voluntary – to assess the hazards or risks of chemical substances to humans and the environment. The first step in making a hazard assessment of a chemical is to ensure that there is adequate information on each of the endpoints. If adequate information is not available, then additional data are needed to complete the dataset for the substance. For reasons of resources and animal welfare, it is important to limit the number of tests that have to be conducted, where this is scientifically justifiable. One approach is to consider closely related chemicals as a group, or chemical category, rather than as individual chemicals. In a category approach, data for chemicals and endpoints that have already been tested are used to estimate the hazard for untested chemicals and endpoints. Categories of chemicals are selected on the basis of similarities in biological activity, which is associated with a common underlying mechanism of action. A homologous series of chemicals exhibiting a coherent trend in biological activity can be rationalised on the basis of a constant change in structure. This type of grouping is relatively straightforward. The challenge lies in identifying the relevant chemical structural and physicochemical characteristics that enable more sophisticated groupings to be made on the basis of similarity in biological activity and hence purported mechanism of action. Linking two chemicals together and rationalising their similarity with reference to one or more endpoints has largely been carried out on an ad hoc basis. Even with larger groups, the process and approach are ad hoc and based on expert judgement. There still appears to be very little guidance about the tools and approaches for grouping chemicals systematically. In November 2005, the ECB Workshop on Chemical Similarity and Thresholds of Toxicological Concern (TTC) Approaches was convened to identify the available approaches that currently exist to encode similarity and how these can be used to facilitate the grouping of chemicals. This report aims to capture the main themes that were discussed. In particular, it outlines a number of different approaches that can facilitate the formation of chemical groupings in terms of the context under consideration and the likely information that would be required. Grouping methods were divided into one of four classes – knowledge-based, analogue-based, unsupervised, and supervised. A flowchart was constructed to capture a possible workflow, highlighting where and how these approaches might best be applied.
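    One concrete way to encode chemical similarity for grouping is to compare molecular fingerprints. The sketch below, assuming RDKit, groups compounds whose Morgan-fingerprint Tanimoto similarity exceeds a cutoff; the SMILES, the 0.7 threshold and the greedy grouping rule are illustrative choices, not recommendations from the workshop report.

```python
# Minimal sketch of similarity-based grouping with RDKit Morgan fingerprints
# and a Tanimoto cutoff. All inputs and parameters are illustrative.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

smiles = ["CCO", "CCCO", "CCCCO", "c1ccccc1", "c1ccccc1O"]
mols = [Chem.MolFromSmiles(s) for s in smiles]
fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

# Greedily add each compound to the first group containing a close neighbour.
cutoff = 0.7
groups = []
for i, fp in enumerate(fps):
    for group in groups:
        if any(DataStructs.TanimotoSimilarity(fp, fps[j]) >= cutoff for j in group):
            group.append(i)
            break
    else:
        groups.append([i])

for group in groups:
    print([smiles[i] for i in group])
```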

    Alternative methods for regulatory toxicology – a state-of-the-art review

    This state-of-the-art review is based on the final report of a project carried out by the European Commission’s Joint Research Centre (JRC) for the European Chemicals Agency (ECHA). The aim of the project was to review the state of the science of non-standard methods that are available for assessing the toxicological and ecotoxicological properties of chemicals. Non-standard methods refer to alternatives to animal experiments, such as in vitro tests and computational models, as well as animal methods that are not covered by current regulatory guidelines. This report therefore reviews the current scientific status of non-standard methods for a range of human health and ecotoxicological endpoints, and provides a commentary on the mechanistic basis and regulatory applicability of these methods. For completeness, and to provide context, currently accepted (standard) methods are also summarised. In particular, the following human health endpoints are covered: a) skin irritation and corrosion; b) serious eye damage and eye irritation; c) skin sensitisation; d) acute systemic toxicity; e) repeat dose toxicity; f) genotoxicity and mutagenicity; g) carcinogenicity; h) reproductive toxicity (including effects on development and fertility); i) endocrine disruption relevant to human health; and j) toxicokinetics. In relation to ecotoxicological endpoints, the report focuses on non-standard methods for acute and chronic fish toxicity. While specific reference is made to the information needs of REACH, the Biocidal Products Regulation and the Classification, Labelling and Packaging Regulation, this review is also expected to be informative in relation to the possible use of alternative and non-standard methods in other sectors, such as cosmetics and plant protection products.

    Carcinogenicity prediction of noncongeneric chemicals by augmented top priority fragment classification

    Carcinogenicity prediction is an important process that can be performed to cut down experimental costs and save animal lives. The current reliability of the results is, however, disputed. Here, a blind exercise in carcinogenicity category assessment is performed using augmented top priority fragment classification. The procedure analyses the applicability domain of the dataset and allocates the compounds into clusters using a leading molecular fragment and a similarity measure. The exercise is applied to three compound datasets derived from the Lois Gold Carcinogenic Database. The results, showing good agreement with experimental data, are compared with published ones. The paper closes with a discussion of our viewpoint on the possibilities offered by carcinogenicity modelling of chemical compounds.
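    The procedure assigns compounds to clusters keyed by a leading (top priority) molecular fragment. The sketch below illustrates fragment-keyed clustering with RDKit substructure matching; the SMARTS patterns and their priority order are illustrative placeholders, not the fragment hierarchy used in the published method.

```python
# Hedged sketch of fragment-keyed clustering: each compound is assigned to a
# cluster according to the highest-priority structural fragment it contains.
# The fragments and example molecules are illustrative assumptions.
from rdkit import Chem

# Priority-ordered (name, SMARTS) pairs; earlier entries win.
FRAGMENTS = [
    ("aromatic amine", "c[NX3;H2,H1]"),
    ("nitroaromatic",  "c[N+](=O)[O-]"),
    ("epoxide",        "C1OC1"),
]

def top_priority_fragment(smiles):
    """Return the name of the first matching fragment, or 'unassigned'."""
    mol = Chem.MolFromSmiles(smiles)
    for name, smarts in FRAGMENTS:
        if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts)):
            return name
    return "unassigned"

clusters = {}
for s in ["Nc1ccccc1", "O=[N+]([O-])c1ccccc1", "CC1CO1", "CCO"]:
    clusters.setdefault(top_priority_fragment(s), []).append(s)
print(clusters)
```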