    AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by (i) an information model, based on a common OWL-DL ontology; (ii) links to related ontologies; and (iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or to initiate the associated calculations.
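    A minimal sketch of the REST pattern described above, assuming a generic HTTP client: the RDF representation of a resource is retrieved via content negotiation, and a calculation is initiated by POSTing to a model URI. The host name, resource IDs and parameter name below are hypothetical placeholders, not the actual AMBIT endpoints.

```python
# Sketch of the OpenTox REST pattern: every resource has a URI, and
# content negotiation selects the representation. Host and IDs are
# illustrative placeholders, not live addresses.
import requests

BASE = "https://ambit.example.org"  # hypothetical AMBIT service root

# Retrieve the RDF representation of a dataset resource.
resp = requests.get(f"{BASE}/dataset/1",
                    headers={"Accept": "application/rdf+xml"})
resp.raise_for_status()
rdf_xml = resp.text  # RDF/XML document describing the dataset

# Initiate a calculation by POSTing a dataset URI to a model resource;
# OpenTox-style services typically return a task URI to poll for results.
resp = requests.post(f"{BASE}/model/42",
                     data={"dataset_uri": f"{BASE}/dataset/1"},  # assumed parameter name
                     headers={"Accept": "text/uri-list"})
task_uri = resp.text.strip()
```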

    Deliverable Report D4.6 Tools for generating QMRF and QPRF reports

    Scientific reports are important for the straightforward and effective transfer of knowledge, results and ideas. Good practice dictates that reports should be well structured and concise. This deliverable describes the reporting services for models, predictions and validation tasks that have been integrated within the eNanoMapper (eNM) modelling infrastructure. Validation services have been added to the Jaqpot Quattro (JQ) modelling platform and the nano-lazar read-across framework developed within WP4 to support eNM modelling activities. We have also developed reporting services for predictions and models, producing QPRF and QMRF reports respectively. In this deliverable, we first describe in detail the three validation schemes created, namely training-set split, cross-validation and external validation, and demonstrate their functionality at both the API and UI levels. We then describe the read-across functionalities and, finally, present the QPRF and QMRF reporting services.
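    For illustration, the sketch below reproduces the three named validation schemes with generic scikit-learn tooling on synthetic data; the eNanoMapper/Jaqpot services expose these schemes through their own REST API and UI rather than through code like this.

```python
# Generic sketch of the three validation schemes named in the deliverable:
# training-set split, cross-validation, and external validation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split, cross_val_score

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
X_ext, y_ext = rng.normal(size=(20, 5)), rng.normal(size=20)  # held-out external set

# 1) Training-set split: fit on one part, score on the held-out rest.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
split_score = LinearRegression().fit(X_tr, y_tr).score(X_te, y_te)

# 2) Cross-validation: average score over k folds.
cv_scores = cross_val_score(LinearRegression(), X, y, cv=5)

# 3) External validation: score on data never seen during modelling.
ext_score = LinearRegression().fit(X, y).score(X_ext, y_ext)
```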

    Materials characterisation and software tools as key enablers in Industry 5.0 and wider acceptance of new methods and products

    Recently, the NMBP-35 Horizon 2020 projects (NanoMECommons, CHARISMA, and Easi-stress) organised a collaborative workshop to increase awareness of their contributions to the industry "commons" in terms of characterisation and digital transformation. They have established interoperability standards for knowledge management in characterisation and introduced new solutions for materials testing, aided by the standardisation of faster and more accurate assessment methods. The lessons learned from these projects and the discussions during the joint workshop emphasised the impact of recent developments and emerging needs in the field of characterisation. Specifically, the focus was on enhancing data quality through harmonisation and standardisation, as well as making advanced technologies and instruments accessible to a broader community, with the goal of fostering increased trust in new products and a more skilled society. Experts also highlighted how characterisation and the corresponding experimental data can drive future innovation agendas towards technological breakthroughs. The discussion revolved around the characterisation and standardisation processes, along with the collection of modelling and characterisation tools, as well as protocols for data exchange. The broader context of materials characterisation and modelling within the materials community was explored, drawing insights from the Materials 2030 Roadmap and the experiences gained from NMBP-35 projects. This whitepaper aims to address common challenges encountered by the materials community, illuminate emerging trends and evolving techniques, and present the industry's perspective on emerging requirements and past success stories. It accomplishes this by providing specific examples and highlighting how these experiences can create fresh opportunities and strategies for newcomers entering the market. These advancements are anticipated to facilitate a more efficient transition from Industry 4.0 to 5.0 in the ongoing industrial revolution.

    Computational toxicology using the OpenTox application programming interface and Bioclipse

    BACKGROUND: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. FINDINGS: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating the toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. CONCLUSIONS: A novel computational toxicity assessment platform was generated from the integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integrating diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets through the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, as well as algorithm and model resources, with the Bioclipse workbench handling the technical layers.
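    The hedged sketch below outlines the kind of workflow Bioclipse automates against OpenTox services: resolving a query structure to a compound resource, applying a toxicity model, and polling the resulting task. The host, paths, parameter names and the SMILES lookup step are illustrative assumptions, not the exact Bioclipse or OpenTox calls.

```python
# Assumed end-to-end flow for predicting toxicity of a query molecule
# against OpenTox-style services. All endpoints here are hypothetical.
import time
import requests

SERVICE = "https://opentox.example.org"  # hypothetical OpenTox host
caffeine = "CN1C=NC2=C1C(=O)N(C)C(=O)N2C"  # query molecule as SMILES

# Ask the service for (or to create) a compound resource for this structure.
compound_uri = requests.post(f"{SERVICE}/compound",
                             data={"smiles": caffeine},  # assumed parameter name
                             headers={"Accept": "text/uri-list"}).text.strip()

# Apply a predictive model; such services typically return a task to poll.
task_uri = requests.post(f"{SERVICE}/model/toxicity-1",
                         data={"compound_uri": compound_uri},
                         headers={"Accept": "text/uri-list"}).text.strip()
while True:
    task = requests.get(task_uri, headers={"Accept": "text/uri-list"})
    if task.status_code != 202:  # 202 = task still running
        break
    time.sleep(1)
result_uri = task.text.strip()  # dataset holding the predicted endpoint
```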

    CheS-Mapper - Chemical Space Mapping and Visualization in 3D

    Analyzing chemical datasets is a challenging task for scientific researchers in the field of chemoinformatics. It is important, yet difficult, to understand the relationship between the structure of chemical compounds, their physico-chemical properties, and their biological or toxic effects. In that respect, visualization tools can help researchers better comprehend the underlying correlations. Our recently developed 3D molecular viewer CheS-Mapper (Chemical Space Mapper) divides large datasets into clusters of similar compounds and arranges them in 3D space such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. These features can be highlighted within CheS-Mapper, which helps the chemist to better understand patterns and regularities and to relate the observations to established scientific knowledge. Finally, the tool can also be used to select and export specific subsets of a given dataset for further analysis.
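    As a rough illustration of the two-step idea (cluster similar compounds, then embed them in 3D so that proximity reflects similarity), the sketch below uses generic scikit-learn components on synthetic descriptors; CheS-Mapper computes real chemical features and uses its own embedding.

```python
# Sketch of the cluster-then-embed idea behind a chemical space map.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
features = rng.normal(size=(60, 10))  # stand-in for per-compound descriptors

# Group similar compounds into clusters.
labels = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(features)

# Embed the feature space into 3D coordinates for visualisation; nearby
# points correspond to compounds with similar descriptor values.
coords_3d = PCA(n_components=3).fit_transform(features)
```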

    A computational view on nanomaterial intrinsic and extrinsic features for nanosafety and sustainability

    In recent years, an increasing number of diverse Engineered Nanomaterials (ENMs), such as nanoparticles and nanotubes, have been included in many technological applications and consumer products. The desirable and unique properties of ENMs are accompanied by potential hazards whose impacts are difficult to predict either qualitatively or in a quantitative, predictive manner. Alongside established methods for experimental and computational characterisation, physics-based modelling tools such as molecular dynamics are increasingly considered in Safe and Sustainable by Design (SSbD) strategies that put user health and environmental impact at the centre of the design and development of new products. Hence, the further development of such tools can support safe and sustainable innovation and its regulation. This paper stems from a community effort and presents the outcome of a four-year-long discussion on the benefits, capabilities and limitations of adopting physics-based modelling for computing suitable features of nanomaterials that can be used for toxicity assessment in combination with data-based models and experimental assessment of toxicity endpoints. We review modern multiscale physics-based models that generate advanced system-dependent (intrinsic) or time- and environment-dependent (extrinsic) descriptors/features of ENMs (primarily, but not limited to, nanoparticles, NPs), with the former relating to the bare NPs and the latter to their dynamic fingerprinting upon entering biological media. The focus is on (i) effectively representing all nanoparticle attributes for multicomponent nanomaterials, (ii) generation and inclusion of intrinsic nanoform properties, (iii) inclusion of selected extrinsic properties, and (iv) the necessity of considering distributions of advanced structural features rather than only averages. This review enables us to identify and highlight a number of key challenges associated with ENMs' data generation, curation, representation and use within machine learning and other advanced data-driven models to ultimately enhance toxicity assessment. Finally, the setting up of dedicated databases, as well as the development of grouping and read-across strategies based on the mode of action of ENMs using omics methods, are identified as emerging methodologies for safety assessment and the reduction of animal testing.
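    Point (iv) can be made concrete with a small sketch: two synthetic particle batches share the same mean of a feature, so an average-only descriptor cannot distinguish them, while quantiles of the distribution can. The values below are synthetic stand-ins, not output of a physics-based model.

```python
# Why distributions beat averages as nanomaterial descriptors: two batches
# with identical means but different spreads. Values are synthetic.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic per-particle feature, e.g. a diameter-like quantity in nm.
batch_a = rng.normal(loc=20.0, scale=1.0, size=1000)
batch_b = rng.normal(loc=20.0, scale=6.0, size=1000)

for name, batch in (("A", batch_a), ("B", batch_b)):
    q10, q50, q90 = np.percentile(batch, [10, 50, 90])
    # The means coincide, but the quantiles expose the difference that an
    # average-only descriptor would hide.
    print(f"batch {name}: mean={batch.mean():.1f}  q10={q10:.1f}  q90={q90:.1f}")
```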

    Risk Governance of Emerging Technologies Demonstrated in Terms of its Applicability to Nanomaterials

    Nanotechnologies have reached a maturity and market penetration that require nano-specific changes in legislation and harmonisation among legislation domains, such as the amendments to REACH for nanomaterials (NMs), which came into force in 2020. Thus, an assessment of the components and regulatory boundaries of NM risk governance is timely, alongside related methods and tools, as part of the global efforts to optimise nanosafety and integrate it into product design processes via Safe(r)-by-Design (SbD) concepts. This paper provides an overview of the state of the art regarding risk governance of NMs and lays out the theoretical basis for the development and implementation of an effective, trustworthy and transparent risk governance framework for NMs. The proposed framework enables continuous integration of the evolving state of the science, leverages best practice from contiguous disciplines and facilitates responsive rethinking of nanosafety governance to meet future needs. To achieve and operationalise such a framework, a science-based Risk Governance Council (RGC) for NMs is being developed. The framework will provide a toolkit for independent risk governance of NMs and will integrate the needs and views of stakeholders. An extension of this framework to relevant advanced materials and emerging technologies is also envisaged, in view of future foundations of risk research in Europe and globally.

    Perspectives from the NanoSafety Modelling Cluster on the validation criteria for (Q)SAR models used in nanotechnology

    Nanotechnology and the production of nanomaterials have expanded rapidly in recent years. Since many types of engineered nanoparticles are suspected to be toxic to living organisms and to have a negative impact on the environment, the process of designing new nanoparticles and their applications must be accompanied by a thorough exposure risk analysis. (Quantitative) Structure-Activity Relationship ([Q]SAR) modelling offers promising options among the available methods for risk assessment. These in silico models can be used to predict a variety of properties, including the toxicity of newly designed nanoparticles. However, (Q)SAR models must be appropriately validated to ensure the clarity, consistency and reliability of their predictions. This paper is a joint initiative from recently completed European research projects focused on developing (Q)SAR methodology for nanomaterials. The aim was to interpret and expand the guidance of the well-known “OECD Principles for the Validation, for Regulatory Purposes, of (Q)SAR Models” with reference to nano-(Q)SAR, and to present our opinions on the criteria to be fulfilled for models developed for nanoparticles.
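    As a toy illustration of the OECD principle calling for appropriate measures of goodness-of-fit, robustness and predictivity, the sketch below computes the three corresponding regression metrics on synthetic data; it is not code from the projects, and real nano-(Q)SAR validation also covers endpoint definition, algorithm transparency and the applicability domain.

```python
# Goodness-of-fit, robustness and external predictivity for a toy
# (Q)SAR-style regression model on synthetic descriptor data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
coef = np.array([1.0, -0.5, 0.3, 0.0])
X = rng.normal(size=(80, 4))            # stand-in nanoparticle descriptors
y = X @ coef + rng.normal(scale=0.3, size=80)
X_ext = rng.normal(size=(20, 4))        # external set, never used in training
y_ext = X_ext @ coef + rng.normal(scale=0.3, size=20)

model = LinearRegression().fit(X, y)
r2_train = r2_score(y, model.predict(X))            # goodness-of-fit
q2_cv = cross_val_score(model, X, y, cv=5).mean()   # robustness (internal validation)
r2_ext = r2_score(y_ext, model.predict(X_ext))      # external predictivity
rmse_ext = mean_squared_error(y_ext, model.predict(X_ext)) ** 0.5
```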