
    OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia

    Abstract. Background: The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, which describes predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology that unifies the terminology and the resources is critical for the rational and reliable organisation of the data and its automatic processing. Results: The following related ontologies have been developed for OpenTox: a) Toxicological ontology, listing the toxicological endpoints; b) Organs system and Effects ontology, addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology, a semi-automatic conversion of the ToxML schema; d) OpenTox ontology, a representation of the OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink, a ToxCast assays ontology; and f) OpenToxipedia, a community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, and the seamless integration of new algorithms and scientifically sound validation routines, and they provide a flexible framework which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists). Availability: The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl, and the ToxML to OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
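    The resolvable-URI pattern described above can be illustrated with a minimal sketch: each OpenTox resource URI is dereferenced with content negotiation to obtain its RDF representation. The host and compound identifier below are hypothetical placeholders, not actual OpenTox service addresses.

```python
# Minimal sketch of OpenTox-style content negotiation: a resource URI
# resolves to an RDF representation when the client asks for one.
# The URI below is a hypothetical placeholder.
from urllib.request import Request

def rdf_request(resource_uri: str) -> Request:
    # Ask the service for the RDF/XML representation of the resource.
    return Request(resource_uri, headers={"Accept": "application/rdf+xml"})

req = rdf_request("http://example.org/opentox/compound/42")
print(req.full_url, req.get_header("Accept"))
```

    Sending the request (e.g. with `urllib.request.urlopen`) would return the RDF document for the compound; the same URI with a different Accept header could yield another representation.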

    Deliverable Report D4.6 Tools for generating QMRF and QPRF reports

    Scientific reports carry significant importance for the straightforward and effective transfer of knowledge, results and ideas. Good practice dictates that reports should be well-structured and concise. This deliverable describes the reporting services for models, predictions and validation tasks that have been integrated within the eNanoMapper (eNM) modelling infrastructure. Validation services have been added to the Jaqpot Quattro (JQ) modelling platform and the nano-lazar read-across framework developed within WP4 to support eNM modelling activities. Moreover, we have proceeded with the development of reporting services for predictions and models, namely QPRF and QMRF reports, respectively. In this deliverable, we therefore first describe in detail the three validation schemes created, namely training-set split, cross-validation and external validation, and demonstrate their functionality at both API and UI levels. We then proceed with the description of the read-across functionalities and, finally, present and describe the QPRF and QMRF reporting services.
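    The three validation schemes named above can be sketched as plain functions. This is an illustrative outline of the general technique, not the actual Jaqpot Quattro implementation:

```python
# Sketch of the three validation schemes: training-set split,
# k-fold cross-validation, and external validation.
import random

def training_set_split(data, test_fraction=0.2, seed=0):
    """Shuffle and split the data into training and test sets."""
    rng = random.Random(seed)
    shuffled = list(data)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

def cross_validation_folds(data, k=5):
    """Partition the data into k disjoint folds for cross-validation."""
    items = list(data)
    return [items[i::k] for i in range(k)]

def external_validation(model, external_set):
    """Evaluate a fitted model on data never seen during training."""
    return [model(x) for x in external_set]
```

    In a real workflow the split and folds would feed a model-training routine, and external validation would use an independently collected dataset.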

    Computational toxicology using the OpenTox application programming interface and Bioclipse

    BACKGROUND: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. FINDINGS: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. CONCLUSIONS: A novel computational toxicity assessment platform was generated from the integration of two open science platforms related to toxicology: Bioclipse, which combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets through the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases as well as algorithm and model resources, with the Bioclipse workbench handling the technical layers.
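    The service interaction underlying this integration can be sketched in outline: a prediction is typically started by POSTing a dataset URI to a model URI. The URIs below are hypothetical placeholders, and the sketch only constructs the request rather than reproducing Bioclipse's actual client code:

```python
# Hedged sketch of an OpenTox-style prediction call: POST a dataset URI
# to a model URI and ask for the result location as a URI list.
# Both URIs are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import Request

def prediction_request(model_uri: str, dataset_uri: str) -> Request:
    # The dataset to predict on is passed as a form parameter.
    body = urlencode({"dataset_uri": dataset_uri}).encode()
    return Request(model_uri, data=body, method="POST",
                   headers={"Accept": "text/uri-list"})

req = prediction_request("http://example.org/model/7",
                         "http://example.org/dataset/3")
```

    Dispatching the request would return the URI of a new RDF-based result resource, which the workbench can then fetch and display.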

    Materials characterisation and software tools as key enablers in Industry 5.0 and wider acceptance of new methods and products

    Recently, the NMBP-35 Horizon 2020 projects - NanoMECommons, CHARISMA, and Easi-stress - organised a collaborative workshop to increase awareness of their contributions to the industry "commons" in terms of characterisation and digital transformation. They have established interoperability standards for knowledge management in characterisation and introduced new solutions for materials testing, aided by the standardisation of faster and more accurate assessment methods. The lessons learned from these projects and the discussions during the joint workshop emphasised the impact of recent developments and emerging needs in the field of characterisation. Specifically, the focus was on enhancing data quality through harmonisation and standardisation, as well as making advanced technologies and instruments accessible to a broader community with the goal of fostering increased trust in new products and a more skilled society. Experts also highlighted how characterisation and the corresponding experimental data can drive future innovation agendas towards technological breakthroughs. The focus of the discussion revolved around the characterisation and standardisation processes, along with the collection of modelling and characterisation tools, as well as protocols for data exchange. The broader context of materials characterisation and modelling within the materials community was explored, drawing insights from the Materials 2030 Roadmap and the experiences gained from NMBP-35 projects. This whitepaper has the objective of addressing common challenges encountered by the materials community, illuminating emerging trends and evolving techniques, and presenting the industry's perspective on emerging requirements and past success stories. It accomplishes this by providing specific examples and highlighting how these experiences can create fresh opportunities and strategies for newcomers entering the market. These advancements are anticipated to facilitate a more efficient transition from Industry 4.0 to 5.0 during the industrial revolution.

    Risk Governance of Emerging Technologies Demonstrated in Terms of its Applicability to Nanomaterials

    Nanotechnologies have reached maturity and market penetration that require nano-specific changes in legislation and harmonization among legislation domains, such as the amendments to REACH for nanomaterials (NMs) which came into force in 2020. Thus, an assessment of the components and regulatory boundaries of NMs risk governance is timely, alongside related methods and tools, as part of the global efforts to optimise nanosafety and integrate it into product design processes, via Safe(r)-by-Design (SbD) concepts. This paper provides an overview of the state of the art regarding risk governance of NMs and lays out the theoretical basis for the development and implementation of an effective, trustworthy and transparent risk governance framework for NMs. The proposed framework enables continuous integration of the evolving state of the science, leverages best practice from contiguous disciplines and facilitates responsive re-thinking of nanosafety governance to meet future needs. To achieve and operationalise such a framework, a science-based Risk Governance Council (RGC) for NMs is being developed. The framework will provide a toolkit for independent NMs risk governance and integrate the needs and views of stakeholders. An extension of this framework to relevant advanced materials and emerging technologies is also envisaged, in view of future foundations of risk research in Europe and globally.

    A computational view on nanomaterial intrinsic and extrinsic features for nanosafety and sustainability

    In recent years, an increasing number of diverse Engineered Nano-Materials (ENMs), such as nanoparticles and nanotubes, have been included in many technological applications and consumer products. The desirable and unique properties of ENMs are accompanied by potential hazards whose impacts are difficult to predict either qualitatively or in a quantitative and predictive manner. Alongside established methods for experimental and computational characterisation, physics-based modelling tools like molecular dynamics are increasingly considered in Safe and Sustainable by Design (SSbD) strategies that put user health and environmental impact at the centre of the design and development of new products. Hence, the further development of such tools can support safe and sustainable innovation and its regulation. This paper stems from a community effort and presents the outcome of a four-year-long discussion on the benefits, capabilities and limitations of adopting physics-based modelling for computing suitable features of nanomaterials that can be used for toxicity assessment of nanomaterials in combination with data-based models and experimental assessment of toxicity endpoints. We review modern multiscale physics-based models that generate advanced system-dependent (intrinsic) or time- and environment-dependent (extrinsic) descriptors/features of ENMs (primarily, but not limited to, nanoparticles, NPs), with the former being related to the bare NPs and the latter to their dynamic fingerprinting upon entering biological media. The focus is on (i) effectively representing all nanoparticle attributes for multicomponent nanomaterials, (ii) generation and inclusion of intrinsic nanoform properties, (iii) inclusion of selected extrinsic properties, and (iv) the necessity of considering distributions of advanced structural features rather than only averages. This review enables us to identify and highlight a number of key challenges associated with ENMs' data generation, curation, representation and use within machine learning or other advanced data-driven models to ultimately enhance toxicity assessment. Finally, the setting up of dedicated databases as well as the development of grouping and read-across strategies based on the mode of action of ENMs using omics methods are identified as emerging methodologies for safety assessment and reduction of animal testing.
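    Point (iv) above, distributions of features rather than only averages, can be made concrete with a toy sketch: two hypothetical nanoparticle batches with identical mean sizes but very different spreads are indistinguishable by the average alone. The sample values are invented for illustration.

```python
# Toy illustration: same mean particle size, very different distributions.
import statistics

# Hypothetical particle-size samples in nm (placeholder values).
batch_a = [48, 49, 50, 51, 52]   # narrow distribution
batch_b = [20, 35, 50, 65, 80]   # broad distribution, same mean

for name, batch in [("A", batch_a), ("B", batch_b)]:
    mean = statistics.mean(batch)
    spread = statistics.stdev(batch)
    print(f"batch {name}: mean={mean:.1f} nm, stdev={spread:.1f} nm")
```

    A model fed only the mean would treat the two batches as identical, even though their size distributions, and plausibly their biological interactions, differ substantially.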

    Harmonising knowledge for safer materials via the “NanoCommons” Knowledge Base

    In mediaeval Europe, the term "commons" described the way that communities managed land that was held "in common" and provided a clear set of rules for how this "common land" was used and developed by, and for, the community. Similarly, as we move towards an increasingly knowledge-based society where data is the new oil, new approaches to sharing and jointly owning publicly funded research data are needed to maximise its added value. Such common management approaches will extend the data's useful life and facilitate its reuse for a range of additional purposes, from modelling, to meta-analysis, to regulatory risk assessment, as examples relevant to nanosafety data. This "commons" approach to nanosafety data and nanoinformatics infrastructure provision, co-development, and maintenance is at the heart of the "NanoCommons" project and underpins its post-funding transition to providing a basis on which other initiatives and projects can build. The present paper summarises part of the NanoCommons infrastructure called the NanoCommons Knowledge Base. It provides interoperability for nanosafety data sources and tools, on both semantic and technical levels. The NanoCommons Knowledge Base connects knowledge and provides both programmatic (via an Application Programming Interface) and user-friendly graphical interfaces to enable (and democratise) access to state-of-the-art tools for nanomaterials safety prediction, NMs design for safety and sustainability, and NMs risk assessment. In addition, the standards and interfaces for interoperability, e.g., file templates to contribute data to the NanoCommons, are described, and a snapshot of the range and breadth of nanoinformatics tools and models that have already been integrated is presented. Finally, we demonstrate how the NanoCommons Knowledge Base can support users in the FAIRification of their experimental workflows and how the NanoCommons Knowledge Base itself has progressed towards richer compliance with the FAIR principles.

    Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    RIGHTS: This article is licensed under the BioMed Central licence at http://www.biomedcentral.com/about/license, which is similar to the Creative Commons Attribution Licence. In brief, you may copy, distribute, and display the work; make derivative works; or make commercial use of the work, under the following conditions: the original author must be given credit, and for any reuse or distribution it must be made clear to others what the license terms of this work are. Abstract. Background: The Blue Obelisk movement was established in 2005 as a response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results: This contribution looks back on the work carried out by the Blue Obelisk in the past 5 years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions: We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to the development of many useful resources freely available to the chemistry community. Peer reviewed.

    CheS-Mapper - Chemical Space Mapping and Visualization in 3D

    Analyzing chemical datasets is a challenging task for scientific researchers in the field of chemoinformatics. It is important, yet difficult, to understand the relationship between the structure of chemical compounds, their physico-chemical properties, and biological or toxic effects. In that respect, visualization tools can help to better comprehend the underlying correlations. Our recently developed 3D molecular viewer CheS-Mapper (Chemical Space Mapper) divides large datasets into clusters of similar compounds and then arranges them in 3D space, such that their spatial proximity reflects their similarity. The user can indirectly determine similarity by selecting which features to employ in the process. The tool can use and calculate different kinds of features, such as structural fragments as well as quantitative chemical descriptors. These features can be highlighted within CheS-Mapper, which helps the chemist to better understand patterns and regularities and to relate the observations to established scientific knowledge. As a final function, the tool can also be used to select and export specific subsets of a given dataset for further analysis.
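    The core idea, spatial proximity reflecting feature-space similarity, can be sketched with a toy example. The descriptor vectors and the Euclidean metric below are illustrative assumptions; CheS-Mapper itself supports multiple feature types and embedding methods:

```python
# Toy sketch: compounds as feature vectors; small distances in feature
# space should translate into small distances in the 3D layout.
import math

def euclidean(a, b):
    """Distance between two compounds in feature space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical descriptor vectors (e.g. logP, molecular weight / 100).
compounds = {
    "c1": (1.0, 1.8),
    "c2": (1.1, 1.9),   # similar to c1 -> should land nearby in 3D
    "c3": (4.0, 0.2),   # dissimilar -> should land far away
}

d12 = euclidean(compounds["c1"], compounds["c2"])
d13 = euclidean(compounds["c1"], compounds["c3"])
```

    A clustering step would group c1 and c2 together, and the 3D embedding would preserve the ordering of these pairwise distances.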
