
    Toxicokinetics of silver nanoparticles in the mealworm Tenebrio molitor exposed via soil or food

    Silver nanoparticles (AgNPs) may reach the soil compartment via sewage sludge or nanoagrochemical applications. Understanding how NPs interact with biological systems is crucial for an accurate hazard assessment. This study therefore aimed to determine the toxicokinetics of Ag in the mealworm Tenebrio molitor exposed, via Lufa 2.2 soil or via food, to different Ag forms (uncoated 50 nm AgNPs, paraffin-coated 3–8 nm AgNPs, PVP-stabilised 60 nm AgNPs, 20 nm Ag2S NPs, and ionic Ag). Mealworms were exposed for 21 days, followed by a 21-day elimination phase in clean soil or food. A one-compartment kinetics model with an inert fraction (simulating a storage compartment where detoxified forms are located) was used to describe Ag accumulation. Because the uptake route in mealworms is difficult to resolve fully, several approaches were used; these showed that food, soil and pore water are all valid uptake routes, although of differing importance. Silver uptake from soil or soil pore water was related to Ag dissolution in the pore water. In general, the uptake and elimination rate constants were similar for the 3–8 nm and 60 nm AgNPs and for AgNO3, but significantly different for the uncoated 50 nm AgNPs. Upon food exposure, uptake rate constants were similar for the 50 nm AgNPs and AgNO3, while those for the 60 nm and 3–8 nm AgNPs and for the Ag2S NPs also grouped together. NP exposure in soil was more difficult to characterise, with different patterns obtained for the different NPs, but it was evident that upon both soil and food exposure, particle characteristics strongly affected Ag bioavailability and bioaccumulation. Although Ag2S NPs were taken up, their elimination was faster than for the other Ag forms, and they showed the lowest inert fraction. The significantly different elimination rate constants suggest that the mechanism of elimination may also differ between AgNPs.
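The one-compartment model with an inert (storage) pool described above can be sketched numerically. This is a minimal illustration of the model structure, not the authors' fitted implementation; the rate constants, inert fraction and exposure concentration below are arbitrary assumed values.

```python
def simulate(k1, k2, f_inert, c_exp, t_uptake=21.0, t_total=42.0, dt=0.01):
    """Euler integration of a one-compartment model with an inert pool.
    k1: uptake rate constant; k2: elimination rate constant (active pool only);
    f_inert: fraction of uptake routed to the inert (storage) pool;
    c_exp: exposure concentration during the uptake phase."""
    active, inert, t = 0.0, 0.0, 0.0
    series = []  # (time, total body burden)
    while t <= t_total:
        c = c_exp if t <= t_uptake else 0.0  # clean soil/food after day 21
        uptake = k1 * c
        active += (uptake * (1.0 - f_inert) - k2 * active) * dt
        inert += uptake * f_inert * dt       # stored, not eliminated
        series.append((t, active + inert))
        t += dt
    return series
```

A large inert fraction shows up as a burden plateau during the elimination phase, which is how the low inert fraction of the Ag2S NPs (fast elimination) would differ from the other Ag forms.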

    Advances in De Novo Drug Design: From Conventional to Machine Learning Methods

    De novo drug design is a computational approach that generates novel molecular structures from atomic building blocks with no a priori relationships. Conventional methods include structure-based and ligand-based design, which depend on the properties of the active site of a biological target or on its known active binders, respectively. Artificial intelligence, including machine learning, is an emerging field that has positively impacted the drug discovery process. Deep reinforcement learning is a subdivision of machine learning that combines artificial neural networks with reinforcement-learning architectures. This method has successfully been employed to develop novel de novo drug design approaches using a variety of artificial networks, including recurrent neural networks, convolutional neural networks, generative adversarial networks, and autoencoders. This review article summarizes advances in de novo drug design, from conventional growth algorithms to advanced machine-learning methodologies, and highlights hot topics for further development.

    Manually curated transcriptomics data collection for toxicogenomic assessment of engineered nanomaterials

    Toxicogenomics (TGx) approaches are increasingly applied to gain insight into the possible toxicity mechanisms of engineered nanomaterials (ENMs). Omics data can be valuable for elucidating the mechanism of action of chemicals and for developing predictive models in toxicology. While vast amounts of transcriptomics data from ENM exposures have already been accumulated, a unified, easily accessible and reusable collection of transcriptomics data for ENMs is currently lacking. In an attempt to improve the FAIRness of already existing transcriptomics data for ENMs, we curated a collection of homogenized transcriptomics data from human, mouse and rat ENM exposures in vitro and in vivo, including the physicochemical characteristics of the ENMs used in each study.

    In silico assessment of nanoparticle toxicity powered by the Enalos Cloud Platform: Integrating automated machine learning and synthetic data for enhanced nanosafety evaluation

    The rapid advance of nanotechnology has led to the development and widespread application of nanomaterials, raising concerns regarding their potential adverse effects on human health and the environment. Traditional (experimental) methods for assessing nanoparticle (NP) safety are time-consuming, expensive and resource-intensive, and raise ethical concerns due to their reliance on animals. To address these challenges, we propose an in silico workflow that serves as an alternative or complementary approach to conventional hazard and risk assessment strategies and incorporates state-of-the-art computational methodologies. In this study we present an automated machine learning (autoML) scheme that employs dose-response toxicity data for silver (Ag), titanium dioxide (TiO2) and copper oxide (CuO) NPs. This model is further enriched with atomistic descriptors to capture the NPs' underlying structural properties. To overcome the issue of limited data availability, synthetic data generation techniques are used. These techniques help broaden the dataset, improving the representation of different NP classes. A key aspect of this approach is a novel three-step applicability domain method (which includes the development of a local similarity approach) that enhances user confidence in the results by evaluating the reliability of each prediction. We anticipate that this approach will significantly expedite the nanosafety assessment process, enabling regulation to keep pace with innovation, and will provide valuable insights for the design and development of safe and sustainable NPs. The ML model developed in this study is made available to the scientific community as an easy-to-use web service through the Enalos Cloud Platform (www.enaloscloud.novamechanics.com/sabydoma/safenanoscope/), facilitating broader access and collaborative advancements in nanosafety.
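An applicability domain check built on local similarity, as mentioned above, can be illustrated with a simple distance-based sketch. This is a hypothetical illustration, not the Enalos implementation: a query NP's descriptor vector is flagged in-domain when its mean distance to the k nearest training points is comparable to the leave-one-out distances seen within the training set; the values of k and the z cutoff are assumed.

```python
import math

def knn_mean_dist(point, pool, k=3):
    """Mean Euclidean distance from `point` to its k nearest neighbours in `pool`."""
    dists = sorted(math.dist(point, q) for q in pool)
    return sum(dists[:k]) / k

def in_domain(query, train, k=3, z=1.5):
    """Local-similarity applicability-domain check (illustrative):
    compare the query's k-NN distance with the leave-one-out k-NN
    distance distribution of the training set itself."""
    loo = [knn_mean_dist(train[i], train[:i] + train[i + 1:], k)
           for i in range(len(train))]
    mu = sum(loo) / len(loo)
    sd = math.sqrt(sum((d - mu) ** 2 for d in loo) / len(loo))
    return knn_mean_dist(query, train, k) <= mu + z * sd
```

Predictions for out-of-domain queries would then be reported with a warning rather than suppressed, which is typically how such reliability flags are surfaced to users.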

    The role of FAIR nanosafety data and nanoinformatics in achieving the UN sustainable development goals: the NanoCommons experience

    The increasing focus on open and FAIR (Findable, Accessible, Interoperable and Re-useable) data is driving a step-change in how research communities and governments think about data and knowledge, and about the potential for re-use of data. It has long been recognised that international data sharing is essential for regulatory harmonisation and commercialisation, for example via the Mutual Acceptance of Data (MAD) principle of the Organisation for Economic Cooperation and Development (OECD). However, despite the power of data and data-driven software to support the achievement of the United Nations Sustainable Development Goals (UN SDGs), there appears to be limited awareness of how nanomaterials environmental health and safety (nanoEHS) data can drive progress towards many of the SDGs. The goal of the NanoCommons research infrastructure project was to increase the FAIRness and impact of nanoEHS data through the development of services, including data shepherding to support researchers across the data life cycle and tools such as user-friendly nanoinformatics predictive models. We surveyed both service providers and service users on their ideas regarding how nanoEHS data might support the SDGs, and discovered a significant lack of awareness of the SDGs in general and of the potential for impact from NanoCommons tools and services. To address this gap, a workshop on the SDGs was prepared and delivered to help the NanoCommons service providers understand the SDGs and how nanosafety data and nanoinformatics can support their achievement. Following the workshop, providers were invited to update their questionnaire responses.
The results of the workshop discussions are presented, along with a summary of the 12 SDGs identified where increasingly accessible nanoEHS data will have a significant impact and the 5 that benefit indirectly, together with recommendations for EU-funded projects on how they can maximise and monitor their contributions to the SDGs.

    Metadata stewardship in nanosafety research: learning from the past, preparing for an “on-the-fly” FAIR future

    Introduction: Significant progress has been made in best practice for research data management in nanosafety. Some of the underlying approaches to date are, however, overly focussed on the needs of specific research projects or aligned to a single data repository, and this “silo” approach is hampering their general adoption by the broader research community and individual labs. Methods: State-of-the-art data/knowledge collection, curation, management, FAIRification and sharing solutions applied in the nanosafety field are reviewed, focusing on unique features that should be generalised and integrated into a functional FAIRification ecosystem addressing the needs of both data generators and data (re)users. Results: The development of data capture templates has focussed on standardised single-endpoint Test Guidelines, which does not reflect the complexity of real laboratory processes, where multiple assays are interlinked into an overall study and where non-standardised assays are developed to address novel research questions and probe mechanistic processes, generating the basis for read-across from one nanomaterial to another. By focussing on the needs of data providers and data users, we identify how existing tools and approaches can be re-framed to enable “on-the-fly” (meta)data definition, data capture, curation and FAIRification that are sufficiently flexible to address the complexity of nanosafety research, yet harmonised enough to facilitate the integration of datasets from different sources generated for different research purposes. By mapping the available tools for nanomaterials safety research (including nanomaterials characterisation, non-standard (mechanistic-focussed) methods, measurement principles and experimental setup, environmental fate, and requirements from new research foci such as safe and sustainable by design), a strategy for integration and bridging between silos is presented.
The NanoCommons KnowledgeBase has shown how data from different sources can be integrated into a one-stop shop for searching, browsing and accessing data (without copying), and thus how to break the boundaries between data silos. Discussion: The next steps are to generalise the approach by defining a process to build consensus (meta)data standards, developing solutions to make (meta)data more machine actionable (on-the-fly ontology development) and establishing a distributed FAIR data ecosystem maintained by the community beyond specific projects. Since other multidisciplinary domains may also struggle with data silo-ification, the learnings presented here may be transferable to facilitate data sharing within other communities and support harmonisation of approaches across disciplines, preparing the ground for cross-domain interoperability. This work was supported by WorldFAIR (http://worldfair-project.eu), funded by the EC HORIZON-WIDERA-2021-ERA-01-41 Coordination and Support Action under Grant Agreement No. 101058393.
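The “on-the-fly” (meta)data capture described above can be pictured as a lightweight, validating record that a lab fills in as an assay runs. The sketch below is purely illustrative; the field names are hypothetical and do not reproduce an actual NanoCommons or OECD template schema.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class AssayRecord:
    """Illustrative on-the-fly (meta)data record; field names are
    hypothetical, not an actual NanoCommons or OECD template schema."""
    material_id: str
    assay_name: str
    endpoint: str
    units: str
    instrument: str = "unspecified"
    extra: dict = field(default_factory=dict)  # free-form, assay-specific fields

    def validate(self):
        """Fail fast if a required field is empty, then emit a plain dict."""
        required = ("material_id", "assay_name", "endpoint", "units")
        missing = [k for k in required if not getattr(self, k)]
        if missing:
            raise ValueError(f"missing required metadata: {missing}")
        return asdict(self)
```

The `extra` field captures the flexibility argued for above: non-standardised assays can attach novel fields without breaking the harmonised core that downstream integration depends on.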

    Harmonising knowledge for safer materials via the “NanoCommons” Knowledge Base

    In mediaeval Europe, the term “commons” described the way that communities managed land held “in common”, with a clear set of rules for how this “common land” was used and developed by, and for, the community. Similarly, as we move towards an increasingly knowledge-based society in which data is the new oil, new approaches to sharing and jointly owning publicly funded research data are needed to maximise its added value. Such common management approaches will extend the data’s useful life and facilitate its reuse for a range of additional purposes, from modelling to meta-analysis to regulatory risk assessment, as examples relevant to nanosafety data. This “commons” approach to nanosafety data and nanoinformatics infrastructure provision, co-development and maintenance is at the heart of the “NanoCommons” project and underpins its post-funding transition to a basis on which other initiatives and projects can build. The present paper summarises the part of the NanoCommons infrastructure called the NanoCommons Knowledge Base, which provides interoperability for nanosafety data sources and tools on both the semantic and technical levels. The NanoCommons Knowledge Base connects knowledge and provides both programmatic access (via an Application Programming Interface) and a user-friendly graphical interface to enable (and democratise) access to state-of-the-art tools for nanomaterials safety prediction, NM design for safety and sustainability, and NM risk assessment.
In addition, the standards and interfaces for interoperability, e.g., file templates for contributing data to the NanoCommons, are described, and a snapshot of the range and breadth of nanoinformatics tools and models that have already been integrated is presented. Finally, we demonstrate how the NanoCommons Knowledge Base can support users in the FAIRification of their experimental workflows, and how the Knowledge Base itself has progressed towards richer compliance with the FAIR principles.

    Synthesis and characterization of Zr- and Hf-doped nano-TiO2 as internal standards for analytical quantification of nanomaterials in complex matrices

    The reliable quantification of nanomaterials (NMs) in complex matrices such as food, cosmetics and biological and environmental compartments can be challenging due to interactions with matrix components and analytical equipment (vials and tubing). The resulting losses along the analytical process (sampling, extraction, clean-up, separation and detection) hamper the quantification of the target NMs in these matrices, as well as the comparability of results and meaningful interpretation in safety assessments. These issues can be overcome by adding known amounts of internal/recovery standards to the sample prior to analysis. Such standards need to replicate the behaviour of the target analytes in the analytical process, which is mainly defined by their surface properties; they also need to carry a tag that can be quantified independently of the target analyte. As inductively coupled plasma mass spectrometry is used for the identification and quantification of NMs, doping with isotopes, target analytes or chemically related rare elements is a promising approach. We present the synthesis of a library of TiO2 NMs doped with hafnium (Hf) and zirconium (Zr), both low in environmental abundance. Zirconia NMs doped with Hf were also synthesized to complement the library. The NMs were synthesized with morphological and size properties similar to commercially available TiO2. Characterization included transmission electron microscopy coupled with energy-dispersive X-ray spectroscopy, X-ray diffraction spectroscopy, Brunauer–Emmett–Teller total specific surface area analysis, cryofixation scanning electron microscopy, inductively coupled plasma optical emission spectroscopy, and UV–visible spectrometry. The Ti:Hf and Ti:Zr ratios were verified and calculated using Rietveld refinement.
The labelled NMs can serve as internal standards to track the extraction efficiency from complex matrices, and to increase method robustness and the traceability of characterization/quantification.
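The recovery-standard use case above reduces to a simple correction: losses measured on the doped internal standard are assumed to apply equally to the target NMs. The function below is an illustrative sketch of that arithmetic, not a published protocol; the plausibility bounds are assumed values.

```python
def recovery_corrected(conc_measured, spike_added, spike_recovered):
    """Correct a measured NM concentration for process losses, assuming the
    doped internal standard (e.g. an Hf- or Zr-labelled TiO2 NM) is lost to
    the same extent as the target NMs during sample preparation."""
    if spike_added <= 0:
        raise ValueError("spike_added must be positive")
    recovery = spike_recovered / spike_added
    if not 0.0 < recovery <= 1.5:  # flag implausible recoveries (assumed bounds)
        raise ValueError(f"implausible recovery: {recovery:.2f}")
    return conc_measured / recovery
```

For example, if only 80% of the doped spike survives extraction, a measured target concentration is scaled up by 1/0.8 to estimate the true concentration in the original sample.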