
    Metadata stewardship in nanosafety research: learning from the past, preparing for an "on-the-fly" FAIR future

    Get PDF
    Introduction: Significant progress has been made in terms of best practice in research data management for nanosafety. Some of the underlying approaches to date are, however, overly focussed on the needs of specific research projects or aligned to a single data repository, and this “silo” approach is hampering their general adoption by the broader research community and individual labs. Methods: State-of-the-art data/knowledge collection, curation, management, FAIRification, and sharing solutions applied in the nanosafety field are reviewed, focusing on unique features that should be generalised and integrated into a functional FAIRification ecosystem addressing the needs of both data generators and data (re)users. Results: The development of data capture templates has focussed on standardised single-endpoint Test Guidelines, which does not reflect the complexity of real laboratory processes, where multiple assays are interlinked into an overall study, and where non-standardised assays are developed to address novel research questions and probe mechanistic processes to generate the basis for read-across from one nanomaterial to another. By focussing on the needs of data providers and data users, we identify how existing tools and approaches can be re-framed to enable “on-the-fly” (meta)data definition, data capture, curation and FAIRification that are sufficiently flexible to address the complexity of nanosafety research, yet harmonised enough to facilitate integration of datasets from different sources generated for different research purposes. By mapping the available tools for nanomaterials safety research (including nanomaterials characterisation, non-standard (mechanistic-focussed) methods, measurement principles and experimental setup, environmental fate, and requirements from new research foci such as safe and sustainable by design), a strategy for integration and bridging between silos is presented. The NanoCommons KnowledgeBase has shown how data from different sources can be integrated into a one-stop shop for searching, browsing and accessing data (without copying), and thus how to break the boundaries between data silos. Discussion: The next steps are to generalise the approach by defining a process to build consensus (meta)data standards, to develop solutions to make (meta)data more machine actionable (on-the-fly ontology development), and to establish a distributed FAIR data ecosystem maintained by the community beyond specific projects. Since other multidisciplinary domains might also struggle with data silofication, the lessons presented here may be transferable to facilitate data sharing within other communities and support harmonisation of approaches across disciplines, preparing the ground for cross-domain interoperability. Visit WorldFAIR online at http://worldfair-project.eu. WorldFAIR is funded by the EC HORIZON-WIDERA-2021-ERA-01-41 Coordination and Support Action under Grant Agreement No. 101058393.
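    The sketch below is a minimal illustration, not taken from the paper, of what an "on-the-fly", machine-actionable (meta)data record for a single nanosafety assay could look like: each field carries a placeholder ontology-term IRI so that records captured in different labs could later be integrated without copying the underlying data, and a small completeness check stands in for the kind of consensus (meta)data standards the authors call for. All field names, values, and IRIs are hypothetical.

```python
# Minimal sketch (hypothetical fields and placeholder IRIs, not the paper's schema):
# a machine-actionable (meta)data record for one nanosafety assay, with a
# completeness check applied at capture time rather than during post-hoc curation.
import json

RECORD = {
    "material": {
        "value": "TiO2 NM-105",  # example reference nanomaterial label
        "ontology_term": "https://example.org/onto/PLACEHOLDER_MATERIAL",  # placeholder IRI
    },
    "assay": {
        "value": "cell viability (MTS)",
        "ontology_term": "https://example.org/onto/PLACEHOLDER_ASSAY",  # placeholder IRI
    },
    "endpoint": {
        "value": "EC50",
        "unit": "ug/mL",
        "ontology_term": "https://example.org/onto/PLACEHOLDER_ENDPOINT",  # placeholder IRI
    },
    "provenance": {
        "lab": "example lab",
        "protocol_doi": None,  # left empty until the protocol is deposited
    },
}

# (Meta)data fields a capture template could require before a record is accepted.
REQUIRED = ["material", "assay", "endpoint", "provenance"]


def missing_metadata(record: dict) -> list[str]:
    """Return the names of required (meta)data fields that are absent or empty."""
    return [key for key in REQUIRED if not record.get(key)]


if __name__ == "__main__":
    print("missing fields:", missing_metadata(RECORD) or "none")
    print(json.dumps(RECORD, indent=2))
```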

    Evaluation of Exposure Concentrations Used in Assessing Manufactured Nanomaterial Environmental Hazards: Are They Relevant?

    No full text
    Manufactured nanomaterials (MNMs) are increasingly produced and used in consumer goods, yet our knowledge regarding their environmental risks is limited. Environmental risks are assessed by characterizing exposure levels and effects on biological receptors. As MNMs have rarely been quantified in environmental samples, our understanding of exposure levels is limited; absent direct measurements, environmental MNM concentrations are estimated from exposure modeling. Hazard, the potential for effects on biological receptors, is measured in the laboratory using a range of administered MNM concentrations. Yet concerns have been raised regarding the “relevancy” of hazard assessments, particularly when the administered MNM concentrations exceed those predicted to occur in the environment. What MNM concentrations are administered in hazard assessments, and which are “environmentally relevant”? This review examines the MNM concentrations administered in hazard assessments reported in over 600 peer-reviewed articles published between 2008 and 2013. Some administered MNM concentrations overlap with, but many diverge from, predicted environmental concentrations. Other uncertainties also influence the environmental relevance of current hazard assessments and exposure models, including test conditions, bioavailable concentrations, mode of action, MNM production volumes, and model validation. Therefore, it may be premature for MNM risk research to sanction information on the basis of concentration “environmental relevance”.
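    To make the "environmental relevance" comparison concrete, the sketch below uses made-up numbers (not values from the review) to flag which administered concentrations in a hypothetical hazard assay fall within, or exceed, a modelled predicted environmental concentration (PEC) range.

```python
# Minimal sketch with hypothetical numbers (not data from the review):
# compare administered hazard-test concentrations against a modelled
# predicted environmental concentration (PEC) range for the same material.

# Hypothetical administered concentrations from a laboratory hazard assay (mg/L).
administered_mg_per_l = [0.001, 0.01, 0.1, 1.0, 10.0, 100.0]

# Hypothetical modelled PEC range for the nanomaterial in surface water (mg/L).
pec_low, pec_high = 1e-6, 1e-3

overlapping = [c for c in administered_mg_per_l if pec_low <= c <= pec_high]
exceeding = [c for c in administered_mg_per_l if c > pec_high]

print(f"concentrations within the modelled PEC range: {overlapping}")
print(f"concentrations above the modelled PEC range:  {exceeding}")
```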

    A multi-stakeholder perspective on the use of alternative test strategies for nanomaterial safety assessment.

    No full text
    There has been a conceptual shift in toxicological studies from describing what happens to explaining how the adverse outcome occurs, enabling a deeper understanding of how biomolecular and mechanistic profiling can inform hazard identification and improve risk assessment. Compared to traditional toxicology methods, which rely heavily on animals, new approaches to generating toxicological data are becoming available for the safety assessment of chemicals, including high-throughput and high-content screening (HTS, HCS). With the emergence of nanotechnology, the exponential increase in the total number of engineered nanomaterials (ENMs) in research, development, and commercialization requires a robust scientific approach to screen ENM safety for humans and the environment rapidly and efficiently. Spurred by these developments in chemical testing, a promising new toxicological paradigm for ENMs is to use alternative test strategies (ATS), which reduce reliance on animal testing through in vitro and in silico methods such as HTS, HCS, and computational modeling. ATS also allow comparative analysis of large numbers of ENMs simultaneously and hazard assessment at various stages of the product development process and overall life cycle. Using carbon nanotubes as a case study, a workshop bringing together national and international leaders from government, industry, and academia was convened at the University of California, Los Angeles, to discuss the utility of ATS for decision-making analyses of ENMs. After lively discussions, a short list of generally shared viewpoints was generated, including the general view that ATS approaches for ENMs can significantly benefit chemical safety analysis.
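    As a small illustration of the comparative analysis such strategies enable, the sketch below ranks a few notional carbon nanotube variants by a hypothetical HTS-derived hazard summary; the material names and EC50 values are invented for illustration only, not workshop data.

```python
# Minimal sketch (hypothetical materials and EC50 values, not workshop data):
# rank ENMs by an in vitro hazard summary from high-throughput screening,
# so that many materials can be compared side by side.
hypothetical_ec50_ug_per_ml = {
    "ENM-A (long rigid CNT)": 5.0,
    "ENM-B (short tangled CNT)": 80.0,
    "ENM-C (surface-coated CNT)": 250.0,
}

# Lower EC50 = effect at lower concentration = provisionally higher concern.
ranked = sorted(hypothetical_ec50_ug_per_ml.items(), key=lambda kv: kv[1])

for rank, (name, ec50) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: EC50 ~ {ec50} ug/mL")
```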