27 research outputs found

    Nanocuration workflows: Establishing best practices for identifying, inputting, and sharing data to inform decisions on nanomaterials

    There is a critical opportunity in the field of nanoscience to compare and integrate information across diverse fields of study through informatics (i.e., nanoinformatics). This paper is one in a series of articles on the data curation process in nanoinformatics (nanocuration). Other articles in this series discuss key aspects of nanocuration (temporal metadata, data completeness, database integration), while the focus of this article is on the nanocuration workflow, or the process of identifying, inputting, and reviewing nanomaterial data in a data repository. In particular, the article discusses: 1) the rationale and importance of a defined workflow in nanocuration, 2) the influence of organizational goals or purpose on the workflow, 3) established workflow practices in other fields, 4) current workflow practices in nanocuration, 5) key challenges for workflows in emerging fields like nanomaterials, 6) examples to make these challenges more tangible, and 7) recommendations to address the identified challenges. Throughout the article, there is an emphasis on illustrating key concepts and current practices in the field. Data on current practices in the field are from a group of stakeholders active in nanocuration. In general, the development of workflows for nanocuration is nascent, with few individuals formally trained in data curation or utilizing available nanocuration resources (e.g., ISA-TAB-Nano). Additional emphasis on the potential benefits of cultivating nanomaterial data via nanocuration processes (e.g., capability to analyze data from across research groups) and providing nanocuration resources (e.g., training) will likely prove crucial for the wider application of nanocuration workflows in the scientific community.

    How should the completeness and quality of curated nanomaterial data be evaluated?

    Nanotechnology is of increasing significance. Curation of nanomaterial data into electronic databases offers opportunities to better understand and predict nanomaterials’ behaviour. This supports innovation in, and regulation of, nanotechnology. It is commonly understood that curated data need to be sufficiently complete and of sufficient quality to serve their intended purpose. However, assessing data completeness and quality is non-trivial in general and is arguably especially difficult in the nanoscience area, given its highly multidisciplinary nature. The current article, part of the Nanomaterial Data Curation Initiative series, addresses how to assess the completeness and quality of (curated) nanomaterial data. In order to address this key challenge, a variety of related issues are discussed: the meaning and importance of data completeness and quality, existing approaches to their assessment, and the key challenges associated with evaluating the completeness and quality of curated nanomaterial data. Considerations which are specific to the nanoscience area and lessons which can be learned from other relevant scientific disciplines are considered. Hence, the scope of this discussion ranges from physicochemical characterisation requirements for nanomaterials and interference of nanomaterials with nanotoxicology assays to broader issues such as minimum information checklists, toxicology data quality schemes and computational approaches that facilitate evaluation of the completeness and quality of (curated) data. This discussion is informed by a literature review and a survey of key nanomaterial data curation stakeholders. Finally, drawing upon this discussion, recommendations are presented concerning the central question: how should the completeness and quality of curated nanomaterial data be evaluated?
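
    The minimum-information checklists and computational completeness checks mentioned in this abstract can be illustrated with a minimal sketch. The field names and the scoring rule below are invented for illustration and do not correspond to any specific published checklist:

```python
# Hypothetical completeness score for a curated nanomaterial record,
# computed as the fraction of checklist fields that are present and
# non-empty. The checklist itself is an illustrative assumption.

REQUIRED_FIELDS = [
    "material_type", "core_composition", "particle_size_nm",
    "size_distribution", "surface_charge_mV", "surface_coating",
    "dispersion_medium", "assay_protocol",
]

def completeness_score(record: dict) -> float:
    """Fraction of required checklist fields that are present and non-empty."""
    present = [f for f in REQUIRED_FIELDS if record.get(f) not in (None, "", [])]
    return len(present) / len(REQUIRED_FIELDS)

record = {
    "material_type": "nanoparticle",
    "core_composition": "TiO2",
    "particle_size_nm": 25,
    "surface_coating": "",          # reported but empty -> counts as missing
    "dispersion_medium": "water",
}
print(f"completeness: {completeness_score(record):.2f}")  # 4 of 8 fields -> 0.50
```

    A real scheme would additionally weight fields by importance and score quality (e.g., method reliability) separately from mere presence.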

    Framing and Assessing Environmental Risks of Nanomaterials

    No full text
    Nanomaterials are being increasingly produced and used across a myriad of applications while their novel properties are still in the midst of being designed and explored. Thus the full implications of introducing these materials into the environment cannot be understood, yet the need to assess potential risks is already upon us. This work discusses a comprehensive view of environmental impact with respect to material flows from across the value chain into all compartments of the environment, whereby interactions and potential hazardous effects become possible. A subset of this broad system is then chosen for evaluation; a model is derived to describe the fate of nanomaterials released to wastewater.

    This analysis considers the wastewater treatment plant (WWTP) as a complete-mix aerobic reactor with a secondary clarifier, and predicts whether nanomaterials will associate with effluent or sludge to project potential concentrations in each. The concentration of nanomaterials reaching a WWTP is estimated based on a linear weighting of total production, and the fate of nanomaterials within the WWTP is based on a characteristic inherent to the material, the partition coefficient, and on design parameters of the WWTP, such as retention times and suspended solids concentration.

    Due to the uncertainty inherent to this problem, a probabilistic approach is employed. Monte Carlo simulation is used, sampling from probability distributions assigned to each of the input parameters to calculate a distribution for the predicted concentrations in sludge and effluent. Input parameter distributions are estimated from values reported in the literature where possible. Where data do not yet exist, studies are carried out to enable parameter estimation. In particular, nanomaterial production is investigated to provide a basis to estimate the magnitude of potential exposure. Nanomaterial partitioning behavior is also studied in this work, through laboratory experiments for several types of nano-silver.

    The results presented here illustrate the use of nanomaterial inventory data in predicting environmentally relevant concentrations. Estimates of effluent and sludge concentrations for nano-silver with four different types of coatings suggest that these surface treatments affect the removal efficiency; the same nanomaterial with different coatings may have different environmental fates. Effluent concentration estimates for C60 and nano-TiO2 suggest that these nanomaterials could already be present at problematic concentrations at current levels of annual production.

    Estimates of environmentally relevant concentrations may aid in interpretation of nanotoxicology studies. These relative estimates are also useful in that they may help inform future decisions regarding where to dedicate resources for future research. Beyond attempting to estimate environmental concentrations of nanomaterials, this type of streamlined model allows the consideration of scenarios, focusing on what happens as various input parameters change. Production quantity and the fraction of this quantity that is released to wastewater are found to greatly influence the model estimates for wastewater effluent concentrations; in the case of wastewater sludge concentrations, the model is sensitive to those parameters in addition to solids retention time.

    Dissertation
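
    The probabilistic fate estimate described in this abstract can be sketched as a minimal Monte Carlo loop: sample production, release fraction, partition coefficient, and solids concentration from distributions, partition the influent load between sludge and effluent, and summarize the resulting effluent-concentration distribution. All distributions, parameter values, and the assumed treated-wastewater volume below are illustrative placeholders, not values from the dissertation:

```python
# Illustrative Monte Carlo sketch of WWTP nanomaterial partitioning.
# All numerical assumptions are placeholders for demonstration only.
import random
import statistics

N = 10_000
effluent_ug_per_L = []

for _ in range(N):
    production_t_per_yr = random.lognormvariate(3.0, 1.0)   # annual production, tonnes (assumed)
    release_fraction    = random.uniform(0.01, 0.10)        # fraction reaching wastewater (assumed)
    kd_L_per_kg         = random.lognormvariate(10.0, 1.0)  # solids-water partition coefficient (assumed)
    tss_kg_per_L        = random.uniform(1e-4, 4e-4)        # suspended solids in reactor (assumed)

    # Fraction sorbed to solids at equilibrium: Kd*TSS / (1 + Kd*TSS)
    f_sludge = kd_L_per_kg * tss_kg_per_L / (1 + kd_L_per_kg * tss_kg_per_L)

    # Influent concentration from released mass over an assumed annual treated flow
    flow_L_per_yr = 3.0e13                                  # treated-wastewater volume (assumed)
    influent_ug_L = production_t_per_yr * 1e12 * release_fraction / flow_L_per_yr

    effluent_ug_per_L.append(influent_ug_L * (1 - f_sludge))

print(f"median effluent conc: {statistics.median(effluent_ug_per_L):.3g} ug/L")
print(f"95th percentile:      {statistics.quantiles(effluent_ug_per_L, n=20)[-1]:.3g} ug/L")
```

    Sensitivity analysis of the kind described above amounts to re-running the loop while holding one input distribution fixed and observing how the output percentiles shift.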

    A functional assay-based strategy for nanomaterial risk forecasting

    No full text
    The study of nanomaterial impacts on environment, health and safety (nanoEHS) has been largely predicated on the assumption that exposure and hazard can be predicted from physical–chemical properties of nanomaterials. This approach is rooted in the view that nano-objects essentially resemble chemicals with additional particle-based attributes that must be included among their intrinsic physical–chemical descriptors. With the exception of the trivial case of nanomaterials made from toxic or highly reactive materials, this approach has yielded few actionable guidelines for predicting nanomaterial risk. This article addresses inherent problems in structuring a nanoEHS research strategy based on the goal of predicting outcomes directly from nanomaterial properties, and proposes a framework for organizing data and designing integrated experiments based on functional assays (FAs). FAs are intermediary, semi-empirical measures of processes or functions within a specified system that bridge the gap between nanomaterial properties and potential outcomes in complex systems. The three components of a functional assay are standardized protocols for parameter determination and reporting, a theoretical context for parameter application, and reference systems. We propose the identification and adoption of reference systems where FAs may be applied to provide parameter estimates for environmental fate and effects models, as well as benchmarks for comparing the results of FAs and experiments conducted in more complex and varied systems. Surface affinity and dissolution rate are identified as two critical FAs for characterizing nanomaterial behavior in a variety of important systems. The use of these FAs to predict bioaccumulation and toxicity for initial and aged nanomaterials is illustrated for the case of silver nanoparticles and Caenorhabditis elegans.

    The Nanomaterial Data Curation Initiative: A collaborative approach to assessing, evaluating, and advancing the state of the field

    No full text
    The Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Program Nanotechnology Working Group (NCIP NanoWG), explores the critical aspect of data curation within the development of informatics approaches to understanding nanomaterial behavior. Data repositories and tools for integrating and interrogating complex nanomaterial datasets are gaining widespread interest, with multiple projects now appearing in the US and the EU. Even in these early stages of development, a single common aspect shared across all nanoinformatics resources is that data must be curated into them. Through exploration of sub-topics related to all activities necessary to enable, execute, and improve the curation process, the NDCI will provide a substantive analysis of nanomaterial data curation itself, as well as a platform for multiple other important discussions to advance the field of nanoinformatics. This article outlines the NDCI project and lays the foundation for a series of papers on nanomaterial data curation. The NDCI purpose is to: 1) present and evaluate the current state of nanomaterial data curation across the field on multiple specific data curation topics, 2) propose ways to leverage and advance progress for both individual efforts and the nanomaterial data community as a whole, and 3) provide opportunities for similar publication series on the details of the interactive needs and workflows of data customers, data creators, and data analysts. Initial responses from stakeholder liaisons throughout the nanoinformatics community reveal a shared view that it will be critical to focus on integration of datasets with specific orientation toward the purposes for which the individual resources were created, as well as the purpose for integrating multiple resources. 
Early acknowledgement and undertaking of complex topics such as uncertainty, reproducibility, and interoperability are proposed as an important path to addressing key challenges within the nanomaterial community, such as reducing collateral negative impacts and decreasing the time from development to market for this new class of technologies.

    Knowledge and Instance Mapping: architecture for premeditated interoperability of disparate data for materials

    No full text
    Predicting and elucidating the impacts of materials on human health and the environment is an unending task that has taken on special significance in the context of nanomaterials research over the last two decades. The properties of materials in environmental and physiological media are dynamic, reflecting the complex interactions between materials and these media. This dynamic behavior requires special consideration in the design of databases and data curation that allow for subsequent comparability and interrogation of the data from potentially diverse sources. We present two data processing methods that can be integrated into the experimental process to encourage premeditated interoperability of disparate material data: Knowledge Mapping and Instance Mapping. Originally developed as a framework for the NanoInformatics Knowledge Commons (NIKC) database, this architecture and associated methods can be used independently of the NIKC and applied across multiple subfields of nanotechnology and material science.