266 research outputs found

    A nanoinformatics decision support tool for the virtual screening of gold nanoparticle cellular association using protein corona fingerprints

    The increasing use of nanoparticles (NPs) in a wide range of consumer and industrial applications has necessitated significant effort to characterize and quantify the underlying nanostructure–biological response relationships, to ensure that these novel materials can be exploited responsibly and safely. Such efforts demand reliable experimental data, not only on the biological dose-response but also on the physicochemical properties of the NPs and their interaction with the biological environment. The latter has not been extensively studied: a large surface area available to bind biological macromolecules is a unique feature of NPs that is not relevant for conventional chemicals or pharmaceuticals, and thus only limited data have been reported in the literature quantifying the protein corona formed when NPs interact with a biological medium and linking it to NP cellular association/uptake. In this work we report the development of a predictive model for the assessment of the biological response (cellular association, which can include both internalized NPs and those attached to the cell surface) of surface-modified gold NPs, based on their physicochemical properties and protein corona fingerprints, utilizing a dataset of 105 unique NPs. Cellular association was chosen as the end-point for the original experimental study due to its relevance to inflammatory responses, biodistribution, and toxicity in vivo. The validated predictive model is freely available online through the Enalos Cloud Platform (http://enalos.insilicotox.com/NanoProteinCorona/) to be used as part of a regulatory or NP safe-by-design decision support system. This online tool allows the virtual screening of NPs, based on a list of the significant NP descriptors, identifying those NPs that warrant further toxicity testing on the basis of predicted NP cellular association.
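The workflow described above maps physicochemical descriptors and protein corona fingerprints onto a measured cellular-association end-point. A minimal sketch of that general idea, assuming a distance-weighted k-nearest-neighbour regression over illustrative feature vectors (the feature values, target values, and choice of kNN are assumptions for illustration, not the published model):

```python
import math

# Hypothetical training data: each NP is a feature vector of
# physicochemical descriptors plus protein corona fingerprint
# abundances (values are illustrative, not from the study).
train_X = [
    [0.62, 0.11, 0.27],
    [0.55, 0.30, 0.15],
    [0.20, 0.45, 0.35],
    [0.70, 0.05, 0.25],
]
train_y = [3.1, 2.4, 1.2, 3.6]  # illustrative log cellular association

def predict_association(query, k=3):
    """Distance-weighted k-nearest-neighbour regression."""
    dists = [math.dist(query, x) for x in train_X]
    nearest = sorted(range(len(train_X)), key=lambda i: dists[i])[:k]
    weights = [1.0 / (dists[i] + 1e-9) for i in nearest]
    total = sum(weights)
    return sum(w * train_y[i] for w, i in zip(weights, nearest)) / total

print(predict_association([0.60, 0.12, 0.28]))
```

Because the prediction is a weighted average over neighbours, it always stays within the range of the training responses, which is one reason neighbour-based schemes are popular for screening new NPs against an existing dataset.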

    D10.6 - Final Version of NanoCommons Sustainability Plan

    NanoCommons was funded as an infrastructure project for a starting community. This means that it was supposed to build the concepts and foundation on which the community can continue to build solutions and services; in the case of NanoCommons, the infrastructure goal was to address the starting community’s data and nanoinformatics needs. NanoCommons did not start entirely from scratch, as it was building on efforts of the Nanosafety Cluster’s Working Group F on data management, and benefited from a general appreciation of the value of data reuse and computational predictions in the community. The push towards increasing use of chemoinformatics and nanoinformatics approaches was also endorsed by the public, regulatory and funding agencies, including being accelerated by the European ban on animal testing in the cosmetics industry and the European Green Deal. Similarly, industry is increasingly acting as a driver: fostering implementation and adoption of data harmonisation, FAIRness (Findability, Accessibility, Interoperability and Reusability of data) and openness and recognising that these activities require targeted and centralised efforts, which were provided by NanoCommons. However, a starting community is just that: a start upon which the community can build, a coalescence point around which collective efforts can nucleate. Our journey is still at the earliest stages, and much is needed in terms of automation, tooling, and continued training and education to drive the mindset changes within the community to fully embed data management at the start of the data lifecycle. 
Sustained and continuous support will be needed to achieve sufficient levels of digitalisation, global adoption of reporting standards in both scientific and regulatory settings, and machine-readable and machine-actionable data, all of which will lead to better-quality, reproducible research and more trust in the data, as well as understanding of its applicability and suitability for reuse, thus enhancing the value of the data and knowledge generated. This starts with sustaining what we already have: the NanoCommons Knowledge Infrastructure, the implemented services from NanoCommons and other associated partners and projects, and the collaborations established with projects beyond the borders of nanosafety research. Sustainability can be described as “the ability to be maintained at a certain rate or level”. Applied to NanoCommons, this means that the services, tools and materials designed and developed during the project, and already being offered to support the nanosafety community, will continue to be maintained and ideally further developed beyond the end of the funded period, ensuring future accessibility for users and potential customers. Since there will be no further direct public funding for these services (pending further applications, via Horizon Europe for example), planning for sustainability and the creation of a (not necessarily commercial) business model started very early in the project as a central task of WP10, and possible options were continuously evaluated and adapted based on stakeholder feedback from surveys and, more importantly, from users of the starting infrastructure services and expertise who received support in the form of Transnational Access (TA) projects or as part of the Demonstration Cases (see deliverable reports D9.3 and D9.4 for details of the first and second round Demonstration Cases, respectively).
Deliverable D10.6 presented here builds on the previous deliverables D10.4 “First Testing and Evaluation Results of NanoCommons Sustainability Plan” and D10.5 “Second Testing and Evaluation Results on the NanoCommons Sustainability Plan”, which proposed the first version of the business model and analysed all project activities related to sustainability during the last period, respectively. Together, these three reports outline the considerations and activities undertaken to ensure the sustained existence and utilisation of the NanoCommons project outcomes beyond the project lifetime. A major NanoCommons objective has been to achieve a sustainable and open knowledge infrastructure for the whole nanosafety community, and thus considerable effort was invested in exploring options and approaches, focussing on business models consistent with the ethos of openness and accessibility, given the public funding used to develop the services and the critical importance of access to Environmental Health and Safety (EHS) data globally. In this final deliverable, the evaluation of the TAs and Demonstration Cases with respect to their (potential) contributions to the UN Sustainable Development Goals (SDGs) is completed by examining the results from the third funding period. Additionally, the targeted activities with the strategic partners, most of whom were previously identified as significant routes via which to sustain and further develop the NanoCommons tools and services, are summarised. The NanoCommons focus areas for short- and long-term sustainability are presented, along with the justifications for these choices. All of this information is then condensed into the final NanoCommons sustainability plan.

    Development of a Robust Read-Across Model for the Prediction of Biological Potency of Novel Peroxisome Proliferator-Activated Receptor Delta Agonists

    A robust predictive model was developed using 136 novel peroxisome proliferator-activated receptor delta (PPARδ) agonists; PPARδ is a distinct subtype of the lipid-activated transcription factors of the nuclear receptor superfamily that regulate target genes by binding to characteristic sequences of DNA bases. The model employs various structural descriptors and docking calculations and provides predictions of the biological activity of PPARδ agonists, following the criteria of the Organization for Economic Co-operation and Development (OECD) for the development and validation of quantitative structure–activity relationship (QSAR) models. Specifically focused on small molecules, the model facilitates the identification of highly potent and selective PPARδ agonists and offers a read-across concept by providing the chemical neighbours of the compound under study. The model development process was conducted using Isalos Analytics Software (v. 0.1.17), which provides an intuitive environment for machine-learning applications. The final model was released as a user-friendly web tool and can be accessed through the Enalos Cloud platform’s graphical user interface (GUI).
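The read-across concept mentioned above, reporting the chemical neighbours of a compound under study, can be sketched as a similarity search over binary structural fingerprints. The fingerprints, compound names, and the use of Tanimoto similarity here are illustrative assumptions, not the published implementation:

```python
# Hypothetical binary structural fingerprints stored as bit vectors;
# real applications derive these from the molecular structures.
library = {
    "agonist_A": 0b101101,
    "agonist_B": 0b101001,
    "agonist_C": 0b010010,
}

def tanimoto(fp1, fp2):
    """Tanimoto similarity: shared on-bits over total on-bits."""
    inter = bin(fp1 & fp2).count("1")
    union = bin(fp1 | fp2).count("1")
    return inter / union if union else 0.0

def chemical_neighbours(query_fp, n=2):
    """Return the n most similar library compounds (the read-across set)."""
    ranked = sorted(library.items(),
                    key=lambda kv: tanimoto(query_fp, kv[1]),
                    reverse=True)
    return [(name, round(tanimoto(query_fp, fp), 2)) for name, fp in ranked[:n]]

print(chemical_neighbours(0b101100))  # → [('agonist_A', 0.75), ('agonist_B', 0.5)]
```

Presenting the nearest neighbours alongside a prediction lets the user judge a result by analogy to known actives, which is the essence of read-across.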

    A safe-by-design tool for functionalised nanomaterials through the Enalos Nanoinformatics Cloud platform

    Multi-walled carbon nanotubes are currently used in numerous industrial applications and products; fast and accurate evaluation of their biological and toxicological effects is therefore of utmost importance. Computational methods and techniques previously applied in cheminformatics for the prediction of adverse effects of chemicals can also be applied to nanomaterials (NMs), in an effort to reduce expensive and time-consuming experimental procedures. In this context, a validated and predictive nanoinformatics model has been developed for the accurate prediction of the biological and toxicological profile of decorated multi-walled carbon nanotubes. The nanoinformatics workflow was fully validated according to the OECD principles before it was released online via the Enalos Cloud platform. The web service is a ready-to-use, user-friendly application whose purpose is to facilitate decision making as part of a safe-by-design framework for novel carbon nanotubes.

    Advances in De Novo Drug Design: From Conventional to Machine Learning Methods

    De novo drug design is a computational approach that generates novel molecular structures from atomic building blocks with no a priori relationships. Conventional methods include structure-based and ligand-based design, which depend on the properties of the active site of a biological target or on its known active binders, respectively. Artificial intelligence, including machine learning, is an emerging field that has positively impacted the drug discovery process. Deep reinforcement learning is a subdivision of machine learning that combines artificial neural networks with reinforcement-learning architectures. This method has successfully been employed to develop novel de novo drug design approaches using a variety of artificial networks, including recurrent neural networks, convolutional neural networks, generative adversarial networks, and autoencoders. This review article summarizes advances in de novo drug design, from conventional growth algorithms to advanced machine-learning methodologies, and highlights hot topics for further development.

    Manually curated transcriptomics data collection for toxicogenomic assessment of engineered nanomaterials

    Toxicogenomics (TGx) approaches are increasingly applied to gain insight into the possible toxicity mechanisms of engineered nanomaterials (ENMs). Omics data can be valuable to elucidate the mechanism of action of chemicals and to develop predictive models in toxicology. While vast amounts of transcriptomics data from ENM exposures have already been accumulated, a unified, easily accessible and reusable collection of transcriptomics data for ENMs is currently lacking. In an attempt to improve the FAIRness of already existing transcriptomics data for ENMs, we curated a collection of homogenized transcriptomics data from human, mouse and rat ENM exposures in vitro and in vivo, including the physicochemical characteristics of the ENMs used in each study.

    In silico assessment of nanoparticle toxicity powered by the Enalos Cloud Platform: Integrating automated machine learning and synthetic data for enhanced nanosafety evaluation

    The rapid advance of nanotechnology has led to the development and widespread application of nanomaterials, raising concerns regarding their potential adverse effects on human health and the environment. Traditional (experimental) methods for assessing the safety of nanoparticles (NPs) are time-consuming, expensive, and resource-intensive, and raise ethical concerns due to their reliance on animal testing. To address these challenges, we propose an in silico workflow that serves as an alternative or complementary approach to conventional hazard and risk assessment strategies, incorporating state-of-the-art computational methodologies. In this study we present an automated machine learning (autoML) scheme that employs dose-response toxicity data for silver (Ag), titanium dioxide (TiO2), and copper oxide (CuO) NPs. This model is further enriched with atomistic descriptors to capture the NPs’ underlying structural properties. To overcome the issue of limited data availability, synthetic data generation techniques are used; these techniques broaden the dataset and thus improve the representation of the different NP classes. A key aspect of this approach is a novel three-step applicability domain method (which includes the development of a local similarity approach) that enhances user confidence in the results by evaluating the reliability of each prediction. We anticipate that this approach will significantly expedite the nanosafety assessment process, enabling regulation to keep pace with innovation, and will provide valuable insights for the design and development of safe and sustainable NPs. The ML model developed in this study is made available to the scientific community as an easy-to-use web service through the Enalos Cloud Platform (www.enaloscloud.novamechanics.com/sabydoma/safenanoscope/), facilitating broader access and collaborative advancements in nanosafety.
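The local-similarity idea behind the applicability domain step can be illustrated with a minimal sketch, assuming a simple neighbour-count rule: a prediction is flagged reliable only if enough training points lie within a fixed distance of the query. The descriptors, radius, and threshold below are hypothetical; the published three-step method is richer:

```python
import math

# Illustrative, normalised training descriptors (e.g., NP size, dose);
# values are assumptions for demonstration only.
training_set = [
    [0.10, 0.20], [0.15, 0.25], [0.12, 0.18],
    [0.80, 0.90], [0.85, 0.95],
]

def in_applicability_domain(query, radius=0.15, min_neighbours=2):
    """Local-similarity check: count training points within `radius`
    of the query; the prediction is considered reliable only if at
    least `min_neighbours` are found."""
    neighbours = sum(1 for x in training_set if math.dist(query, x) <= radius)
    return neighbours >= min_neighbours

print(in_applicability_domain([0.13, 0.21]))  # query in a dense region
print(in_applicability_domain([0.50, 0.50]))  # query far from all training data
```

A query in a densely sampled region passes the check, while one in a sparse region is flagged as outside the domain, signalling that its prediction should not be trusted.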