Next-generation text-mining mediated generation of chemical response-specific gene sets for interpretation of gene expression data
Background: Availability of chemical response-specific lists of genes (gene sets) for predicting the pharmacological and/or toxic effects of compounds is limited. We hypothesize that more gene sets can be created by next-generation text mining (next-gen TM), and that these can be used with gene set analysis (GSA) methods for chemical treatment identification, pharmacological mechanism elucidation, and comparison of compound toxicity profiles. Methods: We created 30,211 chemical response-specific gene sets for human and mouse by next-gen TM, and derived 1,189 (human) and 588 (mouse) gene sets from the Comparative Toxicogenomics Database (CTD). We tested for significant differential expression (SDE) (false discovery rate-corrected p-values < 0.05) of the next-gen TM-derived and CTD-derived gene sets in gene expression (GE) data sets of five chemicals (from experimental models). We tested for SDE of gene sets for six fibrates in a peroxisome proliferator-activated receptor alpha (PPARA) knock-out GE data set and compared the results with those from the Connectivity Map (cMap). We tested for SDE of 319 next-gen TM-derived gene sets for environmental toxicants in three GE data sets of triazoles, and for SDE of 442 gene sets associated with embryonic structures. We compared the gene sets with triazole effects seen in the Whole Embryo Culture (WEC), and used principal component analysis (PCA) to discriminate triazoles from other chemicals. Results: Next-gen TM-derived gene sets matching the chemical treatment were significantly altered in three GE data sets, and the corresponding CTD-derived gene sets in five. Six next-gen TM-derived and four CTD-derived fibrate gene sets were significantly altered in the PPARA knock-out GE data set. None of the fibrate signatures in cMap scored significantly against the PPARA GE signature. Thirty-three environmental toxicant gene sets were significantly altered in the triazole GE data sets; 21 of these toxicants showed a toxicity pattern similar to that of the triazoles. We confirmed embryotoxic effects, and discriminated triazoles from other chemicals. Conclusions: Gene set analysis with next-gen TM-derived chemical response-specific gene sets is a scalable method for identifying similarities between gene responses to different chemicals, from which one may infer potential mode of action and/or toxic effect.
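As a minimal sketch of the gene set testing step described in this abstract, the snippet below scores each gene set against a differential-expression profile and applies Benjamini-Hochberg FDR correction at 0.05. The competitive Mann-Whitney test and the placeholder data structures are illustrative assumptions, not the specific GSA method used in the study.

```python
# Sketch: test gene sets for significant differential expression (SDE)
# with Benjamini-Hochberg FDR correction, assuming a competitive test
# of member genes against all measured genes. Inputs are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

def test_gene_sets(de_scores: dict, gene_sets: dict, alpha: float = 0.05):
    """de_scores: gene -> differential-expression statistic (e.g. log2 fold change).
    gene_sets: gene set name -> list of member genes."""
    names, pvals = [], []
    background = np.abs(np.array(list(de_scores.values())))
    for name, members in gene_sets.items():
        in_set = np.abs(np.array([de_scores[g] for g in members if g in de_scores]))
        if len(in_set) < 3:  # skip sets with too few measured genes
            continue
        # Competitive test: are member genes more perturbed than background?
        _, p = mannwhitneyu(in_set, background, alternative="greater")
        names.append(name)
        pvals.append(p)
    reject, p_adj, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return [(n, p, bool(sig)) for n, p, sig in zip(names, p_adj, reject)]
```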
Delay and Impairment in Brain Development and Function in Rat Offspring After Maternal Exposure to Methylmercury
Maternal exposure to the neurotoxin methylmercury (MeHg) has been shown to have adverse effects on neural development of the offspring in humans. Little is known about the underlying mechanisms by which MeHg affects the developing brain. To explore the neurodevelopmental defects and the underlying mechanisms associated with MeHg exposure, the cerebellum and cerebrum of Wistar rat pups were analyzed by [F-18]FDG PET functional imaging, field potential analysis, and microarray gene expression profiling. Female rat pups were exposed to MeHg via the maternal diet during the intrauterine and lactational periods (from gestational day 6 to postnatal day (PND) 10), and their brain tissues were sampled for analysis at weaning (PND 18-21) and adulthood (PND 61-70). The [F-18]FDG PET imaging and field potential analyses suggested a MeHg-induced delay in brain activity and impairment of neural function. Genome-wide transcriptome analysis substantiated these findings by showing (1) a delay in the onset of gene expression related to neural development, and (2) alterations in pathways related to both structural and functional aspects of nervous system development. The latter included changes in the expression of developmental regulators, developmental phase-associated genes, small GTPase signaling molecules, and representatives of all processes required for synaptic transmission. These findings were observed at dose levels at which only marginal changes in conventional developmental toxicity endpoints were detected. Therefore, the approaches applied in this study are promising in terms of yielding increased sensitivity compared with classical developmental toxicity tests.
Application of AOPs to assist regulatory assessment of chemical risks - Case studies, needs and recommendations
While human regulatory risk assessment (RA) still largely relies on animal studies, new approach methodologies (NAMs) based on in vitro, in silico or non-mammalian alternative models are increasingly used to evaluate chemical hazards. Moreover, human epidemiological studies with biomarkers of effect (BoE) also play an invaluable role in identifying health effects associated with chemical exposures. To move towards next generation risk assessment (NGRA), it is therefore crucial to establish bridges between NAMs and standard approaches, and to establish processes for increasing mechanistically based biological plausibility in human studies. The Adverse Outcome Pathway (AOP) framework constitutes an important tool to address these needs but, despite a significant increase in knowledge and awareness, the use of AOPs in chemical RA remains limited. The objective of this paper is to address issues related to using AOPs in a regulatory context from various perspectives, as discussed in a workshop organized in spring 2022 within the European Union partnerships HBM4EU and PARC. The paper presents examples where the AOP framework has proven useful for the human RA process, particularly in hazard prioritization and characterization, in integrated approaches to testing and assessment (IATA), and in the identification and validation of BoE in epidemiological studies. Nevertheless, several limitations were identified that hinder the optimal usability and acceptance of AOPs by the regulatory community, including the lack of quantitative information on response-response relationships and of efficient ways to map chemical data (exposure and toxicity) onto AOPs. The paper summarizes suggestions, ongoing initiatives and third-party tools that may help to overcome these obstacles and thus ensure better implementation of AOPs in NGRA.
ELIXIR and Toxicology: a community in development [version 2; peer review: 2 approved]
Toxicology has been an active research field for many decades, with academic, industrial and government involvement. Modern omics and computational approaches are changing the field, from merely disease-specific observational models into target-specific predictive models. Traditionally, toxicology has strong links with other fields such as biology, chemistry, pharmacology, and medicine. With the rise of synthetic and newly engineered materials, alongside ongoing prioritisation needs in chemical risk assessment for existing chemicals, early predictive evaluations are becoming of utmost importance for both scientific and regulatory purposes. ELIXIR is an intergovernmental organisation that brings together life science resources from across Europe. To coordinate the linkage of various life science efforts around modern predictive toxicology, the establishment of a new ELIXIR Community is seen as instrumental. In the past few years, joint efforts, building on incidental overlap, have been piloted in the context of ELIXIR. For example, EU-ToxRisk, diXa, HeCaToS, transQST, and the nanotoxicology community have worked with the ELIXIR TeSS, Bioschemas, and Compute Platforms and activities. In 2018, a core group of interested parties wrote a proposal outlining a sketch of what this new ELIXIR Toxicology Community would look like. A recent workshop (held September 30 to October 1, 2020) extended this into an ELIXIR Toxicology roadmap and a shortlist of limited-investment, high-gain collaborations to give body to this new community. This Whitepaper outlines the results of these efforts and defines our vision of the ELIXIR Toxicology Community and how it complements other ELIXIR activities.
Estimation of the exposure response relation between benzene and acute myeloid leukemia by combining epidemiological, human biomarker, and animal data
Background: Chemical risk assessment can benefit from integrating data across multiple evidence bases, especially in exposure-response curve (ERC) modelling when data across the exposure range are sparse. Methods: We estimated the ERC for benzene and acute myeloid leukemia (AML) by fitting linear and spline-based Bayesian meta-regression models that included summary risk estimates from non-AML and non-human studies as prior information. Our complete data set included six human AML studies, three human leukemia studies, ten human biomarker studies, and four experimental animal studies. Results: A linear meta-regression model with intercept best predicted AML risks after cross-validation, both for the full data set and for the AML studies only. Risk estimates in the low exposure range (<40 ppm-years) from this model were comparable, but more precise, when the ERC was derived using all available data than when using AML data only. Allowing for between-study heterogeneity, relative risks (RRs) and 95% prediction intervals [95% PI] at 5 ppm-years were 1.58 [1.01, 3.22] and 1.44 [0.85, 3.42] for the full and AML-only data sets, respectively. Conclusions: Integrating the available epidemiological, biomarker, and animal data resulted in more precise risk estimates for benzene exposure and AML, although the large between-study heterogeneity hampers interpretation of these results. The harmonization steps required to fit the Bayesian meta-regression model involve a range of assumptions that need to be critically evaluated, as they seem crucial for successful implementation. Impact: By describing a framework for data integration and explicitly describing the necessary data harmonization steps, we hope to enable risk assessors to better understand the advantages and assumptions underlying a data integration approach.
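To make the modelling approach concrete, here is a minimal sketch of a linear random-effects Bayesian meta-regression on the log relative-risk scale, of the kind described above, written with PyMC. The study data, priors, and single-predictor structure are invented for illustration and do not reproduce the paper's model or data set.

```python
# Sketch: Bayesian linear meta-regression of log RR on cumulative
# exposure, with a between-study heterogeneity term (tau).
# All numbers below are hypothetical placeholders.
import numpy as np
import pymc as pm

exposure = np.array([2.0, 5.0, 10.0, 40.0])        # ppm-years (hypothetical)
log_rr   = np.log(np.array([1.1, 1.5, 1.9, 3.0]))  # observed log RRs (hypothetical)
se       = np.array([0.15, 0.20, 0.25, 0.30])      # their standard errors

with pm.Model() as model:
    alpha = pm.Normal("alpha", 0.0, 1.0)           # intercept
    beta  = pm.Normal("beta", 0.0, 1.0)            # slope per ppm-year
    tau   = pm.HalfNormal("tau", 0.5)              # between-study heterogeneity
    # Study-specific true effects around the regression line
    theta = pm.Normal("theta", alpha + beta * exposure, tau, shape=len(log_rr))
    pm.Normal("obs", theta, se, observed=log_rr)   # sampling error per study
    idata = pm.sample(2000, tune=1000, target_accept=0.9)

# Prediction at 5 ppm-years: drawing a "new study" effect that includes
# tau is what widens a credible interval into a prediction interval,
# analogous to the 95% PI quoted in the abstract.
post = idata.posterior
a = post["alpha"].values.ravel()
b = post["beta"].values.ravel()
t = post["tau"].values.ravel()
pred_rr = np.exp(np.random.normal(a + b * 5.0, t))
print(np.round(np.percentile(pred_rr, [2.5, 50, 97.5]), 2))
```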
Exposome-based public health interventions for infectious diseases in urban settings
The COVID-19 pandemic placed public health measures against infectious diseases at the core of global health challenges, especially in cities, where more than half of the global population lives. SARS-CoV-2 is an exposure agent recently added to the network of exposures that comprise the human exposome, i.e. the totality of all environmental exposures throughout one's lifetime. At the same time, the application of measures to tackle SARS-CoV-2 transmission leads to changes in the exposome components and in characteristics of urban environments that define the urban exposome, a concept complementary to the human exposome that focuses on monitoring urban health. This work highlights the use of a comprehensive, systems-based exposome approach to better capture the population-wide and individual-level variability in SARS-CoV-2 spread and its associated urban and individual exposures, towards improved guidance and response. Population characteristics, the built environment and spatiotemporal features of city infrastructure, as well as individual characteristics/parameters, socioeconomic status, occupation and biological susceptibility, need to be considered simultaneously when deploying non-pharmacological public health measures. Integrating individual and population characteristics, as well as urban-specific parameters, is a prerequisite for urban exposome studies. Applications of the exposome approach in cities/towns could facilitate assessment of health disparities and better identification of vulnerable populations, as framed by multiple environmental, urban design and planning co-exposures. Exposome-based applications in epidemic control and response include the implementation of exposomic tools that have matured in non-communicable disease research, ranging from biomonitoring and surveillance to sensors and modeling. Therefore, the exposome can be a novel tool in risk assessment and management during epidemics and other major public health events. This is a unique opportunity for the research community to exploit the exposome concept and its tools in upgrading and further developing site-specific public health measures in cities.
A Scoping Review of Technologies and Their Applicability for Exposome-Based Risk Assessment in the Oil and Gas Industry
Introduction: Oil and gas workers have been shown to be at increased risk of chronic diseases including cancer, asthma, chronic obstructive pulmonary disease, and hearing loss, among others. Technological advances may be used to assess the external exposome (e.g. personal sensors, smartphone apps and online platforms, exposure models) and the internal exposome (e.g. physiologically based kinetic (PBK) modeling, biomonitoring, omics), offering numerous possibilities for chronic disease prevention strategies and risk management measures. The objective of this study was to review the literature on these technologies, focusing on: (i) evaluating their applicability for exposome research in the oil and gas industry, and (ii) identifying key challenges that may hamper the successful application of such technologies in the oil and gas industry. Methods: A scoping review was conducted by identifying peer-reviewed literature through searches in MEDLINE/PubMed and SciVerse Scopus. Two assessors trained on the search strategy screened retrieved articles on title and abstract. The inclusion criteria for this review were: application of the aforementioned technologies at a workplace in the oil and gas industry, or application of these technologies to an exposure relevant to the oil and gas industry but in another occupational sector; English language; and publication between 2005 and the end of 2019. Results: In total, 72 articles were included in this scoping review, with most articles focused on omics and bioinformatics (N = 22), followed by biomonitoring and biomarkers (N = 20), external exposure modeling (N = 11), PBK modeling (N = 10), and personal sensors (N = 9). Several studies in the oil and gas industry were identified on the application of PBK models and biomarkers, mainly focusing on workers exposed to benzene. The application of personal sensors, new types of exposure models, and omics technology is still in its infancy with respect to the oil and gas industry. Nevertheless, applications of these technologies in other occupational sectors showed the potential for application in this sector. Discussion and conclusion: New exposome technologies offer great promise for personal monitoring of workers in the oil and gas industry, but more applied research is needed in collaboration with the industry. Current challenges hindering successful application of such technologies include (i) the technological readiness of sensors, (ii) the availability of data, (iii) the absence of standardized and validated methods, and (iv) the need for new study designs to study the development of disease during working life.
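As an illustration of what the PBK modeling mentioned above involves, the sketch below solves a deliberately minimal one-compartment kinetic model for the blood concentration of an inhaled agent during and after a work shift. Real PBK models comprise many physiological compartments; all parameter values here are invented placeholders, not benzene-specific constants.

```python
# Sketch: one-compartment kinetics for an inhaled agent over a work day.
# Hypothetical lumped parameters, for illustration only.
import numpy as np
from scipy.integrate import solve_ivp

V = 40.0       # volume of distribution (L)
UPTAKE = 0.6   # inhalation uptake per unit air concentration (mg/h per ppm)
K_EL = 0.3     # first-order elimination rate constant (1/h)

def air_ppm(t):
    """Air concentration: 1 ppm during an 8-hour shift, 0 afterwards."""
    return 1.0 if t < 8.0 else 0.0

def dcdt(t, c):
    # Mass balance on the single blood compartment (mg/L per h)
    return [(UPTAKE * air_ppm(t) - K_EL * V * c[0]) / V]

sol = solve_ivp(dcdt, (0.0, 24.0), [0.0], max_step=0.1)
peak = sol.y[0].max()
print(f"peak blood concentration ~{peak:.3f} mg/L "
      f"at hour {sol.t[sol.y[0].argmax()]:.1f}")
```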