
    eLetter on: Global Health Inequities in Rheumatology


    Identification of novel modifiers of Aβ toxicity by transcriptomic analysis in the fruitfly.

    The strongest risk factor for developing Alzheimer's disease (AD) is age. Here, we study the relationship between ageing and AD using a systems biology approach that employs a Drosophila (fruitfly) model of AD in which the flies overexpress the human Aβ42 peptide. We identified 712 genes that are differentially expressed between control and Aβ-expressing flies. We further divided these genes according to how they change over the animal's lifetime and discovered that the AD-related gene expression signature is age-independent. We have identified a number of differentially expressed pathways that are likely to play an important role in the disease, including oxidative stress and innate immunity. In particular, we uncovered two new modifiers of the Aβ phenotype, namely Sod3 and PGRP-SC1b.
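
    The abstract does not specify the analysis pipeline, but a minimal sketch of a generic differential expression screen of this kind might look as follows, assuming a log-scale genes-by-samples expression matrix; the data, sample labels, and the choice of a per-gene Welch t-test with Benjamini-Hochberg correction are illustrative assumptions, not the study's actual method.

```python
# Hypothetical sketch: per-gene differential expression between control and
# Abeta-expressing flies, with Benjamini-Hochberg multiple-testing correction.
# The expression matrix and column names are illustrative placeholders.
import numpy as np
import pandas as pd
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

# expr: rows = genes, columns = samples (log-scale expression values)
expr = pd.DataFrame(
    np.random.default_rng(0).normal(size=(1000, 12)),
    columns=[f"control_{i}" for i in range(6)] + [f"abeta_{i}" for i in range(6)],
)

ctrl = expr.filter(like="control_")
abeta = expr.filter(like="abeta_")

# Welch's t-test per gene, then FDR correction across all genes
t_stat, p_val = ttest_ind(abeta, ctrl, axis=1, equal_var=False)
rejected, q_val, _, _ = multipletests(p_val, alpha=0.05, method="fdr_bh")

deg = expr.index[rejected]  # genes called differentially expressed
print(f"{len(deg)} genes pass the 5% FDR threshold")
```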

    Therapeutic limitations in tumor-specific CD8+ memory T cell engraftment

    BACKGROUND: Adoptive immunotherapy with cytotoxic T lymphocytes (CTL) represents an alternative approach to treating solid tumors. Ideally, this would confer long-term protection against the tumor. We previously demonstrated that in vitro-generated tumor-specific CTL from the ovalbumin (OVA)-specific OT-I T cell receptor transgenic mouse persisted long after adoptive transfer as memory T cells. When recipient mice were challenged with the OVA-expressing E.G7 thymoma, tumor growth was delayed and sometimes prevented. The reasons for therapeutic failures were not clear. METHODS: OT-I CTL were adoptively transferred to C57BL/6 mice 21–28 days prior to tumor challenge. At this time, the donor cells had the phenotypic and functional characteristics of memory CD8+ T cells. Recipients that developed tumors despite adoptive immunotherapy were analyzed to evaluate the reason(s) for therapeutic failure. RESULTS: Dose-response studies demonstrated that the degree of tumor protection was directly proportional to the number of OT-I CTL adoptively transferred. At a low dose of OT-I CTL, therapeutic failure was attributed to insufficient numbers of OT-I T cells persisting in vivo, rather than to mechanisms that actively suppressed or anergized the OT-I T cells. In recipients of high numbers of OT-I CTL, the E.G7 tumor that developed was shown to be resistant to fresh OT-I CTL when examined ex vivo. Furthermore, these same tumor cells no longer secreted a detectable level of OVA. In this case, resistance to immunotherapy was secondary to the selection of E.G7 clones that expressed a lower level of tumor antigen. CONCLUSIONS: Memory engraftment with tumor-specific CTL provides long-term protection against tumors. However, there are several limitations to this immunotherapeutic strategy, especially when targeting a single antigen. This study illustrates the importance of administering large numbers of effectors to engraft sufficiently efficacious immunologic memory. It also demonstrates the importance of targeting several antigens when developing vaccine strategies for cancer.

    The Impact of Global Warming and Anoxia on Marine Benthic Community Dynamics: an Example from the Toarcian (Early Jurassic)

    The Pliensbachian-Toarcian (Early Jurassic) fossil record is an archive of natural data of benthic community response to global warming and long-term marine hypoxia and anoxia. In the early Toarcian, mean temperatures increased by the same order of magnitude as that predicted for the near future; laminated, organic-rich black shales were deposited in many shallow-water epicontinental basins; and a biotic crisis occurred in the marine realm, with the extinction of approximately 5% of families and 26% of genera. High-resolution quantitative abundance data of benthic invertebrates were collected from the Cleveland Basin (North Yorkshire, UK) and analysed with multivariate statistical methods to detect how the fauna responded to environmental changes during the early Toarcian. Twelve biofacies were identified. Their changes through time closely resemble the pattern of faunal degradation and recovery observed in modern habitats affected by anoxia. All four successional stages of community structure recorded in modern studies are recognised in the fossil data (i.e. Stage III: climax; II: transitional; I: pioneer; 0: highly disturbed). Two main faunal turnover events occurred: (i) at the onset of anoxia, with the extinction of most benthic species and the survival of a few adapted to thrive in low-oxygen conditions (Stages I to 0), and (ii) during the recovery, when newly evolved species colonized the re-oxygenated soft sediments and the path of recovery did not retrace the pattern of ecological degradation (Stages I to II). The ordination of samples, coupled with sedimentological and palaeotemperature proxy data, indicates that the onset of anoxia and the extinction horizon coincide with a rise in both temperature and sea level. Our study of how faunal associations co-vary with long- and short-term sea level and temperature changes has implications for predicting the long-term effects of “dead zones” in modern oceans.
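
    As an illustration of the kind of multivariate ordination mentioned above, the sketch below runs non-metric multidimensional scaling (NMDS) on Bray-Curtis dissimilarities between abundance samples; the abundance matrix is a random placeholder, and the choice of NMDS on Bray-Curtis distances is an assumption, since the paper's exact ordination method is not given here.

```python
# Hypothetical sketch: ordination of benthic abundance samples via non-metric
# multidimensional scaling (NMDS) on Bray-Curtis dissimilarities.
# The abundance matrix is a random placeholder, not the study's data.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
abundance = rng.poisson(lam=3, size=(60, 25))  # 60 samples x 25 taxa (counts)

# Pairwise Bray-Curtis dissimilarity between samples
dissim = squareform(pdist(abundance, metric="braycurtis"))

# Non-metric MDS on the precomputed dissimilarity matrix
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           n_init=10, random_state=1)
coords = nmds.fit_transform(dissim)  # 2-D ordination coordinates per sample
print("stress:", round(nmds.stress_, 3))
```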

    One-carbon metabolism in cancer

    Cells require one-carbon units for nucleotide synthesis, methylation and reductive metabolism, and these pathways support the high proliferative rate of cancer cells. As such, anti-folates, drugs that target one-carbon metabolism, have long been used in the treatment of cancer. Amino acids such as serine are a major one-carbon source, and cancer cells are particularly susceptible to deprivation of one-carbon units by serine restriction or inhibition of de novo serine synthesis. Recent work has also begun to decipher the specific pathways and sub-cellular compartments that are important for one-carbon metabolism in cancer cells. In this review, we summarise the historical understanding of one-carbon metabolism in cancer, describe recent findings regarding the generation and usage of one-carbon units, and explore possible future therapeutics that could exploit the dependency of cancer cells on one-carbon metabolism.

    High (but Not Low) Urinary Iodine Excretion Is Predicted by Iodine Excretion Levels from Five Years Ago

    Background: It has not been investigated whether there are associations between urinary iodine (UI) excretion measurements taken some years apart, nor whether such an association remains after adjustment for nutritional habits. The aim of the present study was to investigate the relation between iodine-creatinine ratio (ICR) measurements at two time points 5 years apart. Methods: Data from 2,659 individuals from the Study of Health in Pomerania were analyzed. Analysis of covariance and Poisson regressions were used to associate baseline with follow-up ICR. Results: Baseline ICR was associated with follow-up ICR. In particular, baseline ICR >300 µg/g was related to an ICR >300 µg/g at follow-up (relative risk, RR: 2.20; p < 0.001). The association was stronger in males (RR: 2.64; p < 0.001) than in females (RR: 1.64; p = 0.007). In contrast, baseline ICR <100 µg/g was only associated with an ICR <100 µg/g at follow-up in males when considering unadjusted ICR. Conclusions: We detected only a weak correlation with respect to low ICR. Studies assessing iodine status in a population should take into account that an individual with a low UI excretion in one measurement is not necessarily permanently iodine deficient. On the other hand, current high ICR could have been predicted by high ICR 5 years ago. Copyright © 2011 S. Karger AG, Basel.
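
    The relative risks quoted above can be estimated with a Poisson regression for a binary outcome; a minimal sketch of such a "modified Poisson" model with robust standard errors is shown below, using simulated data and placeholder variable names rather than the SHIP cohort data.

```python
# Hypothetical sketch: relative risk of high follow-up ICR (>300 ug/g) given
# high baseline ICR, via modified Poisson regression (Poisson GLM + robust
# standard errors). Data and variable names are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2659
df = pd.DataFrame({
    "high_baseline": rng.integers(0, 2, n),  # baseline ICR > 300 ug/g (0/1)
    "male": rng.integers(0, 2, n),
    "age": rng.normal(55, 12, n),
})
# Simulated binary outcome: high follow-up ICR, more likely if baseline was high
p = 0.10 + 0.12 * df["high_baseline"]
df["high_followup"] = rng.binomial(1, p)

model = smf.glm(
    "high_followup ~ high_baseline + male + age",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust (sandwich) standard errors

rr = np.exp(model.params["high_baseline"])          # relative risk estimate
ci = np.exp(model.conf_int().loc["high_baseline"])  # 95% confidence interval
print(f"RR = {rr:.2f}, 95% CI [{ci.iloc[0]:.2f}, {ci.iloc[1]:.2f}]")
```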

    Mining geriatric assessment data for in-patient fall prediction models and high-risk subgroups

    Background: Hospital in-patient falls constitute a prominent problem in terms of costs and consequences. Geriatric institutions are most often affected, and common screening tools cannot predict in-patient falls consistently. Our objectives are to derive comprehensible fall risk classification models from a large data set of geriatric in-patients' assessment data and to evaluate their predictive performance (aim #1), and to identify high-risk subgroups from the data (aim #2). Methods: A data set of n = 5,176 single in-patient episodes covering 1.5 years of admissions to a geriatric hospital was extracted from the hospital's database and matched with fall incident reports (n = 493). A classification tree model was induced using the C4.5 algorithm, as well as a logistic regression model, and their predictive performance was evaluated. Furthermore, high-risk subgroups were identified from extracted classification rules with a support of more than 100 instances. Results: The classification tree model showed an overall classification accuracy of 66%, with a sensitivity of 55.4%, a specificity of 67.1%, and positive and negative predictive values of 15% and 93.5%, respectively. Five high-risk groups were identified, defined by high age, low Barthel index, cognitive impairment, multi-medication and co-morbidity. Conclusions: Our results show that a little more than half of the fallers may be identified correctly by our model, but the positive predictive value is too low to be applicable. Non-fallers, on the other hand, may be sorted out with the model quite well. The high-risk subgroups and the risk factors identified (age, low ADL score, cognitive impairment, institutionalization, polypharmacy and co-morbidity) reflect domain knowledge and may be used to screen certain subgroups of patients with a high risk of falling. Classification models derived from a large data set using data mining methods can compete with current dedicated fall risk screening tools, yet lack diagnostic precision. High-risk subgroups may be identified automatically from existing geriatric assessment data, especially when combined with domain knowledge in a hybrid classification model. Further work is necessary to validate our approach in a controlled prospective setting.
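
    For readers unfamiliar with this kind of workflow, the sketch below fits a classification tree to simulated assessment-style features and reports sensitivity, specificity, PPV and NPV from the confusion matrix; scikit-learn's tree implements CART rather than C4.5, and all features, data and parameters are placeholders rather than the study's.

```python
# Hypothetical sketch: fall-risk classification from assessment-style features,
# evaluated via sensitivity, specificity, PPV and NPV. Uses CART (not C4.5);
# features and labels are simulated placeholders, not the study's data set.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(3)
n = 5176
X = pd.DataFrame({
    "age": rng.normal(82, 7, n),
    "barthel_index": rng.integers(0, 101, n),
    "cognitive_impairment": rng.integers(0, 2, n),
    "num_medications": rng.poisson(6, n),
    "num_comorbidities": rng.poisson(3, n),
})
y = rng.binomial(1, 0.095, n)  # 1 = fall during stay (simulated)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=3, stratify=y
)
tree = DecisionTreeClassifier(
    max_depth=4, class_weight="balanced", random_state=3
).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("PPV:", tp / (tp + fp))
print("NPV:", tn / (tn + fn))
```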

    Evolutionary Toggling of Vpx/Vpr Specificity Results in Divergent Recognition of the Restriction Factor SAMHD1

    SAMHD1 is a host restriction factor that blocks the ability of lentiviruses such as HIV-1 to undergo reverse transcription in myeloid cells and resting T cells. This restriction is alleviated by expression of the lentiviral accessory proteins Vpx and Vpr (Vpx/Vpr), which target SAMHD1 for proteasome-mediated degradation. However, the precise determinants within SAMHD1 for recognition by Vpx/Vpr remain unclear. Here we show that the interface between SAMHD1 and Vpx/Vpr has shifted during primate lentiviral evolution. Using multiple HIV-2 and SIV Vpx proteins, we show that Vpx from the HIV-2 and SIVmac lineage, but not Vpx from the SIVmnd2 and SIVrcm lineage, requires the C-terminus of SAMHD1 for interaction, ubiquitylation, and degradation. On the other hand, the N-terminus of SAMHD1 governs interactions with Vpx from SIVmnd2 and SIVrcm, but has little effect on Vpx from HIV-2 and SIVmac. Furthermore, we show that this difference in SAMHD1 recognition is evolutionarily dynamic, with the importance of the N- and C-terminus for interaction of SAMHD1 with Vpx and Vpr toggling during lentiviral evolution. We present a model to explain how the head-to-tail conformation of SAMHD1 proteins favors toggling of the interaction sites targeted by Vpx/Vpr during this virus-host arms race. Such drastic functional divergence within a lentiviral protein highlights a novel plasticity in the evolutionary dynamics of viral antagonists for restriction factors during the adaptation of lentiviruses to their hosts. © 2013 Fregoso et al.