
    Depression after low-energy fracture in older women predicts future falls: a prospective observational study

    Abstract. Background: Falls are one of the main causes of fractures in elderly people, and after a recent fracture the risk of another fall, and hence of a subsequent fracture, is increased. Risk factors for future falls should therefore be identified. We prospectively investigated the relationship between depression and the incidence of falls in post-menopausal women after a low-energy fracture. Methods: At baseline, 181 women aged 60 years and older who presented with a recent low-energy fracture were evaluated at the fracture and osteoporosis outpatient clinics of two hospitals. In addition to clinical evaluation and bone mineral density tests, the presence of depression (measured using the Edinburgh Depression Scale, EDS, depression cut-off > 11) and risk factors for falling were assessed. During two years of follow-up, the incidence of falls was recorded annually by means of detailed questionnaires and interviews. Results: Seventy-nine (44%) of the women sustained at least one fall during follow-up. Of these, 28% (n = 22) suffered from depression at baseline, compared to 10% (n = 10) of the 102 women who did not sustain a fall during follow-up (χ² = 8.76, df = 1, p = .003). Multiple logistic regression showed that the presence of depression and of co-morbidity at baseline were independently related to falls during follow-up (OR = 4.13, 95% CI = 1.58-10.80 and OR = 2.25, 95% CI = 1.11-4.56, respectively). Conclusions: The presence of depression in women aged 60 years and older with recent low-energy fractures is an important risk factor for future falls. We propose that clinicians treating patients with recent low-energy fractures should address not only skeletal-related risk factors for fractures, but also fall-related risk factors, including depression.
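    As a quick check on the statistics quoted above, the reported chi-square can be reproduced from the cell counts given in the abstract (22 of 79 fallers vs. 10 of 102 non-fallers with baseline depression). A minimal sketch in Python, assuming only those counts and using SciPy's standard contingency-table test:

```python
# Minimal sketch: reproduce the chi-square test reported in the abstract
# from its cell counts; assumes SciPy is available.
from scipy.stats import chi2_contingency

#        depressed, not depressed
table = [[22, 79 - 22],    # women who fell during follow-up
         [10, 102 - 10]]   # women who did not fall

# Yates continuity correction is applied by default for a 2x2 table
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# prints: chi2 = 8.76, df = 1, p = 0.003, matching the values reported above
```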

    Maize RNA PolIV affects the expression of genes with nearby TE insertions and has a genome-wide repressive impact on transcription

    Abstract. Background: RNA-directed DNA methylation (RdDM) is a plant-specific epigenetic process that relies on RNA polymerase IV (Pol IV) for the production of 24-nucleotide small interfering RNAs (siRNAs) that guide cytosine methylation and the silencing of genes and transposons. The Zea mays RPD1/RMR6 gene encodes the largest subunit of Pol IV and is required for normal plant development, paramutation, transcriptional repression of certain transposable elements (TEs) and transcriptional regulation of specific alleles. Results: In this study we applied a total RNA-Seq approach to compare the B73 and rpd1/rmr6 leaf transcriptomes. Although previous studies indicated that loss of siRNA production in RdDM mutants provokes a strong loss of CHH DNA methylation but not massive transcriptional activation of genes or TEs in either Arabidopsis or maize, our total RNA-Seq analysis of the rpd1/rmr6 transcriptome reveals that loss of Pol IV activity causes a global increase in the transcribed fraction of the maize genome. Our results point to genes with nearby TE insertions as being the most strongly affected by Pol IV-mediated gene silencing. TE modulation of nearby gene expression is linked to alternative methylation profiles in gene flanking regions, and these profiles depend strictly on specific characteristics of the inserted TE member. Although Pol IV is essential for the biogenesis of siRNAs, genes with associated siRNA loci are less affected by the Pol IV mutation. Conclusions: This deep and integrated analysis of gene expression, TE distribution, small RNA targeting and DNA methylation levels reveals that loss of Pol IV activity globally affects genome regulation, pointing to TEs as modulators of nearby gene expression and indicating the existence of epigenetic silencing mechanisms operating at multiple levels. Our results also suggest a predominant role of the Pol IV-mediated RdDM pathway in genome dominance regulation, and in subgenome stability and evolution in maize.
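    To illustrate the kind of comparison described above (genes with nearby TE insertions versus genes without), here is a toy sketch, not the authors' actual pipeline: it flags genes with a TE within an assumed 1 kb window of their flanks and compares expression changes between the two groups. All coordinates, column names and expression values below are invented for illustration.

```python
# Toy illustration only: partition genes by presence of a nearby TE and compare
# their mutant-vs-B73 expression changes. Real analyses would use genome-wide
# gene/TE annotations and measured RNA-Seq fold changes.
import pandas as pd

WINDOW = 1_000  # bp of flanking sequence treated as "nearby" (assumed, not from the paper)

genes = pd.DataFrame({
    "gene_id": ["g1", "g2", "g3"],
    "chrom":   ["1", "1", "2"],
    "start":   [10_000, 50_000, 20_000],
    "end":     [12_000, 53_000, 23_000],
    "log2fc_rpd1_vs_B73": [2.1, 0.1, 1.8],  # invented expression changes
})
tes = pd.DataFrame({
    "chrom": ["1", "2"],
    "start": [12_500, 19_500],
    "end":   [13_500, 19_900],
})

def has_nearby_te(gene):
    """True if any TE overlaps the gene body extended by WINDOW on both sides."""
    same = tes[tes.chrom == gene.chrom]
    return bool(((same.start < gene.end + WINDOW) & (same.end > gene.start - WINDOW)).any())

genes["nearby_te"] = genes.apply(has_nearby_te, axis=1)
print(genes.groupby("nearby_te")["log2fc_rpd1_vs_B73"].mean())
```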

    Uptake of the Necrotic Serpin in Drosophila melanogaster via the Lipophorin Receptor-1

    The humoral response to fungal and Gram-positive infections is regulated by the serpin-family inhibitor, Necrotic. Following immune challenge, a proteolytic cascade is activated which signals through the Toll receptor. Toll activation results in a range of antibiotic peptides being synthesised in the fat body and exported to the haemolymph. As with mammalian serpins, Necrotic turnover in Drosophila is rapid. This serpin is synthesised in the fat body, but its site of degradation has been unclear. By “freezing” endocytosis with a temperature-sensitive Dynamin mutation, we demonstrate that Necrotic is removed from the haemolymph in two groups of giant cells: the garland and pericardial athrocytes. Necrotic uptake responds rapidly to infection, being visibly increased after 30 minutes and peaking at 6–8 hours. Co-localisation of anti-Nec with anti-AP50, Rab5, and Rab7 antibodies establishes that the serpin is processed through multi-vesicular bodies and delivered to the lysosome, where it co-localises with the ubiquitin-binding protein HRS. Nec does not co-localise with Rab11, indicating that the serpin is not re-exported from athrocytes. Instead, mutations which block late endosome/lysosome fusion (dor, hk, and car) cause accumulation of Necrotic-positive endosomes, even in the absence of infection. Knockdown of the six Drosophila orthologues of the mammalian LDL receptor family with dsRNA identifies LpR1 as an enhancer of the immune response. Uptake of Necrotic from the haemolymph is blocked by a chromosomal deletion of LpR1. In conclusion, we identify the cells and the receptor molecule responsible for the uptake and degradation of the Necrotic serpin in Drosophila melanogaster. The scavenging of serpin/proteinase complexes may be a critical step in the regulation of proteolytic cascades.

    Global mRNA Degradation during Lytic Gammaherpesvirus Infection Contributes to Establishment of Viral Latency

    During lytic gammaherpesvirus infection, host gene expression is severely restricted by global degradation and altered 3′ end processing of mRNA. This host shutoff phenotype is orchestrated by the viral SOX protein, yet its functional significance to the viral lifecycle has not been elucidated, in part because of the multifunctional nature of SOX. Using an unbiased mutagenesis screen of the murine gammaherpesvirus 68 (MHV68) SOX homolog, we isolated a single amino acid point mutant that is selectively defective in host shutoff activity. Incorporation of this mutation into MHV68 yielded a virus with a significantly reduced capacity for mRNA turnover. Unexpectedly, the MHV68 mutant showed little defect during the acute replication phase in the mouse lung. Instead, the virus was attenuated at later stages of in vivo infection, suggestive of defects in both trafficking and latency establishment. Specifically, mice infected intranasally with the host shutoff mutant showed lower viral levels in the lymph nodes at 10 days post infection, failed to develop splenomegaly, and exhibited reduced viral DNA levels and a lower frequency of latently infected splenocytes. Decreased latency establishment was also observed upon infection via the intraperitoneal route. These results highlight for the first time the importance of global mRNA degradation during gammaherpesvirus infection and link an exclusively lytic phenomenon with downstream latency establishment.

    Climate simulations for 1880-2003 with GISS modelE

    We carry out climate simulations for 1880-2003 with GISS modelE driven by ten measured or estimated climate forcings. An ensemble of climate model runs is carried out for each forcing acting individually and for all forcing mechanisms acting together. We compare side-by-side the simulated climate change for each forcing, all forcings, observations, unforced variability among model ensemble members, and, if available, observed variability. Discrepancies between observations and simulations with all forcings are due to model deficiencies, inaccurate or incomplete forcings, and imperfect observations. Although there are notable discrepancies between model and observations, the fidelity is sufficient to encourage use of the model for simulations of future climate change. By using a fixed, well-documented model and accurately defining the 1880-2003 forcings, we aim to provide a benchmark against which the effect of improvements in the model, climate forcings, and observations can be tested. Principal model deficiencies include unrealistically weak tropical El Niño-like variability and a poor distribution of sea ice, with too much sea ice in the Northern Hemisphere and too little in the Southern Hemisphere. The greatest uncertainties in the forcings are the temporal and spatial variations of anthropogenic aerosols and their indirect effects on clouds.
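    The experimental design described above (an ensemble per individual forcing plus an all-forcings comparison against observations) can be illustrated schematically. The sketch below uses invented placeholder arrays rather than actual GISS modelE output or observational data, and assumes a simple ensemble-mean comparison.

```python
# Schematic sketch only: ensemble means per forcing, a crude all-forcings
# combination, and a comparison against a placeholder observational record.
import numpy as np

years = np.arange(1880, 2004)
n_members = 5  # ensemble size per forcing (assumed for illustration)

rng = np.random.default_rng(0)
runs_per_forcing = {
    "well-mixed GHGs": 0.006 * (years - 1880) + rng.normal(0, 0.1, (n_members, years.size)),
    "aerosols":       -0.002 * (years - 1880) + rng.normal(0, 0.1, (n_members, years.size)),
}

ensemble_means = {f: runs.mean(axis=0) for f, runs in runs_per_forcing.items()}
all_forcings = sum(ensemble_means.values())   # crude additivity assumption, illustration only
observations = 0.005 * (years - 1880)         # placeholder for an observed anomaly record

rmse = np.sqrt(np.mean((all_forcings - observations) ** 2))
print(f"RMSE of all-forcings ensemble mean vs. observations: {rmse:.3f} K")
```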

    Agricultural Management and Climatic Change Are the Major Drivers of Biodiversity Change in the UK

    Action to reduce anthropogenic impact on the environment and the species within it will be most effective when targeted towards activities that have the greatest impact on biodiversity. To do this effectively, we need to better understand the relative importance of different activities and how they drive changes in species’ populations. Here, we present a novel, flexible framework that reviews evidence for the relative importance of these drivers of change and uses it to explain recent alterations in species’ populations. We review drivers of change across four hundred species sampled from a broad range of taxonomic groups in the UK. We found that species’ population change (~1970–2012) has been most strongly impacted by intensive management of agricultural land and by climatic change. The impact of the former was primarily deleterious, whereas the impact of climatic change to date has been more mixed. Findings were similar across the three major taxonomic groups assessed (insects, vascular plants and vertebrates). In general, the way a habitat was managed had a greater impact than changes in its extent, which accords with the relatively small changes in the areas occupied by different habitats during our study period, compared to substantial changes in habitat management. Of the drivers classified as conservation measures, low-intensity management of agricultural land and habitat creation had the greatest impact. Our framework could be used to assess the relative importance of drivers at a range of scales to better inform policy and management decisions. Furthermore, by scoring the quality of evidence, this framework helps us identify research gaps and needs.
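    The abstract does not detail how driver scores are combined, so the following is only a toy illustration of one way evidence-weighted impact scores could be aggregated across species for each driver; the species, drivers, scores and weights are all invented.

```python
# Toy illustration (not the authors' actual framework): aggregate
# evidence-weighted impact scores per driver across species.
from collections import defaultdict

# (species, driver, impact score in [-1 deleterious .. +1 beneficial], evidence quality in [0, 1])
records = [
    ("species A", "intensive agricultural management", -0.8, 0.9),
    ("species A", "climatic change",                   -0.2, 0.6),
    ("species B", "climatic change",                    0.6, 0.8),
    ("species C", "intensive agricultural management", -0.5, 0.7),
]

totals = defaultdict(lambda: [0.0, 0.0])  # driver -> [sum of impact*quality, sum of quality]
for _, driver, impact, quality in records:
    totals[driver][0] += impact * quality
    totals[driver][1] += quality

for driver, (weighted_sum, weight) in totals.items():
    print(f"{driver}: mean evidence-weighted impact = {weighted_sum / weight:+.2f}")
```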