
    Role of senescence marker p16INK4a measured in peripheral blood T-lymphocytes in predicting length of hospital stay after coronary artery bypass surgery in older adults

    Adults older than 65 years undergo more than 120,000 coronary artery bypass (CAB) procedures each year in the United States. Chronological age, though commonly used in prediction models of outcomes after CAB, does not by itself reflect variability in the aging process and thus the risk of complications in older adults. We performed a prospective study to evaluate the relationship of senescence marker p16INK4a expression in peripheral blood T-lymphocytes (p16 levels in PBTLs) with aging and with perioperative outcomes in older CAB patients. We included 55 patients aged 55 and older who underwent CAB at Johns Hopkins Hospital between September 1st, 2010 and March 25th, 2013. Demographic, clinical, and laboratory data were collected following the outline of the Society of Thoracic Surgeons data collection form, and p16 mRNA levels in PBTLs were measured using Taqman® qRT-PCR. Associations of p16 mRNA levels in PBTLs with length of hospital stay, frailty status, p16 protein levels in aortic and left internal mammary artery tissue, cerebral oxygen saturation, and augmentation index as a measure of vascular stiffness were assessed using regression analyses. Length of hospital stay was the primary outcome of interest, and major organ morbidity, mortality, and discharge to a skilled nursing facility were secondary outcomes. In a secondary analysis, we evaluated associations between p16 mRNA levels in PBTLs and interleukin-6 levels using regression analyses. The median age of enrolled patients was 63.5 years (range 56-81 years), and they were predominantly male (74.55%) and of Caucasian descent (85.45%). Median log2(p16 levels in PBTLs) was 4.71 (range 1.10-6.82). P16 levels in PBTLs were significantly associated with chronological age (mean difference 0.06 for each year increase in age, 95% CI 0.01-0.11) and interleukin-6 levels (mean difference 0.09 for each pg/ml increase in IL-6 levels, 95% CI 0.01-0.18). There were no significant associations with frailty status, augmentation index, cerebral oxygenation, or p16 protein levels in blood vessels. Increasing p16 levels in PBTLs did not predict length of stay in the hospital (HR 1.10, 95% CI 0.87-1.40) or intensive care unit (HR 1.02, 95% CI 0.79-1.32). Additional evaluation of p16 levels in PBTLs as a predictor of perioperative outcomes is required and should include additional markers of immune system aging as well as outcomes after CAB other than length of hospital stay.
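    A minimal sketch of the kind of analysis described above, in Python with statsmodels and lifelines; the input file and column names (log2_p16, age, il6, los_days, discharged) are hypothetical, and length of stay is treated as time to discharge:

        # Sketch only: illustrative column names, not the study's actual dataset.
        import pandas as pd
        import statsmodels.formula.api as smf
        from lifelines import CoxPHFitter

        df = pd.read_csv("cab_cohort.csv")  # hypothetical analytic file

        # Linear regression: association of log2(p16) with age and IL-6
        lm = smf.ols("log2_p16 ~ age + il6", data=df).fit()
        print(lm.summary())

        # Cox model: log2(p16) as a predictor of time to hospital discharge
        cph = CoxPHFitter()
        cph.fit(df[["log2_p16", "los_days", "discharged"]],
                duration_col="los_days", event_col="discharged")
        cph.print_summary()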

    Impact of Differential Attrition on the Association of Education With Cognitive Change Over 20 Years of Follow-up: The ARIC Neurocognitive Study

    Studies of long-term cognitive change should account for the potential effects of education on the outcome, since some studies have demonstrated an association of education with dementia risk. Evaluating cognitive change is preferable to evaluating cognitive performance at a single time point because it should be less susceptible to confounding. In this analysis of 14,020 persons from a US cohort study, the Atherosclerosis Risk in Communities (ARIC) Study, we measured change in performance on 3 cognitive tests over a 20-year period, from ages 48–67 years (1990–1992) through ages 70–89 years (2011–2013). Generalized estimating equations were used to evaluate the association between education and cognitive change in unweighted adjusted models, in models incorporating inverse probability of attrition weighting, and in models using cognitive scores imputed from the Telephone Interview for Cognitive Status for participants not examined in person. Education did not have a strong relationship with change in cognitive test performance, although the rate of decline was somewhat slower among persons with lower levels of education. Methods used to account for selective dropout only marginally changed these observed associations. Future studies of risk factors for cognitive impairment should focus on cognitive change, when possible, to allow for reduction of confounding by social or cultural factors.
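    A minimal sketch of a weighted GEE of the type described above, assuming a long-format table with hypothetical columns id, observed, age, education, prior_score, cog_score, and years_since_baseline:

        # Sketch only: column names and the attrition model are illustrative.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        long = pd.read_csv("aric_long.csv")  # hypothetical file, one row per participant-visit

        # Step 1: model the probability of remaining under observation at each visit
        drop = smf.logit("observed ~ age + education + prior_score", data=long).fit()
        long["ipw"] = long["observed"] / drop.predict(long)

        # Step 2: inverse-probability-weighted GEE for cognitive score over time
        seen = long[long["observed"] == 1]
        gee = sm.GEE.from_formula(
            "cog_score ~ education * years_since_baseline + age",
            groups="id",
            data=seen,
            weights=seen["ipw"],
            cov_struct=sm.cov_struct.Exchangeable(),
            family=sm.families.Gaussian(),
        ).fit()
        print(gee.summary())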

    Characterizing Sleep Structure Using the Hypnogram

    Objectives: Research on the effects of sleep-disordered breathing (SDB) on sleep structure has traditionally been based on composite sleep-stage summaries. The primary objective of this investigation was to demonstrate the utility of log-linear and multistate analysis of the sleep hypnogram in evaluating differences in nocturnal sleep structure in subjects with and without SDB. Methods: A community-based sample of middle-aged and older adults with and without SDB matched on age, sex, race, and body mass index was identified from the Sleep Heart Health Study. Sleep was assessed with home polysomnography and categorized into rapid eye movement (REM) and non-REM (NREM) sleep. Log-linear and multistate survival analysis models were used to quantify, respectively, the frequency of and the hazard rates for transitioning between wakefulness, NREM sleep, and REM sleep. Results: Whereas composite sleep-stage summaries were similar between the two groups, subjects with SDB had higher frequencies and hazard rates for transitioning between the three states. Specifically, log-linear models showed that subjects with SDB had more wake-to-NREM sleep and NREM sleep-to-wake transitions compared with subjects without SDB. Multistate survival models revealed that subjects with SDB transitioned more quickly from wake to NREM sleep and from NREM sleep to wake than did subjects without SDB. Conclusions: The description of sleep continuity with log-linear and multistate analysis of the sleep hypnogram suggests that such methods can identify differences in sleep structure that are not evident with conventional sleep-stage summaries. Detailed characterization of nocturnal sleep evolution with event history methods provides additional means for testing hypotheses on how specific conditions impact sleep continuity and whether sleep disruption is associated with adverse health outcomes.
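    A minimal sketch of the log-linear piece of this approach, using synthetic per-epoch hypnograms (stage labels W, NREM, REM); the multistate hazard models could be fitted analogously with survival software, and everything here is illustrative rather than the study's code:

        # Sketch only: synthetic hypnograms stand in for scored polysomnography.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        STAGES = ["W", "NREM", "REM"]

        def count_transitions(stages):
            """Count from->to stage transitions in one hypnogram (list of labels)."""
            counts = {}
            for a, b in zip(stages[:-1], stages[1:]):
                if a != b:
                    counts[(a, b)] = counts.get((a, b), 0) + 1
            return counts

        rows = []
        for i in range(40):
            sdb = i % 2  # alternate synthetic subjects with and without SDB
            stages = rng.choice(STAGES, size=960, p=[0.15, 0.65, 0.20]).tolist()
            for (a, b), n in count_transitions(stages).items():
                rows.append({"sdb": sdb, "stage_from": a, "stage_to": b, "n": n})
        trans = pd.DataFrame(rows)

        # Log-linear (Poisson) model of transition counts by group and transition type
        fit = smf.poisson("n ~ C(sdb) + C(stage_from) + C(stage_to)", data=trans).fit()
        print(fit.summary())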

    The stability of DSM personality disorders over twelve to eighteen years

    Background: Stability of personality disorders is assumed in most nomenclatures; however, the evidence for this is limited and inconsistent. The aim of this study is to investigate the stability of DSM-III personality disorders in a community sample of eastern Baltimore residents unselected for treatment. Methods: Two hundred ninety-four participants were examined on two occasions, twelve to eighteen years apart, by psychiatrists using the same standardized examination. All the DSM-III criteria for personality disorders were assessed. Item-response analysis was adapted into two approaches to assess the agreement between the personality measures on the two occasions. The first approach estimated stability in the underlying disorder, correcting for error in trait measurement, and the second estimated stability in the measured disorder, without correcting for item unreliability. Results: Five of the ten personality disorders exhibited moderate stability in individuals: antisocial, avoidant, borderline, histrionic, and schizotypal. The associated estimated ICCs for stability of the underlying disorder over time ranged between approximately 0.4 and 0.7-0.8. A sixth disorder, OCPD, exhibited appreciable stability, with an estimated ICC of approximately 0.2-0.3. Dependent, narcissistic, paranoid, and schizoid disorders were not demonstrably stable. Conclusions: The findings suggest that six of the DSM personality disorder constructs are themselves stable, but that specific traits within the DSM categories are of lesser importance than the constructs themselves and require additional specification.
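    A minimal sketch of one simple way to estimate a between-occasion ICC (the study itself used an item-response framework, so this variance-components version is only an analogue); the file and column names are hypothetical:

        # Sketch only: ICC from a random-intercept model, not the study's IRT approach.
        import pandas as pd
        import statsmodels.formula.api as smf

        long = pd.read_csv("pd_traits_long.csv")  # hypothetical: one row per participant x occasion

        mm = smf.mixedlm("trait_score ~ C(occasion)", data=long, groups="participant").fit()
        between = mm.cov_re.iloc[0, 0]  # random-intercept (between-person) variance
        within = mm.scale               # residual (within-person) variance
        icc = between / (between + within)
        print(f"ICC = {icc:.2f}")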

    Dry eye and dry mouth in the elderly: A population-based assessment

    Background: Symptoms of dry eye and dry mouth are common in the elderly and are often debilitating. Previous research on small populations has been inconsistent regarding the contribution to sicca symptoms of autoimmune markers, medication use, and other factors. The objective of this study was to determine the population prevalence of symptoms of dry eye and dry mouth and to evaluate possible risk factors. Methods: This is a population-based study of 2481 individuals, aged 65 to 84 years, residing in Salisbury, Md, and identified from the Health Care Financing Administration Medicare database. The main outcome measures included information on sicca symptoms, medical history, medication use, and joint examination results collected in a standardized manner. Autoimmune status was assessed in 1200 individuals by measuring antinuclear antibody, rheumatoid factor, and autoantibodies to the soluble nuclear antigens Ro/SS-A and La/SS-B by double immunodiffusion. Results: Approximately 27% of the population reported dry eye or dry mouth symptoms to be present often or all the time, and 4.4% reported both. The prevalence of dry mouth (but not dry eye) symptoms increased with age, female sex, and white race. No association of sicca symptoms was found with rheumatoid arthritis, smoking, alcohol consumption, reproductive hormonal status, or the presence of autoantibodies. A strong, dose-response relationship was observed between sicca symptoms and the use of certain medication classes. The proportion of the population prevalence of sicca symptoms attributable to the use of drying medications was estimated at 62% for dry eye and dry mouth symptoms together and 38% for dry eye or dry mouth symptoms. Conclusions: Sicca symptoms are common in the elderly, and medication side effects appear to be a major underlying factor. Our results do not indicate an association between autoimmune status and sicca symptoms and do not support immunologic testing in persons with sicca symptoms in the absence of other important systemic features.
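    The attributable-proportion estimates above are in the spirit of a population attributable fraction; a minimal sketch with purely illustrative numbers:

        # Sketch only: Levin's population attributable fraction with made-up inputs.
        def population_attributable_fraction(exposure_prevalence, relative_risk):
            """PAF = p*(RR - 1) / (1 + p*(RR - 1))."""
            excess = exposure_prevalence * (relative_risk - 1.0)
            return excess / (1.0 + excess)

        # Illustrative values, not the study's estimates:
        print(population_attributable_fraction(exposure_prevalence=0.5, relative_risk=3.0))  # 0.5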

    Effect of intravenous iron use on hospitalizations in patients undergoing hemodialysis: a comparative effectiveness analysis from the DEcIDE-ESRD study

    Intravenous iron use in hemodialysis patients has greatly increased over the last decade, despite limited studies on the safety of iron. We studied the association of receipt of intravenous iron with hospitalizations in an incident cohort of hemodialysis patients. We examined 9544 patients from Dialysis Clinic, Inc. (DCI). We ascertained intravenous iron use from the DCI electronic medical record and USRDS data files, and hospitalizations through Medicare claims. We examined the association between iron exposure accumulated over 1-, 3- or 6-month time windows and incident hospitalizations in the follow-up period using marginal structural models accounting for time-dependent confounders. We performed sensitivity analyses including recurrent events models for multiple hospitalizations and models for the combined outcome of hospitalization and death. There were 22,347 hospitalizations during a median follow-up of 23 months. A higher cumulative dose of intravenous iron was not associated with all-cause, cardiovascular or infectious hospitalizations [HR 0.97 (95% CI: 0.77-1.22) for all-cause hospitalizations comparing >2100 mg versus 0-900 mg of iron over 6 months]. Findings were similar in models examining the risk of hospitalizations in 1- and 3-month windows [HR 0.88 (95% CI: 0.79-0.99) and HR 0.88 (95% CI: 0.74-1.03), respectively] or the risk of the combined outcome of hospitalization and death in the 6-month window [HR 0.98 (95% CI: 0.78-1.23)]. A higher cumulative dose of intravenous iron may not be associated with increased risk of hospitalizations in hemodialysis patients. While clinical trials are needed, employing higher iron doses to reduce erythropoiesis-stimulating agent use does not appear to increase morbidity in routine clinical care.
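    A minimal sketch of the weighting idea behind a marginal structural model, reduced to a single baseline treatment weight and a weighted Cox model (the study used time-updated exposure windows and time-dependent confounders); the file and column names are hypothetical:

        # Sketch only: baseline IPTW + weighted Cox, a simplification of the study's MSM.
        import pandas as pd
        import statsmodels.formula.api as smf
        from lifelines import CoxPHFitter

        df = pd.read_csv("dialysis_cohort.csv")  # hypothetical analytic file

        # Step 1: probability of high-dose intravenous iron given measured confounders
        ps = smf.logit("high_iron ~ age + hemoglobin + ferritin + epo_dose", data=df).fit()
        p = ps.predict(df)
        df["iptw"] = df["high_iron"] / p + (1 - df["high_iron"]) / (1 - p)

        # Step 2: weighted Cox model for time to first hospitalization
        cph = CoxPHFitter()
        cph.fit(df[["high_iron", "time_to_hosp", "hospitalized", "iptw"]],
                duration_col="time_to_hosp", event_col="hospitalized",
                weights_col="iptw", robust=True)
        cph.print_summary()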

    An instrumental variable approach finds no associated harm or benefit with early dialysis initiation in the United States

    The estimated glomerular filtration rate (eGFR) at dialysis initiation has been rising. Observational studies suggest harm, but may be confounded by unmeasured factors. As instrumental variable methods may be less biased, we performed a retrospective cohort study of 310,932 patients who started dialysis between 2006 and 2008 and were registered in the United States Renal Data System in order to describe geographic variation in eGFR at dialysis initiation and determine its association with mortality. Patients were grouped into 804 health service areas (HSAs) by zip code. Individual eGFR at dialysis initiation averaged 10.8 ml/min per 1.73 m² but varied geographically. Only 11% of the variation in mean HSA-level eGFR at dialysis initiation was accounted for by patient characteristics. We calculated the demographic-adjusted mean eGFR at dialysis initiation in the HSAs using the 2006 and 2007 incident cohorts as our instrument and estimated the association between individual eGFR at dialysis initiation and mortality in the 2008 incident cohort using the two-stage residual inclusion method. Among 89,547 patients starting dialysis in 2008 with eGFR 5-20 ml/min per 1.73 m², eGFR at initiation was not associated with mortality over a median of 15.5 months (hazard ratio 1.025 per 1 ml/min per 1.73 m² for eGFR 5-14 ml/min per 1.73 m², and 0.973 per 1 ml/min per 1.73 m² for eGFR 14-20 ml/min per 1.73 m²). Thus, there was no associated harm or benefit with early dialysis initiation in the United States.
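    A minimal sketch of two-stage residual inclusion as described above; the instrument and all column names are hypothetical stand-ins (covariates assumed numeric):

        # Sketch only: 2SRI with an area-level instrument, not the study's actual code.
        import pandas as pd
        import statsmodels.formula.api as smf
        from lifelines import CoxPHFitter

        df = pd.read_csv("usrds_2008.csv")  # hypothetical analytic file

        # Stage 1: regress individual eGFR at initiation on the HSA-level instrument
        stage1 = smf.ols("egfr_start ~ hsa_mean_egfr + age + diabetes", data=df).fit()
        df["resid"] = df["egfr_start"] - stage1.predict(df)

        # Stage 2: Cox model for mortality including eGFR and the stage-1 residual,
        # which absorbs unmeasured confounding correlated with eGFR at initiation
        cph = CoxPHFitter()
        cph.fit(df[["egfr_start", "resid", "age", "diabetes",
                    "followup_months", "died"]],
                duration_col="followup_months", event_col="died")
        cph.print_summary()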