
    High risk prescribing in older adults: Prevalence, clinical and economic implications and potential for intervention at the population level

    Background: High risk prescribing can compromise independent wellbeing and quality of life in older adults. The aims of this project are to determine the prevalence, risk factors, clinical consequences, and costs of high risk prescribing, and to assess the impact of interventions on high risk prescribing in older people. Methods: The proposed project will use data from the 45 and Up Study, a large-scale cohort of 267,153 men and women aged 45 and over recruited during 2006-2009 from the state of New South Wales, Australia, linked to a range of administrative health datasets. High risk prescribing will be assessed using three indicators: polypharmacy (use of five or more medicines); the Beers Criteria (an explicit measure of potentially inappropriate medication use); and the Drug Burden Index (a pharmacologic, dose-dependent measure of cumulative exposure to anticholinergic and sedative medicines). Individual risk factors from the 45 and Up Study questionnaire, and health system characteristics from the linked health datasets, that are associated with the likelihood of high risk prescribing will be identified. The main outcome measures will include hospitalisation (first admission to hospital, total days in hospital, cause-specific hospitalisation); admission to institutionalised care; all-cause mortality; and, where possible, cause-specific mortality. Economic costs to the health care system and the implications of high risk prescribing will also be investigated. In addition, changes in high risk prescribing will be evaluated in relation to certain routine medicines-related interventions. The statistical analysis will use standard pharmaco-epidemiological methods, including descriptive analysis and univariate and multivariate regression, controlling for relevant confounding factors using a number of different approaches. Discussion: The availability of large-scale data is useful for identifying opportunities to improve prescribing and health in older adults. The size of the 45 and Up Study, along with its linkage to health databases, provides an important opportunity to investigate the relationship between high risk prescribing and adverse outcomes in a real-world population of older adults. © 2013 Gnjidic et al.; licensee BioMed Central Ltd
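
    The Drug Burden Index indicator above has a simple closed form: it is usually computed as the sum of D / (D + δ) over a person's anticholinergic and sedative medicines, where D is the daily dose taken and δ is the minimum recommended daily dose. The Python sketch below computes this alongside the polypharmacy flag; the `Medicine` class and the example regimen are illustrative assumptions, not part of the study protocol.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Medicine:
        name: str
        daily_dose: float            # D: dose actually taken per day
        min_recommended_dose: float  # delta: minimum recommended daily dose
        anticholinergic_or_sedative: bool

    def is_polypharmacy(medicines: list[Medicine]) -> bool:
        """Polypharmacy indicator: use of five or more medicines."""
        return len(medicines) >= 5

    def drug_burden_index(medicines: list[Medicine]) -> float:
        """Cumulative anticholinergic/sedative exposure: sum of D / (D + delta)."""
        return sum(
            m.daily_dose / (m.daily_dose + m.min_recommended_dose)
            for m in medicines
            if m.anticholinergic_or_sedative
        )

    # Illustrative regimen (doses in mg/day; values are assumptions):
    regimen = [
        Medicine("diazepam", 5.0, 5.0, True),     # contributes 0.50
        Medicine("oxybutynin", 10.0, 5.0, True),  # contributes ~0.67
        Medicine("metformin", 1000.0, 500.0, False),
    ]
    print(is_polypharmacy(regimen))    # False (3 medicines)
    print(drug_burden_index(regimen))  # ~1.17
    ```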

    Chewing function, general health and the dentition of older Australian men: The Concord Health and Ageing in Men Project

    Objectives: To describe the associations of chewing function with oral health and selected general health characteristics in a population of community‐dwelling older Australian men. Methods: Analysis of cross‐sectional data from the 4th wave of the Concord Health and Ageing in Men Project cohort of 614 participants aged 78 years and over, 524 of whom were dentate. Chewing capacity was assessed using three main indicators: capacity to chew eleven food items ranging from boiled eggs through to fresh carrots and nuts; discomfort when eating; and interruption of meals. Associations with chewing were tested for dentate vs edentate participants, number of teeth present, active dental disease and key general health conditions such as disability, comorbidities and cognitive status. Log binomial regression models were adjusted for age, country of birth, income, education and marital status, and prevalence ratios with 95% confidence intervals were estimated. Results: Twenty‐one per cent of participants could not eat hard foods, 23.1% reported discomfort when eating, and 8.8% reported interrupted meals. Dentate men were three times as likely as edentulous men to be able to chew firm meat (95% CI: 2.0‐4.9); edentate men were 2.5 times as likely to report discomfort when eating (95% CI: 1.5‐4.3) and 1.9 times as likely to report having meals interrupted (95% CI: 1.4‐2.6). Chewing/eating difficulties were associated with both dental status (number of teeth, active dental caries) and self‐rated dental health. Fewer than 20 teeth and the presence of active coronal or root decay were associated with more discomfort when eating. General health conditions associated with chewing function included disability, physical activity, comorbidities, cognitive status and depression. Older men's self‐rated oral health and general health perceptions were also associated with aspects of chewing function. Poorer self‐reported oral health was associated with inability to eat hard foods (95% CI: 1.3‐2.7) and with discomfort when eating (95% CI: 2.6‐5.1), while poorer self‐reported general health was associated with discomfort when eating (95% CI: 1.2‐2.2). Conclusions: Falling rates of edentulism may lead to improved chewing and eating function in older men. Maintaining 20 or more natural teeth, and preventing active coronal and root caries, should enhance chewing function and promote self‐reported health and oral health. Lower capacity to chew hard foods and higher reported discomfort when eating are associated with comorbidity in older Australian men
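
    The log binomial regression described above can be expressed directly as a generalised linear model with a binomial family and log link, whose exponentiated coefficients are prevalence ratios. A minimal sketch with statsmodels follows; the variable names and simulated data are assumptions for illustration, not the CHAMP dataset.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    edentate = rng.integers(0, 2, n)
    age = rng.integers(78, 95, n)
    # Simulate a low-prevalence outcome so the log-binomial model converges;
    # the true prevalence ratio for edentate is exp(0.6) ~ 1.8.
    p = np.exp(-2.0 + 0.6 * edentate + 0.01 * (age - 78))
    df = pd.DataFrame({"discomfort_eating": rng.binomial(1, p),
                       "edentate": edentate, "age": age})

    model = smf.glm(
        "discomfort_eating ~ edentate + age",
        data=df,
        family=sm.families.Binomial(link=sm.families.links.Log()),
    ).fit()

    print(np.exp(model.params))      # prevalence ratios
    print(np.exp(model.conf_int()))  # 95% confidence intervals
    ```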

    Hemorrhage-Adjusted Iron Requirements, Hematinics and Hepcidin Define Hereditary Hemorrhagic Telangiectasia as a Model of Hemorrhagic Iron Deficiency

    BACKGROUND: Iron deficiency anemia remains a major global health problem. Higher iron demands provide the potential for a targeted preventative approach before anemia develops. The primary study objective was to develop and validate a metric that stratifies recommended dietary iron intake to compensate for patient-specific non-menstrual hemorrhagic losses. The secondary objective was to examine whether iron deficiency can be attributed to under-replacement of epistaxis (nosebleed) hemorrhagic iron losses in hereditary hemorrhagic telangiectasia (HHT). METHODOLOGY/PRINCIPAL FINDINGS: The hemorrhage-adjusted iron requirement (HAIR) sums the recommended dietary allowance and the iron required to replace additional quantified hemorrhagic losses, based on the pre-menopausal increment that compensates for menstrual losses (formula provided). In a study population of 50 HHT patients completing concurrent dietary and nosebleed questionnaires, 43/50 (86%) met their recommended dietary allowance, but only 10/50 (20%) met their HAIR. Higher HAIR was a powerful predictor of lower hemoglobin (p = 0.009), lower mean corpuscular hemoglobin content (p < 0.001), lower log-transformed serum iron (p = 0.009), and higher log-transformed red cell distribution width (p < 0.001). There was no evidence of generalised abnormalities in iron handling: ferritin and ferritin squared explained 60% of the hepcidin variance (p < 0.001), and the mean hepcidin:ferritin ratio was similar to reported controls. Iron supplement use increased the proportion of individuals meeting their HAIR and blunted associations between HAIR and hematinic indices. Once adjusted for supplement use, however, reciprocal relationships between HAIR and hemoglobin/serum iron persisted. Of 568 individuals using iron tablets, most reported problems completing the course. For patients with hereditary hemorrhagic telangiectasia, persistent anemia was reported three times more frequently if iron tablets caused diarrhea or needed to be stopped. CONCLUSIONS/SIGNIFICANCE: HAIR values, providing an indication of individuals' iron requirements, may be a useful tool in the prevention, assessment and management of iron deficiency. Iron deficiency in HHT can be explained by under-replacement of nosebleed hemorrhagic iron losses
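
    As a way to make the HAIR construction concrete, the sketch below scales a patient's quantified non-menstrual blood loss by the dietary-iron increment that the pre-menopausal RDA allows for menstrual loss, and adds it to the base RDA. The exact constants in the published formula may differ; the RDA values and the reference menstrual loss used here are stated assumptions.

    ```python
    RDA_MG_PER_DAY = 8.0                 # assumed RDA, men/post-menopausal women
    PREMENOPAUSAL_RDA_MG_PER_DAY = 18.0  # assumed RDA, pre-menopausal women
    MENSTRUAL_INCREMENT_MG = PREMENOPAUSAL_RDA_MG_PER_DAY - RDA_MG_PER_DAY
    TYPICAL_MENSTRUAL_LOSS_ML_PER_MONTH = 35.0  # assumed reference blood loss

    def hair_mg_per_day(base_rda_mg: float, hemorrhage_ml_per_month: float) -> float:
        """Hemorrhage-adjusted iron requirement: RDA plus the dietary iron
        needed to replace quantified non-menstrual losses, scaled from the
        pre-menopausal increment (mg of dietary iron per mL of monthly loss)."""
        mg_per_ml = MENSTRUAL_INCREMENT_MG / TYPICAL_MENSTRUAL_LOSS_ML_PER_MONTH
        return base_rda_mg + mg_per_ml * hemorrhage_ml_per_month

    # e.g. an HHT patient losing ~100 mL/month to epistaxis:
    print(round(hair_mg_per_day(RDA_MG_PER_DAY, 100.0), 1))  # ~36.6 mg/day
    ```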

    Iron Accumulation with Age, Oxidative Stress and Functional Decline

    Identification of biological mediators in sarcopenia is pertinent to the development of targeted interventions to alleviate this condition. Iron is recognized as a potent pro-oxidant and a catalyst for the formation of reactive oxygen species in biological systems. It is well accepted that iron accumulates with senescence in several organs, but little is known about iron accumulation in muscle and how it may affect muscle function. In addition, it is unclear whether interventions that reduce the age-related loss of muscle quality, such as calorie restriction, affect iron accumulation. We investigated non-heme iron concentration, oxidative stress to nucleic acids in gastrocnemius muscle, and key indices of sarcopenia (muscle mass and grip strength) in male Fischer 344 × Brown Norway rats fed ad libitum (AL) or a calorie-restricted (CR) diet (60% of ad libitum food intake, starting at 4 months of age) at 8, 18, 29 and 37 months of age. Total non-heme iron levels in the gastrocnemius muscle of AL rats increased progressively with age. Between 29 and 37 months of age, the non-heme iron concentration increased by approximately 200% in AL-fed rats. Most importantly, the levels of oxidized RNA in gastrocnemius muscle of AL rats were significantly increased as well. The striking age-associated increase in non-heme iron and oxidized RNA levels and the decrease in sarcopenia indices were all attenuated in the CR rats. These findings strongly suggest that age-related iron accumulation in muscle contributes to increased oxidative damage and sarcopenia, and that CR effectively attenuates these negative effects

    Evidence of accelerated ageing in clinical drug addiction from immune, hepatic and metabolic biomarkers

    Background: Drug addiction is associated with significant disease and death, but its impact on the ageing process has not been considered. The recent demonstration that many of the items available in routine clinical pathology have applicability as biomarkers of the ageing process implies that routine clinical laboratory parameters would be useful for an initial investigation of this possibility. Methods: 12,093 clinical laboratory results from 1995-2006 were reviewed. To make the age ranges of the medical and addicted groups comparable, the age range was restricted to 15-45 years. Results: 739 drug-addicted (DA) and 5,834 general medical (GM) age-matched blood samples were compared. Significant elevation of immune parameters was noted in C-reactive protein, erythrocyte sedimentation rate, total lymphocyte count, serum globulins and the globulin:albumin ratio (P < 0.01). Alanine aminotransferase, creatinine, urea, and insulin-like growth factor-1 were also significantly higher (P < 0.01) in the DA group. Albumin, body mass index and dehydroepiandrosterone sulphate were unchanged, and cholesterol was lower (all P < 0.05). Conclusion: These data demonstrate for the first time that addiction is associated with an altered profile of common biomarkers of ageing, raising the possibility that the ageing process may be altered in this group. Infective and immune processes may be centrally involved. These findings suggest that addiction forms an interesting model for further examining the contribution of immune suppression and hyperstimulation to the ageing process
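
    The comparison described in the Methods reduces to restricting both groups to a shared age window and testing each laboratory parameter between them. A small illustrative sketch follows; the column names and simulated values are assumptions, not the clinical dataset, and a non-parametric test is used here since routine pathology values are typically skewed.

    ```python
    import numpy as np
    import pandas as pd
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(1)
    n = 2000
    df = pd.DataFrame({
        "group": rng.choice(["DA", "GM"], size=n, p=[0.15, 0.85]),
        "age": rng.uniform(10, 80, size=n),
        "crp": rng.lognormal(1.0, 0.8, size=n),  # C-reactive protein, mg/L
        "esr": rng.lognormal(2.0, 0.6, size=n),  # erythrocyte sedimentation rate, mm/h
    })

    # Make the age ranges of the two groups comparable, as in the study.
    df = df[df["age"].between(15, 45)]

    for marker in ["crp", "esr"]:
        da = df.loc[df["group"] == "DA", marker]
        gm = df.loc[df["group"] == "GM", marker]
        stat, p = mannwhitneyu(da, gm, alternative="two-sided")
        print(f"{marker}: DA median={da.median():.2f}, "
              f"GM median={gm.median():.2f}, p={p:.3f}")
    ```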

    A stable pattern of EEG spectral coherence distinguishes children with autism from neuro-typical controls - a large case control study

    Background: The autism rate has recently increased to 1 in 100 children. Genetic studies demonstrate poorly understood complexity. Environmental factors apparently also play a role. Magnetic resonance imaging (MRI) studies demonstrate increased brain sizes and altered connectivity. Electroencephalogram (EEG) coherence studies confirm connectivity changes. However, genetic-, MRI- and/or EEG-based diagnostic tests are not yet available. The varied study results likely reflect methodological and population differences, small samples and, for EEG, lack of attention to group-specific artifact. Methods: Of the 1,304 subjects who participated in this study, with ages ranging from 1 to 18 years old and assessed with comparable EEG studies, 463 children were diagnosed with autism spectrum disorder (ASD); 571 children were neuro-typical controls (C). After artifact management, principal components analysis (PCA) identified EEG spectral coherence factors with corresponding loading patterns. The 2- to 12-year-old subsample consisted of 430 ASD- and 554 C-group subjects (n = 984). Discriminant function analysis (DFA) determined the spectral coherence factors' discrimination success for the two groups. Loading patterns on the DFA-selected coherence factors described ASD-specific coherence differences when compared to controls. Results: Total sample PCA of coherence data identified 40 factors which explained 50.8% of the total population variance. For the 2- to 12-year-olds, the 40 factors showed highly significant group differences (P < 0.0001). Ten randomly generated split-half replications demonstrated high average classification success (C, 88.5%; ASD, 86.0%). Still higher success was obtained in the more restricted age sub-samples using the jackknifing technique: 2- to 4-year-olds (C, 90.6%; ASD, 98.1%); 4- to 6-year-olds (C, 90.9%; ASD, 99.1%); and 6- to 12-year-olds (C, 98.7%; ASD, 93.9%). Coherence loadings demonstrated reduced short-distance and reduced, as well as increased, long-distance coherences for the ASD groups when compared to the controls. Average spectral loading per factor was wide (10.1 Hz). Conclusions: Classification success suggests a stable coherence loading pattern that differentiates ASD- from C-group subjects. This might constitute an EEG coherence-based phenotype of childhood autism. The predominantly reduced short-distance coherences may indicate poor local network function. The increased long-distance coherences may represent compensatory processes or reduced neural pruning. The wide average spectral range of factor loadings may suggest over-damped neural networks.
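
    The PCA-plus-DFA pipeline and split-half replication described above map naturally onto standard tooling. The sketch below uses scikit-learn's PCA and linear discriminant analysis (a common stand-in for classical DFA) on randomly generated data with assumed dimensions; it is a shape-level illustration, not a reproduction of the study's coherence preprocessing.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import StratifiedShuffleSplit
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(42)
    n_subjects, n_coherence_vars = 984, 3000   # assumed subjects x coherence variables
    X = rng.normal(size=(n_subjects, n_coherence_vars))
    y = rng.integers(0, 2, size=n_subjects)    # 0 = control (C), 1 = ASD

    # Reduce coherence variables to 40 factors, then discriminate the groups.
    pipeline = make_pipeline(PCA(n_components=40), LinearDiscriminantAnalysis())

    # Ten split-half replications: fit on one random half, classify the other.
    splitter = StratifiedShuffleSplit(n_splits=10, test_size=0.5, random_state=0)
    scores = [
        pipeline.fit(X[train], y[train]).score(X[test], y[test])
        for train, test in splitter.split(X, y)
    ]
    print(f"mean split-half classification success: {np.mean(scores):.3f}")
    ```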