61 research outputs found

    Quinine exposure and the risk of acute kidney injury: a population-based observational study of older people

    Objectives: To establish and quantify any observable association between exposure to community prescriptions for quinine and acute kidney injury (AKI) events in a population of older adults. Design: Two observational studies using the same dataset: a retrospective longitudinal cohort study and a self-controlled case series (SCCS). Setting: NHS health board in Scotland. Participants: Older adults (60+ years) who received quinine prescriptions in Tayside, Scotland, between January 2004 and December 2015. The first study included 12,744 individuals; the SCCS cohort included 5,907 people with quinine exposure and at least one AKI event. Main outcome measures: In the first study, multivariable logistic regression was used to calculate odds ratios (ORs) for AKI, comparing episodes with and without recent quinine exposure after adjustment for demographics, comorbidities and concomitant medications. The SCCS study divided follow-up for each individual into periods ‘on’ and ‘off’ quinine, calculating incidence rate ratios (IRRs) for AKI adjusted for age. Results: During the study period, 273,596 prescriptions for quinine were dispensed in Tayside. A total of 13,616 AKI events occurred during follow-up (crude incidence 12.5 per 100 person-years). In the first study, exposure to quinine before an episode of care was significantly associated with an increased probability of AKI (adjusted OR = 1.27, 95% confidence interval (CI) 1.21–1.33). In the SCCS study, exposure to quinine was associated with an increased relative incidence of AKI compared with unexposed periods (IRR = 1.20, 95% CI 1.15–1.26), with the greatest risk observed within 30 days following quinine initiation (IRR = 1.48, 95% CI 1.35–1.61). Conclusion: Community prescriptions for quinine in an older adult population are associated with an increased risk of AKI.
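
    The core of the SCCS analysis above is a within-person comparison of AKI rates during ‘on quinine’ and ‘off quinine’ periods, summarised as an incidence rate ratio. As a rough illustration of how an age-adjusted IRR can be estimated from aggregated events and person-time, here is a minimal Python sketch using a Poisson model with a log person-time offset; all column names and counts are hypothetical, and a full SCCS would additionally condition on the individual rather than pool person-time across people.

```python
# Minimal sketch (not the study's code): age-adjusted incidence rate ratio (IRR)
# from aggregated AKI events and person-time in 'on quinine' vs 'off quinine'
# periods, via a Poisson GLM with a log person-time offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical aggregated data (illustrative counts only).
df = pd.DataFrame({
    "exposed":      [0, 1, 0, 1],                      # 0 = off quinine, 1 = on quinine
    "age_band":     ["60-74", "60-74", "75+", "75+"],
    "events":       [420, 95, 610, 160],               # AKI events
    "person_years": [5200.0, 900.0, 4100.0, 1050.0],   # person-time at risk
})

model = smf.glm(
    "events ~ exposed + C(age_band)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

irr = np.exp(model.params["exposed"])                  # IRR for quinine exposure
ci_low, ci_high = np.exp(model.conf_int().loc["exposed"])
print(f"Age-adjusted IRR = {irr:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```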

    The Relationship between AKI and CKD in Patients with Type 2 Diabetes: An Observational Cohort Study

    Background: There are few observational studies evaluating the risk of AKI in people with type 2 diabetes, and even fewer simultaneously investigating AKI and CKD in this population. This limits understanding of the interplay between AKI and CKD in people with type 2 diabetes compared with the nondiabetic population. Methods: In this retrospective cohort study of participants with or without type 2 diabetes, we used electronic healthcare records to evaluate rates of AKI, and a range of statistical methods to determine their relationship to CKD status and further decline in renal function. Results: We followed a cohort of 16,700 participants (9417 with type 2 diabetes and 7283 controls without diabetes) for a median of 8.2 years. Those with diabetes were more likely than controls to develop AKI (48.6% versus 17.2%, respectively) and to have preexisting CKD or CKD that developed during follow-up (46.3% versus 17.2%, respectively). In the absence of CKD, the AKI rate among people with diabetes was nearly five times that of controls (121.5 versus 24.6 per 1000 person-years). Among participants with CKD, the AKI rate in people with diabetes was more than twice that of controls (384.8 versus 180.0 per 1000 person-years after the CKD diagnostic date, and 109.3 versus 47.4 per 1000 person-years before CKD onset in those developing CKD after recruitment). The decline in eGFR slope before AKI episodes was steeper in people with diabetes than in controls. After AKI episodes, the decline in eGFR slope became steeper in people without diabetes, but not among those with diabetes and preexisting CKD. Conclusions: Patients with diabetes have significantly higher rates of AKI compared with patients without diabetes, and this remains true for individuals with preexisting CKD. (On behalf of the BEAt-DKD Consortium.)
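
    To make the reported person-time rates concrete, the sketch below shows how a crude AKI rate per 1000 person-years by diabetes status could be computed from per-participant follow-up records; the column names and numbers are hypothetical, not taken from the study.

```python
# Minimal sketch (hypothetical data): crude AKI incidence per 1000 person-years
# by diabetes status, i.e. total events divided by total follow-up time.
import pandas as pd

records = pd.DataFrame({
    "diabetes":        [True, True, False, False, True],
    "aki_events":      [2, 0, 1, 0, 1],          # AKI episodes during follow-up
    "follow_up_years": [8.2, 6.5, 9.0, 7.1, 5.3],
})

agg = records.groupby("diabetes")[["aki_events", "follow_up_years"]].sum()
agg["rate_per_1000_py"] = 1000 * agg["aki_events"] / agg["follow_up_years"]
print(agg["rate_per_1000_py"])
```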

    Subcellular Epithelial HMGB1 Expression Is Associated with Colorectal Neoplastic Progression, Male Sex, Mismatch Repair Protein Expression, Lymph Node Positivity, and an 'Immune Cold' Phenotype Associated with Poor Survival.

    New treatment targets are needed for colorectal cancer (CRC). We define expression of High Mobility Group Box 1 (HMGB1) protein throughout colorectal neoplastic progression and examine the biological consequences of aberrant expression. HMGB1 is a ubiquitously expressed nuclear protein that shuttles to the cytoplasm under cellular stress and acts as a cytokine when secreted, influencing cellular responses. A total of 846 human tissue samples were retrieved; 6242 immunohistochemically stained sections were reviewed. Subcellular epithelial HMGB1 expression was assessed in a CRC tissue microarray (n = 650), normal colonic epithelium (n = 75), adenomatous polyps (n = 52), and CRC polyps (CaP, n = 69). Stromal lymphocyte phenotype was assessed in the CRC microarray and a subgroup of CaP. Normal colonic epithelium shows strong nuclear and absent cytoplasmic HMGB1. With progression to CRC, there is an emergence of strong cytoplasmic HMGB1 (p < 0.001), most pronounced at the leading cancer edge within CaP (p < 0.001), and a reduction in nuclear HMGB1 (p < 0.001). In CRC, absent nuclear HMGB1 is associated with mismatch repair protein expression (p = 0.001). Stronger cytoplasmic HMGB1 is associated with lymph node positivity (p < 0.001) and male sex (p = 0.009). Stronger nuclear (p = 0.011) and cytoplasmic (p = 0.002) HMGB1 is associated with greater CD4+ T-cell density; stronger nuclear HMGB1 is also associated with greater FOXP3+ (p < 0.001) and ICOS+ (p = 0.018) lymphocyte density and with reduced CD8+ T-cell density (p = 0.022). HMGB1 does not directly impact survival but is associated with an 'immune cold' tumour microenvironment, which is in turn associated with poor survival (p < 0.001). HMGB1 may represent a new treatment target for CRC.
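
    The associations reported above (for example, stronger cytoplasmic HMGB1 with lymph node positivity) are tests of association between categorical immunohistochemistry scores and clinical features. As a hedged illustration only, the sketch below runs a chi-square test on a hypothetical contingency table; the counts are invented and the study may have used a different test.

```python
# Minimal sketch (hypothetical counts): chi-square test of association between
# cytoplasmic HMGB1 staining intensity and lymph node status.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: cytoplasmic HMGB1 intensity (absent/weak, moderate, strong)
# Columns: lymph node negative, lymph node positive
table = np.array([
    [120,  40],
    [150,  90],
    [ 80, 110],
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4g}")
```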

    Understanding health-care outcomes of older people with cognitive impairment and/or dementia admitted to hospital: a mixed-methods study

    BACKGROUND: Cognitive impairment is common in older people admitted to hospital, but previous research has focused on single conditions. OBJECTIVE: This project sits in phase 0/1 of the Medical Research Council Framework for the Development and Evaluation of Complex Interventions. It aims to develop an understanding of current health-care outcomes. This will be used in the future development of a multidomain intervention for people with confusion (dementia and cognitive impairment) in general hospitals. The research was conducted from January 2015 to June 2018 and used data from people admitted between 2012 and 2013. DESIGN: For the review of outcomes, the systematic review identified peer-reviewed quantitative epidemiology measuring prevalence and associations with outcomes. Screening for duplication and relevance was followed by full-text review, quality assessment and a narrative review (141 papers). A survey sought opinion on the key outcomes for people with dementia and/or confusion and their carers in the acute hospital (n = 78). For the analysis of outcomes including cost, the prospective cohort study was in a medical admissions unit in an acute hospital in one Scottish health board covering 10% of the Scottish population. The participants (n = 6724) were older people (aged ≥ 65 years) with or without a cognitive spectrum disorder who were admitted as medical emergencies between January 2012 and December 2013 and who underwent a structured nurse assessment. ‘Cognitive spectrum disorder’ was defined as any combination of delirium, known dementia or an Abbreviated Mental Test score of < 8 out of 10 points. The main outcome measures were living at home 30 days after discharge, mortality within 2 years of admission, length of stay, re-admission within 2 years of admission and cost. DATA SOURCES: Scottish Morbidity Records 01 was linked to the Older Persons Routine Acute Assessment data set. RESULTS: In the systematic review, methodological heterogeneity, especially concerning diagnostic criteria, means that there is significant overlap in conditions of patients presenting to general hospitals with confusion. Patients and their families expect that patients are discharged in the same or a better condition than they were in on admission or, failing that, that they have a satisfactory experience of their admission. Cognitive spectrum disorders were present in more than one-third of patients aged ≥ 65 years, and in over half of those aged ≥ 85 years. Outcomes were worse in those patients with cognitive spectrum disorders than in those without: length of stay 25.0 vs. 11.8 days, 30-day mortality 13.6% vs. 9.0%, 1-year mortality 40.0% vs. 26.0%, 1-year mortality or re-admission 62.4% vs. 51.5%, respectively (all p < 0.01). There was relatively little difference by cognitive spectrum disorder type; for example, the presence of any cognitive spectrum disorder was associated with an increased mortality over the entire period of follow-up, but with different temporal patterns depending on the type of cognitive spectrum disorder. The cost of admission was higher for those with cognitive spectrum disorders, but the average daily cost was lower. LIMITATIONS: A lack of diagnosis and/or standardisation of diagnosis for dementia and/or delirium was a limitation for the systematic review, the quantitative study and the economic study. The economic study was limited to in-hospital costs as data for social or informal care costs were unavailable. 
The survey was conducted online, which limited its reach among older carers and people with cognitive spectrum disorders. CONCLUSIONS: Cognitive spectrum disorders are common in older inpatients and are associated with considerably worse health-care outcomes, with significant overlap between individual cognitive spectrum disorders. This suggests the need for health-care systems to systematically identify and develop care pathways for older people with cognitive spectrum disorders, and to avoid focusing only on condition-specific pathways. FUTURE WORK: Development and evaluation of a multidomain intervention for the management of patients with cognitive spectrum disorders in hospital. STUDY REGISTRATION: This study is registered as PROSPERO CRD42015024492. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Services and Delivery Research programme and will be published in full in Health Services and Delivery Research; Vol. 9, No. 8. See the NIHR Journals Library website for further project information.
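
    The finding that admissions with a cognitive spectrum disorder cost more in total but less per day follows directly from the much longer length of stay; the toy calculation below, using entirely hypothetical figures, shows how both statements can hold at once.

```python
# Toy illustration (hypothetical figures only): a longer admission can have a
# higher total cost yet a lower average daily cost.
los_csd, total_cost_csd = 25.0, 9000.0        # days, total cost with a CSD
los_no_csd, total_cost_no_csd = 11.8, 5500.0  # days, total cost without a CSD

print(total_cost_csd > total_cost_no_csd)                         # higher total cost
print(total_cost_csd / los_csd < total_cost_no_csd / los_no_csd)  # lower daily cost
```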

    Competing risks analysis for neutrophil to lymphocyte ratio as a predictor of diabetic retinopathy incidence in the Scottish population

    Background: Diabetic retinopathy (DR) is a major sight-threatening microvascular complication in individuals with diabetes. Systemic inflammation combined with oxidative stress is thought to capture most of the complexity involved in the pathology of diabetic retinopathy. A high neutrophil–lymphocyte ratio (NLR) is an indicator of abnormal immune system activity. Current estimates of the association of NLR with diabetes and its complications are almost entirely derived from cross-sectional studies, suggesting that the nature of the reported association may be more diagnostic than prognostic. Therefore, in the present study, we examined the utility of NLR as a biomarker to predict the incidence of DR in the Scottish population. Methods: The incidence of DR was defined as the time from type 2 diabetes diagnosis to the first diagnosis of grade R1 or above in the Scottish retinopathy grading scheme. The effect of NLR and its interactions was explored using a competing risks survival model adjusting for other risk factors and accounting for deaths. The Fine and Gray subdistribution hazard model (FGR) was used to predict the effect of NLR on the incidence of DR. Results: We analysed data from 23,531 individuals with complete covariate information. At 10 years, 8416 (35.8%) had developed DR, 2989 (12.7%) had been lost to the competing event (death) without developing DR, and 12,126 individuals did not have DR. The median (interquartile range) NLR was 2.04 (1.5 to 2.7). The optimal NLR cut-off value to predict retinopathy incidence was 3.04. After accounting for competing risks, the cumulative incidences at 10 years of DR and of death without DR were 50.7% and 21.9%, respectively. NLR was associated with incident DR in both the cause-specific hazard model (CSH = 1.63; 95% CI: 1.28–2.07) and the FGR model (subdistribution hazard ratio, sHR = 2.24; 95% CI: 1.70–2.94). Both age and HbA1c were found to modulate the association between NLR and the risk of DR. Conclusions: The current study suggests that NLR has promising potential to predict DR incidence in the Scottish population, especially in individuals younger than 65 years and in those with well-controlled glycaemic status.
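
    For readers unfamiliar with competing-risks survival analysis, the sketch below shows the general shape of such an analysis in Python with the lifelines library: an Aalen-Johansen estimate of the cumulative incidence of DR with death as a competing event, and a cause-specific Cox model in which competing deaths are treated as censoring. It is a hedged illustration with simulated data and hypothetical column names, not the Fine and Gray subdistribution model used in the study.

```python
# Minimal competing-risks sketch with lifelines (simulated, hypothetical data).
import numpy as np
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "time_years": rng.uniform(0.5, 10.0, n),
    # 0 = censored, 1 = diabetic retinopathy (event of interest), 2 = death without DR
    "event": rng.choice([0, 1, 2], size=n, p=[0.50, 0.36, 0.14]),
    "high_nlr": rng.integers(0, 2, n),          # NLR above a cut-off (e.g. 3.04)
    "age": rng.normal(62, 10, n),
    "hba1c": rng.normal(58, 12, n),
})

# Cumulative incidence of DR, treating death as a competing event.
ajf = AalenJohansenFitter()
ajf.fit(df["time_years"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail(1))

# Cause-specific Cox model for DR: competing deaths are treated as censoring.
cause_specific = df.assign(dr_event=(df["event"] == 1).astype(int)).drop(columns="event")
cph = CoxPHFitter()
cph.fit(cause_specific, duration_col="time_years", event_col="dr_event")
cph.print_summary()   # exp(coef) for high_nlr approximates a cause-specific HR
```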

    Living at home after emergency hospital admission: prospective cohort study in older adults with and without cognitive spectrum disorder

    Background: Cognitive spectrum disorders (CSDs) are common in hospitalised older adults and associated with adverse outcomes. Their association with the maintenance of independent living has not been established. The aim was to establish the role of CSDs in the likelihood of living at home 30 days after discharge or being newly admitted to a care home. Methods: A prospective cohort study with routine data linkage was conducted based on admissions data from the acute medical unit of a district general hospital in Scotland. 5570 people aged ≥ 65 years admitted from a private residence who survived to discharge and received the Older Persons Routine Acute Assessment (OPRAA) during an incident emergency medical admission were included. The outcome measures were living at home, defined as a private residential address, 30 days after discharge, and new care home admission at hospital discharge. Outcomes were ascertained through linkage to routine data sources. Results: Of the 5570 individuals admitted from a private residence who survived to discharge, those without a CSD were more likely to be living at home at 30 days than those with a CSD (93.4% versus 81.7%; difference 11.7%, 95% CI 9.7–13.8%). New discharge to a care home affected 236 (4.2%) of the cohort, 181 (76.7%) of whom had a CSD. Logistic regression modelling identified that all four CSD categories were associated with a reduced likelihood of living at home and an increased likelihood of discharge to a care home. Those with delirium superimposed on dementia were the least likely to be living at home (OR 0.25), followed by those with dementia (OR 0.43), then unspecified cognitive impairment (OR 0.55) and finally delirium (OR 0.57). Conclusions: Individuals with a CSD are at significantly increased risk of not returning home after hospitalisation, and those with CSDs account for the majority of new admissions to care homes on discharge. Individuals with delirium superimposed on dementia are the most affected. We need to understand how to configure and deliver healthcare services to enable older people to remain as independent as possible for as long as possible and to ensure transitions of care are managed supportively.
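
    The logistic regression step described above can be illustrated with a short sketch: fit a model of living at home at 30 days on CSD category (with 'no CSD' as the reference) and exponentiate the coefficients to obtain odds ratios. The data below are simulated and the variable names are hypothetical; this is not the study's code.

```python
# Minimal sketch (simulated data): odds ratios for living at home 30 days after
# discharge by CSD category, relative to a 'none' (no CSD) reference group.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
categories = ["none", "delirium", "dementia", "delirium_on_dementia", "unspecified_ci"]
df = pd.DataFrame({
    "csd": rng.choice(categories, size=n, p=[0.60, 0.15, 0.10, 0.08, 0.07]),
    "age": rng.normal(80, 8, n),
})
# Simulated outcome: lower probability of returning home for the CSD groups.
p_home = {"none": 0.93, "delirium": 0.86, "dementia": 0.82,
          "delirium_on_dementia": 0.75, "unspecified_ci": 0.87}
df["home_30d"] = (rng.random(n) < df["csd"].map(p_home)).astype(int)

model = smf.logit(
    "home_30d ~ C(csd, Treatment(reference='none')) + age", data=df
).fit(disp=0)
print(np.exp(model.params).round(2))   # ORs < 1 indicate reduced odds of living at home
```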

    Epidemiology and outcomes of people with dementia, delirium and unspecified cognitive impairment in the general hospital: prospective cohort study of 10,014 admissions

    Background: Cognitive impairment of various kinds is common in older people admitted to hospital, but previous research has usually focused on single conditions in highly selected groups and has rarely examined associations with outcomes. This study examined the prevalence and outcomes of cognitive impairment in a large unselected cohort of people aged 65+ with an emergency medical admission. Methods: Between January 1, 2012, and June 30, 2013, admissions to a single general hospital acute medical unit aged 65+ underwent a structured specialist nurse assessment (n = 10,014). We defined ‘cognitive spectrum disorder’ (CSD) as any combination of delirium, known dementia, or Abbreviated Mental Test (AMT) score < 8/10. Routine data for length of stay (LOS), mortality and readmission were linked to examine associations with outcomes. Results: A CSD was present in 38.5% of all patients admitted aged over 65, and in more than half of those aged over 85. Overall, 16.7% of older people admitted had delirium alone, 7.9% delirium superimposed on known dementia, 9.4% known dementia alone, and 4.5% unspecified cognitive impairment (AMT score < 8/10, no delirium, no known dementia). Of those with known dementia, 45.8% had delirium superimposed. Outcomes were worse in those with CSD compared to those without: LOS 25.0 vs. 11.8 days, 30-day mortality 13.6% vs. 9.0%, 1-year mortality 40.0% vs. 26.0%, 1-year death or readmission 62.4% vs. 51.5% (all P < 0.01). There was relatively little difference by CSD type, although people with delirium superimposed on dementia had the longest LOS, and people with dementia the worst mortality at 1 year. Conclusions: CSD is common in older inpatients and associated with considerably worse outcomes, with little variation between different types of CSD. Healthcare systems should systematically identify and develop care pathways for older people with CSD admitted as medical emergencies, and avoid focusing only on condition-specific pathways such as those for dementia or delirium alone.
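
    The 'cognitive spectrum disorder' definition above combines three pieces of information (delirium, known dementia, and the AMT score) into mutually exclusive categories. A small sketch of how that classification could be coded is shown below; the field names are hypothetical and the logic simply mirrors the definition given in the abstract.

```python
# Minimal sketch: derive mutually exclusive CSD categories from delirium status,
# known dementia, and the Abbreviated Mental Test (AMT) score, following the
# definition in the abstract (hypothetical field names).
from dataclasses import dataclass

@dataclass
class Assessment:
    delirium: bool
    known_dementia: bool
    amt_score: int  # 0-10

def csd_category(a: Assessment) -> str:
    if a.delirium and a.known_dementia:
        return "delirium superimposed on dementia"
    if a.delirium:
        return "delirium alone"
    if a.known_dementia:
        return "dementia alone"
    if a.amt_score < 8:
        return "unspecified cognitive impairment"
    return "no CSD"

print(csd_category(Assessment(delirium=False, known_dementia=False, amt_score=6)))
# -> unspecified cognitive impairment
```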

    Analysis of physical pore space characteristics of two pyrolytic biochars and potential as microhabitat

    Background and Aims: Biochar amendment to soil is a promising practice for enhancing the productivity of agricultural systems. The positive effects on crops are often attributed to the promotion of beneficial soil microorganisms and the suppression of pathogens. This study aims to determine the influence of biochar feedstock on (i) spontaneous and fungus-inoculated microbial colonisation of biochar particles and (ii) the physical pore space characteristics of native and fungus-colonised biochar particles that affect microbial habitat quality. Methods: Pyrolytic biochars from mixed woods and Miscanthus were investigated for spontaneous colonisation using classical microbiological isolation, phylogenetic identification of bacterial and fungal strains, and microbial respiration analysis. Physical pore space characteristics of biochar particles were determined by X-ray μ-CT. Subsequent 3D image analysis included porosity, surface area, connectivities, and pore size distribution. Results: Microorganisms isolated from the Wood biochar were more abundant and proliferated faster than those from the Miscanthus biochar. All isolated bacteria were gram-positive and feedstock-specific. Respiration analysis revealed higher microbial activity for Wood biochar after water and substrate amendment, while basal respiration was at a similarly low level for both biochars. Differences in porosity and physical surface area were detected only in interaction with biochar-specific colonisation. Miscanthus biochar showed higher connectivity values in surface, volume and transmission than Wood biochar, as well as larger pores, as observed in the pore size distribution. Differences in physical properties between colonised and non-colonised particles were larger in Miscanthus biochar than in Wood biochar. Conclusions: More vigorous colonisation was found on Wood biochar than on Miscanthus biochar. This contrasts with our findings from the physical pore space analysis, which suggest better habitat quality in Miscanthus biochar than in Wood biochar. We conclude that (i) the selected feedstocks display large differences in microbial habitat quality as well as physical pore space characteristics and (ii) a physical description of biochars alone does not suffice for reliable prediction of microbial habitat quality; we recommend that physical and surface chemical data be linked for this purpose.
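
    As a hedged illustration of the 3D image analysis step described above, the sketch below estimates porosity and counts connected pore regions from a segmented (binary) micro-CT volume using NumPy and SciPy; the arrays are synthetic stand-ins for a real segmentation, and the study's actual pipeline is not reproduced here.

```python
# Minimal sketch (synthetic data): porosity and pore connectivity from a
# binarised micro-CT volume, where True marks pore (void) voxels.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
shape = (100, 100, 100)
particle_mask = np.ones(shape, dtype=bool)     # voxels belonging to the particle
pores = rng.random(shape) < 0.3                # stand-in for a real pore segmentation

# Porosity: pore voxels as a fraction of the particle volume.
porosity = pores[particle_mask].sum() / particle_mask.sum()

# Connectivity proxy: number of 26-connected pore clusters.
structure = np.ones((3, 3, 3), dtype=int)      # 26-connectivity in 3D
labels, n_clusters = ndimage.label(pores & particle_mask, structure=structure)

print(f"Porosity: {porosity:.2%}, connected pore clusters: {n_clusters}")
```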