Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study
Purpose:
Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits that confers an increased risk of adverse outcomes. We set out to determine how the severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom.
Methods:
Adults aged over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded.
Results:
The overall prevalence of delirium was 16.3% (483 patients). Patients with delirium were frailer than patients without delirium (median CFS 6 vs 4). The risk of delirium rose with increasing frailty [OR 2.9 (1.8–4.6) for CFS 4 vs CFS 1–3; OR 12.4 (6.2–24.5) for CFS 8 vs CFS 1–3]. Higher CFS was also associated with reduced recognition of delirium [OR 0.7 (0.3–1.9) for CFS 4 vs 0.2 (0.1–0.7) for CFS 8]. Both associations were independent of age and dementia (see the analysis sketch following this abstract).
Conclusion:
We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the frailest patients are the least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
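The adjusted odds ratios above come from a regression of delirium status on frailty band, controlling for age and dementia. A minimal sketch of that kind of analysis in Python with statsmodels, on synthetic data; the data frame and column names are illustrative assumptions, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

np.random.seed(0)

# Hypothetical patient-level data; column names are assumptions.
df = pd.DataFrame({
    "delirium": np.random.binomial(1, 0.16, 500),  # 1 = delirium present
    "cfs_band": np.random.choice(["1-3", "4", "5", "6", "7", "8"], 500),
    "age": np.random.normal(80, 7, 500),
    "dementia": np.random.binomial(1, 0.25, 500),
})

# Logistic regression: delirium risk by CFS band, adjusted for age and
# dementia. Treatment("1-3") makes CFS 1-3 the reference category,
# mirroring the comparisons reported in the abstract (e.g. CFS 4 vs 1-3).
model = smf.logit(
    'delirium ~ C(cfs_band, Treatment("1-3")) + age + dementia', data=df
).fit()

# Exponentiated coefficients are adjusted odds ratios with 95% CIs.
ors = np.exp(model.params)
ci = np.exp(model.conf_int().rename(columns={0: "2.5%", 1: "97.5%"}))
print(pd.concat([ors.rename("OR"), ci], axis=1))
```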
Short-segment minimally disruptive anterior column release for focal sagittal deformity correction of the thoracolumbar spine
Background: Sagittal malalignment is associated with poor quality of life. Correction of lumbar lordosis through anterior column release (ACR) has been shown to improve overall sagittal alignment, though typically in combination with long posterior constructs and their associated morbidity. We evaluated the technical feasibility and radiographic outcomes of short-segment anterior or lateral minimally invasive surgery (MIS) ACR techniques in moderate to severe lumbar sagittal deformity. Methods: Consecutive patients treated with short-segment MIS ACR techniques for moderate to severe lumbar sagittal deformity correction were retrospectively analyzed from a prospectively collected database. Clinical outcomes comprised perioperative measures of invasiveness: operative time, blood loss, complications, and average length of stay. Radiographic parameters were measured preoperatively, immediately postoperatively, and at long-term follow-up, and included coronal Cobb angle, lumbar lordosis (LL), pelvic incidence (PI), PI-LL mismatch, pelvic tilt (PT), T1 pelvic angle (TPA), T1 spino-pelvic inclination (T1SPI), proximal junctional angle (PJA), and sagittal vertical axis (SVA). Results: The cohort included 34 patients (mean age 63 years) treated at an average of 2.5 interbody levels (range 1-4) through a lateral or anterior approach (LLIF or ALIF). Of 89 total interbody levels treated, 63 (71%) were ACR levels. Posterior fixation spanned an average of 3.2 levels (range 1-5). Mean total operative time and blood loss were 362 minutes and 621 mL. Surgical complications occurred in 2 patients (5.9%). Average hospital stay was 5.5 days (including staging). At last follow-up (average 25.4 months; range 0.5-7 years), all patients (100%) achieved one or more alignment goals, with significant improvements in coronal Cobb angle, LL, PI-LL mismatch, PT, and TPA. No patient was revised for proximal junctional kyphosis (PJK). Conclusions: These data show that short-segment MIS ACR correction of moderate to severe lumbar sagittal deformity is feasible and effective at achieving overall alignment goals with low procedural morbidity and low risk of proximal junctional complications.
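Several of the radiographic parameters above are simple derived quantities; PI-LL mismatch, for example, is pelvic incidence minus lumbar lordosis. A minimal sketch of that bookkeeping in Python; the numeric targets are commonly cited SRS-Schwab alignment thresholds assumed here for illustration, and may not match this study's per-patient alignment goals:

```python
# PI-LL mismatch is pelvic incidence minus lumbar lordosis (degrees).
def pi_ll_mismatch(pelvic_incidence: float, lumbar_lordosis: float) -> float:
    return pelvic_incidence - lumbar_lordosis

def meets_alignment_goals(pi: float, ll: float, pt: float, sva_mm: float) -> dict:
    """Check a radiograph against commonly cited targets (assumed thresholds)."""
    return {
        "PI-LL within 10 deg": abs(pi_ll_mismatch(pi, ll)) <= 10,
        "PT below 20 deg": pt < 20,
        "SVA below 50 mm": sva_mm < 50,
    }

# Hypothetical pre/post values for one patient.
pre = meets_alignment_goals(pi=55, ll=30, pt=28, sva_mm=95)   # severe mismatch
post = meets_alignment_goals(pi=55, ll=50, pt=18, sva_mm=35)  # after ACR correction
print("pre: ", pre)
print("post:", post)
```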
Recommendations for culturally safe clinical kidney care for First Nations Australians: a guideline summary
Introduction: First Nations Australians display remarkable strength and resilience despite the intergenerational impacts of ongoing colonisation. The continuing disadvantage is evident in the higher incidence, prevalence, morbidity and mortality of chronic kidney disease (CKD) among First Nations Australians. Nationwide community consultation (Kidney Health Australia's Yarning Kidneys and the Lowitja Institute's Catching Some Air) identified priority issues for guideline development. These guidelines uniquely prioritised the knowledge of the community alongside relevant evidence, using an adapted GRADE Evidence to Decision framework, to develop specific recommendations for the management of CKD among First Nations Australians. Main recommendations: These guidelines explicitly state that health systems must measure, monitor and evaluate institutional racism and link it to cultural safety training, as well as increase community and family involvement in clinical care and provide equitable transport and accommodation. The guidelines recommend earlier CKD screening criteria (age ≥ 18 years) and referral to specialist services at earlier kidney function thresholds (eg, estimated glomerular filtration rate [eGFR] ≤ 45 mL/min/1.73 m², or a sustained decrease in eGFR of > 10 mL/min/1.73 m² per year) compared with the general population. Changes in management as a result of the guidelines: Our recommendations prioritise health care service delivery changes to address institutional racism and ensure meaningful cultural safety training. Earlier detection of CKD and referral to nephrologists for First Nations Australians have been recommended to ensure timely intervention to preserve kidney function, given the excess burden of disease. Finally, the guidelines recognise the importance of community involvement in all aspects and stages of treatment, together with increased access to care on Country, including dialysis services, particularly in rural and remote locations.
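The earlier-referral thresholds quoted above reduce to a simple decision rule. A minimal sketch in Python, assuming the two criteria combine as either/or; the function name and call signature are illustrative assumptions, not part of the guideline:

```python
# Earlier-referral rule from the guideline summary above; the constants
# come from the quoted thresholds, everything else is illustrative.
EGFR_REFERRAL_THRESHOLD = 45.0  # mL/min/1.73 m^2
EGFR_DECLINE_THRESHOLD = 10.0   # mL/min/1.73 m^2 per year, sustained

def refer_to_specialist(egfr: float, egfr_decline_per_year: float) -> bool:
    """Return True if either earlier-referral criterion is met (assumed OR logic)."""
    return (
        egfr <= EGFR_REFERRAL_THRESHOLD
        or egfr_decline_per_year > EGFR_DECLINE_THRESHOLD
    )

print(refer_to_specialist(egfr=43.0, egfr_decline_per_year=4.0))   # True: low eGFR
print(refer_to_specialist(egfr=60.0, egfr_decline_per_year=12.0))  # True: rapid decline
print(refer_to_specialist(egfr=70.0, egfr_decline_per_year=3.0))   # False
```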
Time to Peak Glucose and Peak C-Peptide During the Progression to Type 1 Diabetes in the Diabetes Prevention Trial and TrialNet Cohorts
OBJECTIVE To assess the progression of type 1 diabetes using time to peak glucose or C-peptide during oral glucose tolerance tests (OGTTs) in autoantibody-positive relatives of people with type 1 diabetes. RESEARCH DESIGN AND METHODS We examined 2-h OGTTs of participants in the Diabetes Prevention Trial Type 1 (DPT-1) and TrialNet Pathway to Prevention (PTP) studies. We included 706 DPT-1 participants (mean ± SD age, 13.84 ± 9.53 years; BMI Z-score, 0.33 ± 1.07; 56.1% male) and 3,720 PTP participants (age, 16.01 ± 12.33 years; BMI Z-score, 0.66 ± 1.3; 49.7% male). Log-rank testing and Cox regression analyses with adjustments (age, sex, race, BMI Z-score, HOMA-insulin resistance, and peak glucose/C-peptide levels, respectively) were performed. RESULTS In both DPT-1 and PTP, a higher 5-year diabetes progression risk was seen in those with time to peak glucose >30 min and time to peak C-peptide >60 min (P < 0.001 for all groups), before and after adjustments. In models examining the strength of association with diabetes development, associations were greater for time to peak C-peptide than for the peak C-peptide value (DPT-1: χ2 = 25.76 vs. χ2 = 8.62; PTP: χ2 = 149.19 vs. χ2 = 79.98; all P < 0.001). Changes in the percentage of individuals with delayed glucose and/or C-peptide peaks were noted over time. CONCLUSIONS In two independent at-risk populations, we show that those with delayed OGTT peak times for glucose or C-peptide are at higher risk of diabetes development within 5 years, independent of peak levels. Moreover, time to peak C-peptide appears more predictive than the peak level, suggesting its potential use as a specific biomarker for diabetes progression.
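The delayed-peak classification above is straightforward to compute from sampled OGTT timepoints. A minimal sketch, assuming the conventional 0/30/60/90/120-minute 2-h sampling schedule (an assumption; the trials' exact schedules may differ), with the >30 min glucose and >60 min C-peptide cut-offs taken from the abstract:

```python
# Classify an OGTT curve by time to peak, as in the abstract.
OGTT_TIMES_MIN = [0, 30, 60, 90, 120]  # assumed 2-h sampling schedule

def time_to_peak(values: list[float], times: list[int] = OGTT_TIMES_MIN) -> int:
    """Return the sampling time (minutes) at which the maximum value occurs."""
    peak_index = max(range(len(values)), key=lambda i: values[i])
    return times[peak_index]

def delayed_peaks(glucose: list[float], c_peptide: list[float]) -> dict:
    """Flag delayed peaks using the cut-offs reported in the abstract."""
    return {
        "glucose peak > 30 min": time_to_peak(glucose) > 30,
        "C-peptide peak > 60 min": time_to_peak(c_peptide) > 60,
    }

# Hypothetical participant: glucose peaks at 90 min and C-peptide at
# 120 min, so both delayed-peak criteria are met.
print(delayed_peaks(glucose=[90, 150, 165, 180, 160],
                    c_peptide=[1.1, 2.4, 3.0, 3.2, 3.5]))
```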