    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). 
In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, each of which was especially frequent among men), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.

    Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors

    Background: The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments. Methods: The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed. Findings: Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. 
During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [–0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [–6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001). Interpretation: During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores. Funding: NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR of 0.7 (0.3–1.9) in CFS 4 compared to 0.2 (0.1–0.7) in CFS 8). These risks were both independent of age and dementia. Conclusion: We have demonstrated an incremental increase in risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.

    ICare-ACS (improving care processes for patients with suspected acute coronary syndrome): a study of cross-system implementation of a National Clinical Pathway

    Background: Efforts to safely reduce length of stay for emergency department patients with symptoms suggestive of acute coronary syndrome (ACS) have had mixed success. Few system-wide efforts affecting multiple hospital emergency departments have ever been evaluated. We evaluated the effectiveness of a nationwide implementation of clinical pathways for potential ACS in disparate hospitals. Methods: This was a multicenter pragmatic stepped-wedge before-and-after trial in 7 New Zealand acute care hospitals with 31 332 patients investigated for suspected ACS with serial troponin measurements. The implementation was a clinical pathway for the assessment of patients with suspected ACS that included a clinical pathway document in paper or electronic format, structured risk stratification, specified time points for electrocardiographic and serial troponin testing within 3 hours of arrival, and directions for combining risk stratification and electrocardiographic and troponin testing in an accelerated diagnostic protocol. Implementation was monitored for >4 months and compared with usual care over the preceding 6 months. The main outcome measure was the odds of discharge within 6 hours of presentation. Results: There were 11 529 participants in the preimplementation phase (range, 284–3465) and 19 803 in the postimplementation phase (range, 395–5039). Overall, the mean 6-hour discharge rate increased from 8.3% (range, 2.7%–37.7%) to 18.4% (6.8%–43.8%). The odds of being discharged within 6 hours increased after clinical pathway implementation. The odds ratio was 2.4 (95% confidence interval, 2.3–2.6). In patients without ACS, the median length of hospital stay decreased by 2.9 hours (95% confidence interval, 2.4–3.4). For patients discharged within 6 hours, there was no change in 30-day major adverse cardiac event rates (0.52% versus 0.44%; P=0.96).
In these patients, no adverse event occurred when clinical pathways were correctly followed. Conclusions: Implementation of clinical pathways for suspected ACS reduced the length of stay and increased the proportions of patients safely discharged within 6 hours. Clinical Trial Registration: URL: https://www.anzctr.org.au/ (Australian and New Zealand Clinical Trials Registry). Unique identifier: ACTRN12617000381381.
