912 research outputs found

    Weekend hospitalization and additional risk of death: An analysis of inpatient data

    Objective: To assess whether weekend admissions to hospital and/or already being an inpatient on weekend days were associated with any additional mortality risk. Design: Retrospective observational survivorship study. We analysed all admissions to the English National Health Service (NHS) during the financial year 2009/10, following up all patients for 30 days after admission and accounting for risk of death associated with diagnosis, co-morbidities, admission history, age, sex, ethnicity, deprivation, seasonality, day of admission and hospital trust, including day of death as a time-dependent covariate. The principal analysis was based on time to in-hospital death. Participants: National Health Service hospitals in England. Main Outcome Measures: 30-day mortality (in or out of hospital). Results: There were 14,217,640 admissions included in the principal analysis, with 187,337 in-hospital deaths reported within 30 days of admission. Admission on weekend days was associated with a considerable increase in the risk of subsequent death compared with admission on weekdays: hazard ratio for Sunday versus Wednesday 1.16 (95% CI 1.14 to 1.18; P < .0001), and for Saturday versus Wednesday 1.11 (95% CI 1.09 to 1.13; P < .0001). Hospital stays on weekend days were associated with a lower risk of death than midweek days: hazard ratio for being in hospital on Sunday versus Wednesday 0.92 (95% CI 0.91 to 0.94; P < .0001), and for Saturday versus Wednesday 0.95 (95% CI 0.93 to 0.96; P < .0001). Similar findings were observed in a smaller US data set. Conclusions: Admission at the weekend is associated with an increased risk of subsequent death within 30 days of admission. The likelihood of death actually occurring is lower on a weekend day than on a midweek day.
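
As a worked illustration of how confidence intervals like those reported above relate to the log-hazard scale of a Cox model, the snippet below reuses the Sunday-versus-Wednesday estimate from the abstract; the standard error is back-calculated from the CI width, not taken from the paper:

```python
import math

# Reported hazard ratio and 95% CI for Sunday vs. Wednesday admission
hr, ci_low, ci_high = 1.16, 1.14, 1.18

# A Cox model estimates beta = log(HR); the 95% CI spans beta +/- 1.96*SE,
# so the standard error can be recovered from the CI width on the log scale.
beta = math.log(hr)
se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# Round-trip: exponentiating beta +/- 1.96*SE should recover the interval.
lo = math.exp(beta - 1.96 * se)
hi = math.exp(beta + 1.96 * se)
print(f"beta = {beta:.4f}, SE = {se:.4f}")
print(f"reconstructed 95% CI: ({lo:.2f}, {hi:.2f})")
```

The same back-calculation applies to any of the hazard ratios quoted in the abstract, since all are reported with symmetric intervals on the log scale.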

    OnabotulinumtoxinA in the treatment of overactive bladder: a cost-effectiveness analysis versus best supportive care in England and Wales

    The cost-effectiveness of onabotulinumtoxinA (BOTOX®) 100 U + best supportive care (BSC) was compared with BSC alone in the management of idiopathic overactive bladder in adult patients who are not adequately managed with anticholinergics. BSC included incontinence pads and, for a proportion of patients, anticholinergics and/or occasional clean intermittent catheterisation. A five-state Markov model was used to estimate total costs and outcomes over a 10-year period. The cohort was based on data from two placebo-controlled trials and a long-term extension study of onabotulinumtoxinA. After discontinuation of initial treatment, a proportion of patients progressed to downstream sacral nerve stimulation (SNS). Costs and resource use were estimated from a National Health Service perspective in England and Wales using relevant reference sources for 2012 or 2013. Results showed that onabotulinumtoxinA was associated with lower costs and greater health benefits than BSC in the base case, with probabilistic sensitivity analysis indicating an 89% probability that the incremental cost-effectiveness ratio would fall below £20,000. OnabotulinumtoxinA remained dominant over BSC in all but two scenarios tested; it was also economically dominant when compared directly with SNS therapy. In conclusion, onabotulinumtoxinA appears to be a cost-effective treatment for overactive bladder compared with BSC alone.
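
A five-state Markov cohort model of the kind described above can be sketched as follows; the state names, transition probabilities, and per-cycle costs are hypothetical placeholders, not the values used in the published analysis:

```python
# Minimal discrete-time Markov cohort model sketch (hypothetical numbers).
# State labels and all figures are illustrative; the published model's
# states, transition probabilities, costs and utilities are not reproduced.
states = ["response", "partial", "no_response", "SNS", "dead"]

# P[i][j] = probability of moving from state i to state j in one cycle.
P = [
    [0.85, 0.08, 0.04, 0.02, 0.01],
    [0.10, 0.75, 0.10, 0.04, 0.01],
    [0.02, 0.08, 0.80, 0.09, 0.01],
    [0.00, 0.00, 0.05, 0.94, 0.01],
    [0.00, 0.00, 0.00, 0.00, 1.00],   # death is absorbing
]
cost_per_cycle = [100.0, 250.0, 400.0, 600.0, 0.0]   # hypothetical GBP
cohort = [1.0, 0.0, 0.0, 0.0, 0.0]                   # all start in "response"

total_cost = 0.0
for cycle in range(10):                               # e.g. 10 one-year cycles
    total_cost += sum(c * k for c, k in zip(cohort, cost_per_cycle))
    cohort = [sum(cohort[i] * P[i][j] for i in range(5)) for j in range(5)]

print(f"expected 10-cycle cost per patient: £{total_cost:.2f}")
```

A real appraisal would also accumulate QALYs per state and discount both streams (conventionally 3.5% per year in England and Wales); those steps are omitted here for brevity.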

    Autism and the U.K. secondary school experience

    This research investigated the self-reported mainstream school experiences of those diagnosed on the autistic spectrum compared with the typically developing school population. Existing literature identifies four key areas that affect the quality of the school experience for students with autism: social skills, perceived relationships with teaching staff, general school functioning, and interpersonal strengths of the young person. These areas were explored in a mainstream U.K. secondary school with 14 students with autism and 14 age- and gender-matched students without autism, using self-report questionnaires and semi-structured interviews. Quantitative analyses showed consistent school experiences for both groups, although content analysis of interview data highlighted some differences in the ways in which the groups perceive group work, peers, and teaching staff within school. Implications for school inclusion are discussed, drawing attention to how staff awareness of autism could improve school experience and success for students with autism attending mainstream schools.

    Clinical outcomes in high-hypoglycaemia-risk patients with type 2 diabetes switching to insulin glargine 300 U/mL versus a first-generation basal insulin analogue in the United States: Results from the DELIVER High Risk real-world study

    Aims: To compare the 12-month clinical effectiveness of insulin glargine 300 units/mL (Gla-300) versus first-generation basal insulin analogues (BIAs) (insulin glargine 100 units/mL [Gla-100] or insulin detemir [IDet]) in patients with type 2 diabetes (T2D) who were at high risk of hypoglycaemia and switched from one BIA to a different one (Gla-300 or Gla-100/IDet) in a real-world setting. Methods: DELIVER High Risk was a retrospective observational cohort study of 2550 patients with T2D who switched BIA to Gla-300 (Gla-300 switchers) and were propensity score-matched (1:1) to patients who switched to Gla-100 or IDet (Gla-100/IDet switchers). Outcomes were change in glycated haemoglobin A1c (HbA1c), attainment of HbA1c goals (<7% and <8%), and incidence and event rates of hypoglycaemia (all-hypoglycaemia and hypoglycaemia associated with an inpatient/emergency department [ED] contact). Results: HbA1c reductions were similar following switching to Gla-300 or Gla-100/IDet (−0.51% vs. −0.53%; p = .67), and patients showed similar attainment of HbA1c goals. Patients in both cohorts had comparable all-hypoglycaemia incidence and event rates. However, the Gla-300 switcher cohort had a significantly lower risk of inpatient/ED-associated hypoglycaemia (adjusted odds ratio: 0.73, 95% confidence interval: 0.60–0.89; p = .002) and experienced significantly fewer inpatient/ED-associated hypoglycaemic events (0.21 vs. 0.33 events per patient per year; p < .001). Conclusion: In patients with T2D at high risk of hypoglycaemia, switching to Gla-300 or Gla-100/IDet achieved similar HbA1c reductions and glycaemic goal attainment, but Gla-300 switchers had a significantly lower risk of hypoglycaemia associated with an inpatient/ED contact during the 12 months after switching.
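
The 1:1 propensity-score matching step used in studies like this one can be illustrated with a greedy nearest-neighbour match within a caliper; the scores below are hypothetical, and in practice they would come from a logistic regression of treatment choice (Gla-300 vs. Gla-100/IDet) on baseline covariates:

```python
# Greedy 1:1 nearest-neighbour propensity-score matching (sketch).
# Scores and caliper are hypothetical illustrations only.
def match_pairs(treated, controls, caliper=0.05):
    """Pair each treated score with the nearest unused control within caliper."""
    available = dict(enumerate(controls))   # control index -> score
    pairs = []
    for t_idx, t_score in enumerate(treated):
        if not available:
            break
        c_idx = min(available, key=lambda i: abs(available[i] - t_score))
        if abs(available[c_idx] - t_score) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]            # each control is used at most once
    return pairs

treated_scores = [0.31, 0.47, 0.52, 0.80]
control_scores = [0.30, 0.50, 0.55, 0.49, 0.10]
pairs = match_pairs(treated_scores, control_scores)
print(pairs)
```

Note the last treated score (0.80) goes unmatched because no remaining control falls within the caliper; unmatched patients are typically excluded from the matched analysis.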

    Improved social functioning following social recovery therapy in first episode psychosis: Do social cognition and neurocognition change following therapy, and do they predict treatment response?

    There is a need to develop and refine psychosocial interventions to improve functioning in First Episode Psychosis (FEP). Social cognition and neurocognition are closely linked to functioning in psychosis; examining cognition pre- and post-psychosocial intervention may provide insights into the mechanisms of these interventions and identify which individuals are most likely to benefit. Method: Cognition was assessed within a multi-site trial of Social Recovery Therapy (SRT) for individuals with FEP experiencing poor functioning (<30 h weekly structured activity). Fifty-nine participants were randomly allocated to the therapy group (SRT + early intervention) and 64 to the treatment-as-usual group (TAU: early intervention care). Social cognition and neurocognition were assessed at baseline and 9 months; assessors were blind to group allocation. It was hypothesized that social cognition would improve following therapy and that those with better social cognition prior to therapy would benefit the most from SRT. Results: There was no significant impact of SRT on individual neurocognitive or social cognitive variables; however, joint models addressing patterns of missingness demonstrated improvement across a number of cognitive outcomes following SRT. Further, regression analyses showed that those who had better social cognition at baseline were most likely to benefit from the therapy (β = 0.350; 95% CI = 0.830 to 8.891; p = .019). Conclusion: It is not clear whether SRT impacts social cognitive or neurocognitive function; however, SRT may be beneficial for those with better social cognition at baseline.

    Long Term outcomes of percutaneous atrial fibrillation ablation in patients with continuous monitoring

    INTRODUCTION: There are limited data using continuous monitoring to assess outcomes of atrial fibrillation (AF) ablation. This study assessed long-term outcomes of AF ablation in patients with implantable cardiac devices. METHODS: 207 patients (mean age 68.1 ± 9.5 years, 50.3% men) undergoing ablation for symptomatic AF were followed up for a mean period of 924.5 ± 636.7 days. Techniques included the pulmonary vein ablation catheter (PVAC; 59.4%), cryoablation (17.4%), point-by-point ablation (14.0%) and the novel irrigated multipolar radiofrequency ablation catheter (nMARQ; 9.2%). RESULTS: 130 (62.8%) patients had paroxysmal AF (PAF) and 77 (37.2%) persistent AF. First ablation and repeat ablation reduced AF burden significantly (relative risk 0.91 [95% CI, 0.89-0.94]; P < 0.0001 and 0.90 [95% CI, 0.86-0.94]; P < 0.0001, respectively). Median AF burden in PAF patients reduced from 1.05% (interquartile range [IQR], 0.1%-8.70%) to 0.10% (IQR, 0%-2.28%) at one year, and this was maintained out to four years. Persistent AF burden reduced from 99.9% (IQR, 51.53%-100%) to 0.30% (IQR, 0%-77.25%) at one year, increasing to 87.3% (IQR, 4.25%-100%) after four years. If a second ablation was required, point-by-point ablation achieved a greater reduction in AF burden (relative risk, 0.77 [95% CI, 0.65-0.91]; P < 0.01). CONCLUSION: Ablation reduces AF burden both acutely and in the long term. If a second ablation was required, the point-by-point technique achieved greater reductions in AF burden than "single-shot" technologies. Persistent AF burden increased to near pre-ablation levels by year 4, suggesting a different mechanism from PAF patients, in whom this increase did not occur.
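
Device-monitored AF burden, the endpoint summarised by those medians and IQRs, is simply the percentage of monitored time the device logs as AF. A minimal sketch (the episode durations are hypothetical examples, not study data):

```python
from statistics import median

# AF burden = percentage of monitored time spent in AF, as recorded by an
# implanted cardiac device. Episode durations (hours) are hypothetical.
def af_burden_pct(episode_hours, monitored_hours):
    return 100.0 * sum(episode_hours) / monitored_hours

year_hours = 365 * 24   # one year of continuous monitoring
patients = [
    af_burden_pct([2.0, 5.5, 1.0], year_hours),   # brief paroxysms
    af_burden_pct([90.0], year_hours),            # one prolonged episode
    af_burden_pct([0.5], year_hours),
]
print(f"median burden: {median(patients):.2f}%")
```

Because burden distributions are heavily skewed (many near-zero values, a few large ones), median and IQR, as reported in the abstract, are the conventional summaries rather than the mean.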

    The effects of aetiology on outcome in patients treated with cardiac resynchronization therapy in the CARE-HF trial

    Aims: Cardiac dyssynchrony is common in patients with heart failure, whether or not they have ischaemic heart disease (IHD). The effect of the underlying cause of cardiac dysfunction on the response to cardiac resynchronization therapy (CRT) is unknown. This issue was addressed using data from the CARE-HF trial. Methods and results: Patients (n = 813) were grouped by heart failure aetiology (IHD n = 339 vs. non-IHD n = 473), and the primary composite (all-cause mortality or unplanned hospitalization for a major cardiovascular event) and principal secondary (all-cause mortality) endpoints were analysed. Heart failure severity and the degree of dyssynchrony were compared between the groups by analysing baseline clinical and echocardiographic variables. Patients with IHD were more likely to be in NYHA class IV (7.5 vs. 4.0; P = 0.03) and to have higher NT-proBNP levels (2182 vs. 1725 pg/L), indicating more advanced heart failure. The degree of dyssynchrony was more pronounced in patients without IHD (assessed using mean QRS duration, interventricular mechanical delay, and aorta-pulmonary pre-ejection time). Left ventricular ejection fraction and left ventricular end-systolic volume improved to a lesser extent in the IHD group (4.53 vs. 8.50 and −35.68 vs. −58.52 cm³). Despite these differences, CRT improved all-cause mortality, NYHA class, and hospitalization rates to a similar extent in patients with or without IHD. Conclusion: The benefits of CRT in patients with or without IHD were similar in relative terms in the CARE-HF study, but as patients with IHD had a worse prognosis, the benefit in absolute terms may be greater.

    A randomized sham-controlled study of pulmonary vein isolation in symptomatic atrial fibrillation (The SHAM-PVI study): Study design and rationale

    INTRODUCTION: Pulmonary vein (PV) isolation has been shown to reduce atrial fibrillation (AF) burden and symptoms in patients. However, to date, studies have been unblinded, raising the possibility that a placebo effect accounts for differences in outcomes. HYPOTHESIS & METHODS: The objective of this study is to compare PV isolation with a sham procedure in patients with symptomatic AF. The SHAM-PVI study is a double-blind randomized controlled clinical trial. 140 patients with symptomatic paroxysmal or persistent AF will be randomized to either PV isolation (with cryoballoon ablation) or a sham procedure (with phrenic nerve pacing). All patients will receive an implantable loop recorder. The primary outcome is total AF burden at 6 months post-randomisation (excluding the 3-month blanking period). Key secondary outcomes include (1) time to symptomatic and asymptomatic atrial tachyarrhythmia, (2) total atrial tachyarrhythmia episodes, and (3) patient-reported outcome measures. RESULTS: Enrollment was initiated in January 2020. Through April 2023, 119 patients had been recruited. Results are expected to be disseminated in 2024. CONCLUSION: This study compares PV isolation using cryoablation with a sham procedure and will estimate the effect of PV isolation on AF burden.