
    Cervical ripening at home or in-hospital-prospective cohort study and process evaluation (CHOICE) study: a protocol.

    Introduction: The aim of the cervical ripening at home or in-hospital prospective cohort study and process evaluation (CHOICE) study is to compare home versus in-hospital cervical ripening to determine whether home cervical ripening is safe (for the primary outcome of neonatal unit (NNU) admission), acceptable to women, and cost-effective from the perspective of both women and the National Health Service (NHS).
    Methods and analysis: We will perform a prospective multicentre observational cohort study with an internal pilot phase. We will obtain data from electronic health records from at least 14 maternity units offering only in-hospital cervical ripening and 12 offering dinoprostone home cervical ripening. We will also conduct a cost-effectiveness analysis and a mixed-methods study to evaluate processes and the experiences of women and their partners. Our primary sample size is 8533 women with singleton pregnancies undergoing induction of labour (IOL) at 39+0 weeks' gestation or more. To achieve this and to contextualise our findings, we will collect data on a cohort of approximately 41 000 women undergoing IOL after 37 weeks. We will use mixed-effects logistic regression for the non-inferiority comparison of NNU admission, with propensity score matched adjustment to control for treatment indication bias. The economic analysis will be undertaken from the perspectives of the NHS and Personal Social Services (PSS) and of the pregnant woman. It will include a within-study cost-effectiveness analysis and a lifetime cost-utility analysis to account for any long-term impacts of the cervical ripening strategies. Outcomes will be reported as incremental cost per NNU admission avoided and incremental cost per quality-adjusted life year gained.
    Ethics and dissemination: CHOICE has been funded and approved by the National Institute for Health Research Health Technology Assessment programme, and the results will be disseminated via publication in peer-reviewed journals.
    Trial registration number: ISRCTN32652461.
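The propensity score matched adjustment mentioned above can be illustrated with a minimal sketch: fit a model of treatment assignment given covariates, then pair each treated subject with the nearest-scoring control. The synthetic data, covariates, and 1:1 nearest-neighbour matching with replacement are assumptions for illustration, not the CHOICE analysis plan.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic cohort: two covariates and a non-random "home ripening"
# indicator whose probability depends on the first covariate,
# mimicking treatment indication bias. Illustrative only.
n = 2000
X = rng.normal(size=(n, 2))
treated = rng.random(n) < 1 / (1 + np.exp(-0.8 * X[:, 0]))

# Step 1: propensity score = modelled probability of treatment.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: 1:1 nearest-neighbour matching on the score
# (with replacement -- a control may serve as several matches).
treated_idx = np.flatnonzero(treated)
control_idx = np.flatnonzero(~treated)
matches = np.array([control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))]
                    for i in treated_idx])

# After matching, the treated group and its matched controls should
# have near-identical average propensity scores (balance check).
print(round(abs(ps[treated_idx].mean() - ps[matches].mean()), 3))
```

Outcome comparisons (e.g. NNU admission rates) would then be run on the matched sample, so that treated and control groups are comparable on the measured covariates.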

    Real-world hospital costs following stress echocardiography in the UK: a costing study from the EVAREST/BSE-NSTEP multicentre study

    Background: Stress echocardiography is widely used to detect coronary artery disease, but little evidence is available on downstream hospital costs in real-world practice. We examined how stress echocardiography accuracy and downstream hospital costs vary across NHS hospitals and identified key factors that affect costs, to help inform future clinical planning and guidelines.
    Methods: Data on 7636 patients recruited from 31 NHS hospitals within the UK between 2014 and 2020 as part of the EVAREST/BSE-NSTEP clinical study were used. Data included all diagnostic tests, procedures, and hospital admissions for 12 months after a stress echocardiogram and were costed using NHS national unit costs. A decision tree was built to illustrate the clinical pathway and estimate average downstream hospital costs. Multi-level regression analysis was performed to identify variation in accuracy and costs at the patient, procedural, and hospital levels. Linear regression and extrapolation were used to estimate annual hospital cost savings associated with increasing predictive accuracy at hospital and national levels.
    Results: Stress echocardiography accuracy varied with patient, hospital, and operator characteristics. Hypertension, presence of wall motion abnormalities, and a higher annual number of hospital cardiology outpatient attendances reduced accuracy (adjusted odds ratios of 0.78 (95% CI 0.65 to 0.93), 0.27 (95% CI 0.15 to 0.48), and 0.99 (95% CI 0.98 to 0.99), respectively), whereas a prior myocardial infarction, angiotensin receptor blocker medication, and greater operator experience increased accuracy (adjusted odds ratios of 1.77 (95% CI 1.34 to 2.33), 1.64 (95% CI 1.22 to 2.22), and 1.06 (95% CI 1.02 to 1.09), respectively). Average downstream costs were £646 per patient (SD 1796), with significant variation across hospitals: average downstream costs across the 31 hospitals ranged from £384 to £1730 per patient. False positive and false negative tests were associated with average downstream costs of £1446 (SD 601) and £4192 (SD 3332) respectively, driven by increased non-elective hospital admissions (adjusted odds ratios 2.48 (95% CI 1.08 to 5.66) and 21.06 (95% CI 10.41 to 42.59), respectively). We estimated that an increase in accuracy of 1 percentage point could save the NHS in the UK £3.2 million annually.
    Conclusion: This study provides real-world evidence of downstream costs associated with stress echocardiography practice in the UK and estimates how improvements in accuracy could impact healthcare expenditure in the NHS. A real-world downstream costing approach could be adopted more widely in the evaluation of imaging tests and interventions to reflect actual value for money and support realistic planning.
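The decision-tree costing logic described above amounts to a probability-weighted sum of downstream costs over the four test outcomes. In this sketch, only the false-positive (£1446) and false-negative (£4192) costs come from the study; the outcome probabilities and the true-positive/true-negative costs are hypothetical placeholders for illustration.

```python
# Expected downstream cost per patient as a weighted average over
# decision-tree branches. FP/FN costs are from the study results;
# TP/TN costs and all branch probabilities are hypothetical.
costs = {"TP": 1200.0, "FP": 1446.0, "FN": 4192.0, "TN": 150.0}  # GBP
probs = {"TP": 0.10, "FP": 0.05, "FN": 0.02, "TN": 0.83}         # sum to 1

expected_cost = sum(probs[k] * costs[k] for k in costs)
print(f"£{expected_cost:.2f} per patient")  # → £400.64 per patient
```

The large gap between false-negative and true-negative branch costs shows why small gains in accuracy (shifting probability mass from FN/FP to TP/TN branches) translate into the sizeable annual savings estimated in the study.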

    A comprehensive evaluation of food fortification with folic acid for the primary prevention of neural tube defects

    BACKGROUND: Periconceptional use of vitamin supplements containing folic acid reduces the risk of a neural tube defect (NTD). In November 1998, food fortification with folic acid was mandated in Canada as a public health strategy to increase the folic acid intake of all women of childbearing age. We undertook a comprehensive population-based study in Newfoundland to assess the benefits and possible adverse effects of this intervention.
    METHODS: This study was carried out in women aged 19–44 years and in seniors, from November 1997 to March 1998 and from November 2000 to March 2001. The evaluation comprised four components: I) determination of rates of NTDs; II) dietary assessment; III) blood analysis; IV) assessment of knowledge and use of folic acid supplements.
    RESULTS: The annual rates of NTDs in Newfoundland varied greatly between 1976 and 1997, with a mean rate of 3.40 per 1,000 births. There was no significant change in the average rates between 1991–93 and 1994–97 (relative risk [RR] 1.01, 95% confidence interval [CI] 0.76–1.34). The rates of NTDs fell by 78% (95% CI 65%–86%) after the implementation of folic acid fortification, from an average of 4.36 per 1,000 births during 1991–1997 to 0.96 per 1,000 births during 1998–2001 (RR 0.22, 95% CI 0.14–0.35). The average dietary intake of folic acid due to fortification was 70 μg/day in women aged 19–44 years and 74 μg/day in seniors. There were significant increases in serum and RBC folate levels for women and seniors after mandatory fortification. Among seniors, there were no significant changes in indices typical of vitamin B12 deficiency, and no evidence of improved folate status masking haematological manifestations of vitamin B12 deficiency. The proportion of women aged 19–44 years taking a vitamin supplement containing folic acid increased from 17% to 28%.
    CONCLUSIONS: Based on these findings, mandatory food fortification in Canada should continue at the current levels. Public education regarding folic acid supplement use by women of childbearing age should also continue.
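The reported relative risk and percentage fall follow directly from the two period rates quoted above; a quick check of the arithmetic:

```python
# NTD rates per 1,000 births before and after fortification (from the text).
rate_pre = 4.36   # average, 1991-1997
rate_post = 0.96  # average, 1998-2001

rr = rate_post / rate_pre      # relative risk
fall_pct = (1 - rr) * 100      # percentage fall in rates

print(round(rr, 2))    # → 0.22, matching the reported RR
print(round(fall_pct)) # → 78, matching the reported 78% fall
```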


    Sparse feature selection for classification and prediction of metastasis in endometrial cancer

    Background: Metastasis via pelvic and/or para-aortic lymph nodes is a major risk factor for endometrial cancer. Lymph-node resection ameliorates risk but is associated with significant co-morbidities. Incidence in patients with stage I disease is 4–22%, but no mechanism exists to accurately predict it. Therefore, national guidelines for primary staging surgery include pelvic and para-aortic lymph node dissection for all patients whose tumor exceeds 2 cm in diameter. We sought to identify a robust molecular signature that can accurately classify risk of lymph node metastasis in endometrial cancer patients. A training cohort of 86 tumors, matched for age and race and evenly distributed between lymph node-positive and lymph node-negative cases, was selected. Genomic micro-RNA expression was profiled for each sample to serve as the predictive feature matrix. An independent set of 28 tumor samples was collected and similarly characterized to serve as a test cohort.
    Results: A feature selection algorithm was designed for applications where the number of samples is far smaller than the number of measured features per sample. A predictive miRNA expression signature was developed using this algorithm and was then used to predict the metastatic status of the independent test cohort. A weighted classifier using 18 micro-RNAs achieved 100% accuracy on the training cohort. When applied to the testing cohort, the classifier correctly predicted 90% of node-positive cases and 80% of node-negative cases (FDR = 6.25%).
    Conclusion: Results indicate that evaluation of the quantitative sparse-feature classifier proposed here in clinical trials may lead to significant improvement in the prediction of lymphatic metastases in endometrial cancer patients.
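The abstract does not specify the authors' selection algorithm, but the p ≫ n setting it targets (far more measured features than samples) can be illustrated with a generic L1-penalised logistic regression, a common sparse alternative, not the method used in the study. The data below are synthetic, with sample and feature counts loosely echoing the training cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic "miRNA expression" matrix: 86 samples, 500 features,
# with only the first 10 features carrying signal about node status.
n, p = 86, 500
X = rng.normal(size=(n, p))
y = (X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=n)) > 0

# The L1 penalty drives most coefficients to exactly zero, yielding a
# small feature subset even though p is far larger than n.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
selected = np.flatnonzero(clf.coef_[0])
print(len(selected), "of", p, "features selected")
```

The nonzero-coefficient features form the candidate signature, analogous to the 18-miRNA panel above; a held-out cohort (like the 28-sample test set) would then gauge how well the sparse signature generalises.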