
    Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect on the final outcome from the treatment effect estimate measured on the surrogate endpoint, while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed with the aim of including multiple surrogate endpoints, with the potential benefit of reducing the uncertainty of the predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled as a product of univariate normal distributions. This formulation is particularly convenient for including multiple surrogate endpoints and is flexible in modelling outcomes that can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed: the first uses an unstructured between-study covariance matrix, assuming the treatment effects on all outcomes are correlated; the second uses a structured between-study covariance matrix, assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for summary data at the study level, the individual-level association is taken into account by using Prentice's criteria (obtained from individual patient data) to inform the within-study correlations in the models. The modelling techniques are investigated using an example in relapsing-remitting multiple sclerosis, where disability worsening is the final outcome and relapse rate and MRI lesions are potential surrogates for disability progression.
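
    A sketch of the product-of-normals formulation described above, using illustrative notation not taken from the paper (two surrogate endpoints and one final outcome). With $\delta_{1i}, \delta_{2i}, \delta_{3i}$ denoting the true study-level treatment effects on the two surrogates and on the final outcome, the between-study model factorises as

        $\delta_{1i} \sim N(d_1,\ \psi_1^2)$
        $\delta_{2i} \mid \delta_{1i} \sim N\big(d_2 + \lambda_{21}(\delta_{1i} - d_1),\ \psi_2^2\big)$
        $\delta_{3i} \mid \delta_{1i}, \delta_{2i} \sim N\big(d_3 + \lambda_{31}(\delta_{1i} - d_1) + \lambda_{32}(\delta_{2i} - d_2),\ \psi_3^2\big)$

    Leaving all slopes $\lambda$ and conditional variances free corresponds to an unstructured between-study covariance matrix; fixing selected $\lambda$ to zero encodes the conditional-independence assumptions of the structured variant.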

    Comparing associations between frailty and mortality in hospitalised older adults with or without COVID-19 infection: a retrospective observational study using electronic health records

    BACKGROUND: The aim of this study was to describe outcomes in hospitalised older people with different levels of frailty and COVID-19 infection. METHODS: We undertook a single-centre, retrospective cohort study examining COVID-19 related mortality using Electronic Health Records, for older people (65 and over) with frailty, hospitalised with or without COVID-19 infection. Baseline covariates included demographics, Early Warning Scores, Charlson Comorbidity Indices and frailty (Clinical Frailty Scale, CFS), linked to COVID-19 status. FINDINGS: We analysed outcomes for 1,071 patients with COVID-19 test results; 285 (27%) were positive for COVID-19. The mean age at ED arrival was 79.7 years and 49.4% were female. All-cause mortality (by 30 days) rose from 9% (not frail) to 33% (severely frail) in the COVID-negative cohort but was around 60% across all frailty categories in the COVID-positive cohort. In adjusted analyses, the hazard ratio for death in those with COVID-19 compared with those without COVID-19 was 7.3 (95% CI 3.00 to 18.0), with age, comorbidities and illness severity making small additional contributions. INTERPRETATION: In this study, frailty, measured using the Clinical Frailty Scale, appeared to make little incremental contribution to the hazard of dying in older people hospitalised with COVID-19 infection; illness severity and comorbidity had a modest association with the overall adjusted hazard of death, whereas confirmed COVID-19 infection dominated, carrying a seven-fold hazard of death.
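
    A minimal sketch of how an adjusted hazard ratio of this kind could be obtained with a Cox proportional hazards model. The dataset and column names below (days_to_death, died, covid_positive, age, cfs, charlson, news2) are hypothetical illustrations, not the study's actual variables or code.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical analysis dataset: one row per patient, 30-day follow-up.
        df = pd.read_csv("older_adults_cohort.csv")  # assumed file name

        cph = CoxPHFitter()
        cph.fit(
            df[["days_to_death", "died", "covid_positive", "age", "cfs", "charlson", "news2"]],
            duration_col="days_to_death",
            event_col="died",
        )
        cph.print_summary()  # adjusted hazard ratios appear as exp(coef) with 95% CIs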

    Using electronic health records to predict costs and outcomes in stable coronary artery disease

    OBJECTIVES: To use electronic health records (EHR) to predict lifetime costs and health outcomes of patients with stable coronary artery disease (stable-CAD) stratified by their risk of future cardiovascular events, and to evaluate the cost-effectiveness of treatments targeted at these populations. METHODS: The analysis was based on 94 966 patients with stable-CAD in England between 2001 and 2010, identified in four prospectively collected, linked EHR sources. Markov modelling was used to estimate lifetime costs and quality-adjusted life years (QALYs) stratified by baseline cardiovascular risk. RESULTS: For the lowest-risk tenth of patients with stable-CAD, predicted discounted remaining lifetime healthcare costs and QALYs were £62 210 (95% CI £33 724 to £90 043) and 12.0 (95% CI 11.5 to 12.5) years, respectively. For the highest-risk tenth of the population, the equivalent costs and QALYs were £35 549 (95% CI £31 679 to £39 615) and 2.9 (95% CI 2.6 to 3.1) years, respectively. A new treatment with a hazard reduction of 20% for myocardial infarction, stroke and cardiovascular disease death and no side effects would be cost-effective if priced below £72 per year for the lowest-risk patients and £646 per year for the highest-risk patients. CONCLUSIONS: Existing EHRs may be used to estimate lifetime healthcare costs and outcomes of patients with stable-CAD. The stable-CAD model developed in this study lends itself to informing decisions about commissioning, pricing and reimbursement. At current prices, some established as well as future stable-CAD treatments may require stratification by patient risk to be cost-effective.
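
    A back-of-the-envelope sketch of the pricing logic implied above: a treatment remains cost-effective while its added cost (annual price over the years it is taken, plus any change in other healthcare costs) does not exceed the willingness-to-pay threshold times the incremental QALYs. The threshold and all numbers below are illustrative assumptions, not values from the paper.

        def max_cost_effective_annual_price(delta_qalys, delta_other_costs,
                                            years_on_treatment, wtp_per_qaly=20_000):
            """Highest annual price at which a new treatment stays cost-effective.

            Decision rule: price * years_on_treatment + delta_other_costs
                           <= wtp_per_qaly * delta_qalys
            All quantities are taken to be discounted lifetime values.
            """
            headroom = wtp_per_qaly * delta_qalys - delta_other_costs
            return max(headroom / years_on_treatment, 0.0)

        # Purely illustrative numbers: a small lifetime QALY gain spread over
        # many years on treatment leaves little pricing headroom.
        print(max_cost_effective_annual_price(delta_qalys=0.1,
                                              delta_other_costs=500,
                                              years_on_treatment=10))  # -> 150.0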

    Long-term healthcare use and costs in patients with stable coronary artery disease: a population-based cohort using linked health records (CALIBER)

    AIMS: To examine long-term healthcare utilization and costs of patients with stable coronary artery disease (SCAD). METHODS AND RESULTS: Linked cohort study of 94 966 patients with SCAD in England, 1 January 2001 to 31 March 2010, identified from primary care, secondary care, disease, and death registries. Resource use and costs, and predictors of cost by time and by 5-year cardiovascular disease (CVD) risk profile, were estimated using generalized linear models. Coronary heart disease hospitalizations were 20.5% in the first year and 66% in the year following a non-fatal (myocardial infarction, ischaemic or haemorrhagic stroke) event. Mean healthcare costs were £3133 per patient in the first year and £10 377 in the year following a non-fatal event. First-year predictors of cost included sex (mean cost £549 lower in females), SCAD diagnosis (non-ST-elevation myocardial infarction cost £656 more than stable angina), and co-morbidities (heart failure cost £657 more per patient). Compared with lower-risk patients (5-year CVD risk 3.5%), those at higher risk (5-year CVD risk 44.2%) had higher 5-year costs (£23 393 vs. £9335) but lower lifetime costs (£43 020 vs. £116 888). CONCLUSION: Patients with SCAD incur substantial healthcare utilization and costs, which vary and may be predicted by 5-year CVD risk profile. Higher-risk patients have higher initial but lower lifetime costs than lower-risk patients as a result of shorter life expectancy. Improved cardiovascular survivorship among an ageing CVD population is likely to require stratified care in anticipation of the burgeoning demand.
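
    A minimal sketch of a generalized linear model for annual costs of the kind described above; a gamma family with a log link is a common choice for right-skewed cost data, though the paper's exact specification may differ, and the data file and covariate names here are hypothetical.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical per-patient first-year cost data; column names are illustrative.
        df = pd.read_csv("scad_first_year_costs.csv")

        model = smf.glm(
            "annual_cost ~ female + nstemi_diagnosis + heart_failure",
            data=df,
            family=sm.families.Gamma(link=sm.families.links.Log()),
        )
        print(model.fit().summary())  # coefficients are on the log-cost scale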

    Solving Quantum Ground-State Problems with Nuclear Magnetic Resonance

    Quantum ground-state problems are computationally hard; for general many-body Hamiltonians, no classical or quantum algorithm is known to solve them efficiently. Nevertheless, if a trial wavefunction approximating the ground state is available, as often happens for many problems in physics and chemistry, a quantum computer could employ this trial wavefunction to project out the ground state by means of the phase estimation algorithm (PEA). We performed an experimental realization of this idea by implementing a variational-wavefunction approach to solve the ground-state problem of the Heisenberg spin model with an NMR quantum simulator. Our iterative phase estimation procedure yields the eigenenergies with high accuracy (to the fifth decimal digit, 10^-5). The ground-state fidelity was distilled to be more than 80%, and the singlet-to-triplet switching near the critical field was reliably captured. This result shows that quantum simulators can better leverage classical trial wavefunctions than classical computers can.
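
    A small, idealised classical simulation of the bookkeeping behind iterative phase estimation (noise-free, with the phase assumed expressible in n_bits binary digits). This only illustrates the digit-by-digit procedure; it is not a model of the NMR experiment or the variational step.

        import numpy as np

        def iterative_phase_estimation(phi, n_bits):
            """Idealised, noise-free simulation of iterative phase estimation.

            Recovers the binary digits of phi in [0, 1) from least significant to
            most significant, feeding already-measured digits back as a phase
            correction, as in Kitaev-style iterative PEA.
            """
            bits = [0] * (n_bits + 1)                     # bits[1..n_bits]
            for k in range(n_bits, 0, -1):
                # correction built from the less significant bits already measured
                omega = sum(bits[j] / 2 ** (j - k + 1) for j in range(k + 1, n_bits + 1))
                p0 = np.cos(np.pi * (2 ** (k - 1) * phi - omega)) ** 2
                bits[k] = 0 if p0 >= 0.5 else 1           # majority vote on the ancilla
            return sum(bits[k] / 2 ** k for k in range(1, n_bits + 1))

        print(iterative_phase_estimation(0.8125, 4))      # 0.8125 = 0.1101 in binary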

    Causal inference for long-term survival in randomised trials with treatment switching: Should re-censoring be applied when estimating counterfactual survival times?

    Treatment switching often has a crucial impact on estimates of effectiveness and cost-effectiveness of new oncology treatments. Rank-preserving structural failure time models (RPSFTM) and two-stage estimation (TSE) methods estimate ‘counterfactual’ (i.e. had there been no switching) survival times and incorporate re-censoring to guard against informative censoring in the counterfactual dataset. However, re-censoring causes a loss of longer-term survival information, which is problematic when estimates of long-term survival effects are required, as is often the case for health technology assessment decision making. We present a simulation study designed to investigate applications of the RPSFTM and TSE with and without re-censoring, to determine whether re-censoring should always be recommended within adjustment analyses. We investigate a context where switching is from the control group onto the experimental treatment, in scenarios with varying switch proportions, treatment effect sizes and time-dependencies, disease severity and switcher prognosis. Methods were assessed according to their estimation of control group restricted mean survival (that which would be observed in the absence of switching) at the end of the simulated trial follow-up. We found that RPSFTM and TSE analyses which incorporated re-censoring usually produced negative bias (i.e. under-estimating control group restricted mean survival and therefore over-estimating the treatment effect). RPSFTM and TSE analyses that did not incorporate re-censoring consistently produced positive bias (i.e. under-estimating the treatment effect), which was often smaller in magnitude than the bias associated with the re-censored analyses. We believe that analyses should be conducted with and without re-censoring, as this may provide decision makers with useful information on where the true treatment effect is likely to lie. Analyses that incorporate re-censoring should not always represent the default approach when the objective is to estimate long-term survival times and treatment effects on long-term survival.
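
    A common formulation of the counterfactual survival times and the re-censoring rule discussed above, with generic notation that may differ in detail from the paper's. For patient $i$, with observed survival time split into time off and time on the experimental treatment, the counterfactual (switch-free) time is

        $U_i(\psi) = T_i^{\mathrm{off}} + e^{\psi}\, T_i^{\mathrm{on}}$

    where $e^{-\psi}$ is the acceleration factor attributed to treatment. Re-censoring replaces the administrative censoring time $C_i$ by

        $D_i^{*}(\psi) = \min\{C_i,\ e^{\psi} C_i\}$

    and sets $U_i^{*}(\psi) = \min\{U_i(\psi),\ D_i^{*}(\psi)\}$, flagging the observation as censored whenever the counterfactual time is cut back. This guards against informative censoring in the counterfactual dataset, but at the cost of the longer-term survival information referred to above.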

    Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review

    Background: Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. Methods: We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. Results: For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Conclusions: Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
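
    Two of the simpler summary-statistic replacements mentioned above, sketched for illustration: the widely used range-based SD approximation and a quartile-based mean approximation. These may not match the exact formulae evaluated in the review.

        def sd_from_range(minimum, maximum):
            """Rough SD replacement from the sample range: SD ~ (max - min) / 4."""
            return (maximum - minimum) / 4.0

        def mean_from_quartiles(lower_quartile, median, upper_quartile):
            """Mean replacement from the quartiles: mean ~ (q1 + median + q3) / 3."""
            return (lower_quartile + median + upper_quartile) / 3.0

        # Example: a trial reporting only median (IQR) and range for a skewed outcome.
        print(sd_from_range(2.0, 18.0))            # -> 4.0
        print(mean_from_quartiles(4.0, 6.0, 9.5))  # -> 6.5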

    The groupoid approach to Leavitt path algebras

    When the theory of Leavitt path algebras was already quite advanced, it was discovered that some of the more difficult questions were susceptible to a new approach using topological groupoids. The main result that makes this possible is that the Leavitt path algebra of a graph is graded isomorphic to the Steinberg algebra of the graph’s boundary path groupoid. This expository paper has three parts: Part 1 is on the Steinberg algebra of a groupoid, Part 2 is on the path space and boundary path groupoid of a graph, and Part 3 is on the Leavitt path algebra of a graph. It is a self-contained reference on these topics, intended to be useful to beginners and experts alike. While revisiting the fundamentals, we prove some results in greater generality than can be found elsewhere, including the uniqueness theorems for Leavitt path algebras.
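
    The central result referred to above, stated in the notation commonly used in this literature (assumed here rather than quoted from the paper): for a graph $E$, a coefficient ring $K$, and the boundary path groupoid $\mathcal{G}_E$,

        $L_K(E) \;\cong_{\mathrm{gr}}\; A_K(\mathcal{G}_E)$

    where $A_K(\mathcal{G}_E)$ is the Steinberg algebra and the isomorphism respects the canonical $\mathbb{Z}$-grading on both sides.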