124 research outputs found

    Assurance calculations for planning clinical trials with time-to-event outcomes

    Get PDF
    We consider the use of the assurance method in clinical trial planning. In the assurance method, which is an alternative to a power calculation, we calculate the probability of a clinical trial resulting in a successful outcome, by eliciting a prior probability distribution for the relevant treatment effect. This is typically a hybrid Bayesian-frequentist procedure, in that it is usually assumed that the trial data will be analysed using a frequentist hypothesis test, so that the prior distribution is only used to calculate the probability of observing the desired outcome in the frequentist test. We argue that assessing the probability of a successful clinical trial is a useful part of the trial planning process. We develop assurance methods to accommodate survival outcome measures, assuming both parametric and nonparametric models. We also develop prior elicitation procedures for each survival model so that the assurance calculations can be performed more easily and reliably. We have made free software available for implementing our methods.
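    The hybrid calculation described above lends itself to a simple Monte Carlo sketch: draw the treatment effect from the elicited prior, simulate the frequentist analysis, and average the success indicator. The sketch below is not the authors' software; it assumes a normal approximation to the estimated log hazard ratio (variance 4/d with d events and 1:1 allocation), and all numerical values (prior, number of events, significance level) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Elicited prior on the log hazard ratio (hypothetical values for illustration):
# true effect theta ~ Normal(mean=log(0.75), sd=0.25).
prior_mean, prior_sd = np.log(0.75), 0.25

n_events = 300        # anticipated number of events in the trial (assumption)
n_sims = 100_000      # Monte Carlo samples
z_crit = 1.96         # two-sided 5% critical value

# Standard approximation: with 1:1 allocation and d events, the estimated
# log hazard ratio is approximately Normal(theta, 4/d).
se = np.sqrt(4 / n_events)

theta = rng.normal(prior_mean, prior_sd, size=n_sims)   # true effects from the prior
theta_hat = rng.normal(theta, se)                       # simulated trial estimates

# "Success" = a significant result favouring the new treatment (HR < 1).
assurance = np.mean(theta_hat / se < -z_crit)
print(f"Estimated assurance: {assurance:.3f}")
```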

    Interventions to reduce the risk of surgically transmitted Creutzfeldt–Jakob disease: a cost-effective modelling review

    Get PDF
    Background: Creutzfeldt–Jakob disease is a fatal neurological disease caused by abnormal infectious proteins called prions. Prions that are present on surgical instruments cannot be completely deactivated; therefore, patients who are subsequently operated on using these instruments may become infected. This can result in surgically transmitted Creutzfeldt–Jakob disease. Objective: To update the literature reviews, consultation with experts and economic modelling published in 2006, and to estimate the cost-effectiveness of strategies to reduce the risk of surgically transmitted Creutzfeldt–Jakob disease. Methods: Eight systematic reviews of clinical parameters and one review of cost-effectiveness were undertaken. Electronic databases including MEDLINE and EMBASE were searched from 2005 to 2017. Expert elicitation sessions were undertaken. An advisory committee, convened by the National Institute for Health and Care Excellence to produce guidance, provided an additional source of information. A mathematical model was updated focusing on brain and posterior eye surgery and neuroendoscopy. The model simulated both patients and instrument sets. Assuming that there were potentially 15 cases of surgically transmitted Creutzfeldt–Jakob disease between 2005 and 2018, approximate Bayesian computation was used to obtain samples from the posterior distribution of the model parameters and generate results, with heuristics used to improve computational efficiency. The modelling conformed to the National Institute for Health and Care Excellence reference case. The strategies evaluated were: neither keeping instruments moist nor prohibiting set migration; keeping instruments moist; prohibiting instrument migration between sets; and employing single-use instruments. Threshold analyses were undertaken to establish the prices at which single-use sets or completely effective decontamination solutions would be cost-effective. Results: A total of 169 papers were identified for the clinical review. The evidence from the published literature was not deemed sufficiently strong to take precedence over the distributions obtained from expert elicitation. Forty-eight papers were identified in the review of cost-effectiveness. The previous modelling structure was revised to add the possibility of misclassifying surgically transmitted Creutzfeldt–Jakob disease as another neurodegenerative disease, and to assume that all patients were susceptible to infection. Keeping instruments moist was estimated to reduce the risk of surgically transmitted Creutzfeldt–Jakob disease cases and the associated costs. Based on probabilistic sensitivity analyses, keeping instruments moist was estimated to result in an average of 2.36 (range 0–47) surgically transmitted Creutzfeldt–Jakob disease cases across England caused by infection occurring between 2019 and 2023. Prohibiting set migration or employing single-use instruments reduced the estimated risk of surgically transmitted Creutzfeldt–Jakob disease cases further, but at considerable cost. The estimated costs per quality-adjusted life-year gained of these strategies, in addition to keeping instruments moist, were in excess of £1M. It was estimated that single-use instrument sets (currently £350–500) or completely effective cleaning solutions would need to cost approximately £12 per patient to be cost-effective at a threshold of £30,000 per quality-adjusted life-year gained. Limitations: As no direct published evidence implicating surgery as a cause of Creutzfeldt–Jakob disease has been found since 2005, the estimates of potential cases from elicitation remain speculative. A particular source of uncertainty was the number of potential surgically transmitted Creutzfeldt–Jakob disease cases that may have occurred between 2005 and 2018. Conclusions: Keeping instruments moist is estimated to reduce the risk of surgically transmitted Creutzfeldt–Jakob disease cases and the associated costs. Further surgical management strategies can reduce the risks of surgically transmitted Creutzfeldt–Jakob disease but have considerable associated costs. Study registration: This study is registered as PROSPERO CRD42017071807. Funding: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 24, No. 11. See the NIHR Journals Library website for further project information.
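    As a rough illustration of the approximate Bayesian computation step described above, the sketch below runs a basic ABC rejection sampler against the observed count of 15 potential cases. The toy simulator is a hypothetical stand-in for the full patient/instrument model, and the prior, tolerance and scaling are illustrative assumptions rather than values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

observed_cases = 15       # potential stCJD cases 2005-2018, as in the abstract
tolerance = 2             # accept simulations within +/- 2 cases (illustrative)
n_draws = 200_000

# Vague prior on a single "transmission risk" parameter (assumption).
risk_prior = rng.uniform(0.0, 1.0, size=n_draws)

# Hypothetical stand-in for the full patient/instrument simulation: a Poisson
# count of transmitted cases whose mean scales with the risk parameter.
simulated_cases = rng.poisson(lam=risk_prior * 100.0)

# ABC rejection step: keep parameter draws whose simulated output is close
# to the observed count.
accepted = risk_prior[np.abs(simulated_cases - observed_cases) <= tolerance]
print(f"accepted {accepted.size} of {n_draws} draws; "
      f"posterior mean risk ~ {accepted.mean():.3f}; "
      f"95% interval ~ {np.percentile(accepted, [2.5, 97.5]).round(3)}")
```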

    Toward a unified framework for model calibration and optimisation in virtual engineering workflows

    Get PDF
    When designing a new product it is often advantageous to use virtual engineering as either a replacement for, or an assistant to, more traditional prototyping. Virtual engineering consists of two main stages: (i) development of the simulation model; (ii) use of the model in design optimisation. There is a vast literature on both of these stages in isolation, but virtually no studies have considered them in combination. Crucially, however, the model calibration and design optimisation processes both draw on the same resource budget for simulation evaluations. When evaluations are expensive, there may be advantages in treating the two stages as combined. This study lays out a joint framework by which such problems can be expressed through a unified mathematical notation. A previously published case study is reviewed within the context of this framework, and directions for further development are discussed.

    Incorporating genuine prior information about between-study heterogeneity in random effects pairwise and network meta-analyses

    Get PDF
    Background: Pairwise and network meta-analyses using fixed effect and random effects models are commonly applied to synthesise evidence from randomised controlled trials. The models differ in their assumptions and the interpretation of the results. The choice of model depends on the objective of the analysis and knowledge of the included studies. Fixed effect models are often used because there are too few studies with which to estimate the between-study standard deviation from the data alone. Objectives: The aim is to propose a framework for eliciting an informative prior distribution for the between-study standard deviation in a Bayesian random effects meta-analysis model, to genuinely represent heterogeneity when data are sparse. Methods: We developed an elicitation method that uses external information, such as empirical evidence and experts' beliefs about the 'range' of treatment effects, to infer a prior distribution for the between-study standard deviation. We also implemented the method in R. Results: The three-stage elicitation approach allows uncertainty to be represented by a genuine prior distribution, avoiding misleading inferences. It is flexible with respect to the judgements an expert can provide, and is applicable to all types of outcome measure for which a treatment effect can be constructed on an additive scale. Conclusions: The choice between a fixed effect and a random effects meta-analysis model depends on the inferences required and not on the number of available studies. Our elicitation framework captures external evidence about heterogeneity and overcomes the often implausible assumption that studies are estimating the same treatment effect, thereby improving the quality of inferences in decision making.
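    One simple way to see how an elicited 'range' of treatment effects can induce a prior for the between-study standard deviation is sketched below. This is not the authors' three-stage procedure: it assumes the expert expresses uncertainty about the width W of the central 95% range of true study-specific effects, converts each sampled width to tau = W / (2 x 1.96) under a normal random-effects model, and summarises the result as a lognormal prior. All distributions and numbers are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical elicited judgement: uncertainty about the width W of the central
# 95% range of true study-specific log odds ratios, represented here by a
# Gamma distribution chosen purely for illustration.
w_samples = rng.gamma(shape=6.0, scale=0.15, size=100_000)

# If true effects are Normal(mu, tau^2), a central 95% range of width W
# corresponds to tau = W / (2 * 1.96).
tau_samples = w_samples / (2 * 1.96)

# Summarise as a lognormal prior for tau, convenient for a Bayesian
# random effects meta-analysis.
mu_log, sd_log = np.log(tau_samples).mean(), np.log(tau_samples).std()
print(f"Induced prior: tau ~ LogNormal(mu={mu_log:.3f}, sigma={sd_log:.3f})")
print("2.5%, 50%, 97.5% quantiles of tau:",
      np.round(stats.lognorm(s=sd_log, scale=np.exp(mu_log)).ppf([0.025, 0.5, 0.975]), 3))
```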

    Component-level study of a decomposition-based multi-objective optimizer on a limited evaluation budget

    Get PDF
    Decomposition-based algorithms have emerged as one of the most popular classes of solvers for multi-objective optimization. Despite their popularity, little guidance exists on how to configure such algorithms for real-world problems, based on the features or contexts of those problems. One context that is important for many real-world problems is that function evaluations are expensive, so algorithms need to provide adequate convergence on a limited budget (e.g. 500 evaluations). This study contributes to emerging guidance on algorithm configuration by investigating how the convergence of the popular decomposition-based optimizer MOEA/D, over a limited budget, is affected by the choice of component-level configuration. Two main aspects are considered: (1) the impact of sharing information; (2) the impact of the normalisation scheme. The empirical test framework includes detailed trajectory analysis, as well as more conventional performance indicator analysis, to help identify and explain the behaviour of the optimizer. Use of neighbours in generating new solutions is found to be highly disruptive for searching on a small budget, leading to better convergence in some areas but far worse convergence in others. The findings also emphasise the challenge and importance of using an appropriate normalisation scheme.
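    The role of the normalisation scheme can be illustrated with the weighted Tchebycheff scalarisation commonly used in MOEA/D-style decomposition. The sketch below is generic rather than the paper's exact configuration; the example objective values and reference points are made up to show how objectives on very different scales can distort the scalarised ranking when left unnormalised.

```python
import numpy as np

def tchebycheff(objs, weights, ideal, nadir=None):
    """Weighted Tchebycheff scalarisation for one decomposition subproblem.

    objs    : (n_points, n_obj) objective values
    weights : (n_obj,) weight vector for the subproblem
    ideal   : (n_obj,) best value seen so far for each objective
    nadir   : optional (n_obj,) worst values; if given, objectives are
              normalised to roughly [0, 1] before scalarising.
    """
    objs = np.atleast_2d(objs).astype(float)
    diff = objs - ideal
    if nadir is not None:                       # normalisation scheme
        diff = diff / np.maximum(nadir - ideal, 1e-12)
    return np.max(weights * diff, axis=1)

# Two candidate solutions on a bi-objective problem whose objectives live
# on very different scales.
objs = np.array([[0.2, 800.0],
                 [0.6, 200.0]])
ideal, nadir = np.array([0.0, 100.0]), np.array([1.0, 1000.0])
w = np.array([0.5, 0.5])
print("unnormalised scores:", tchebycheff(objs, w, ideal))
print("normalised scores:  ", tchebycheff(objs, w, ideal, nadir))
```

    In the unnormalised case the second objective dominates the scalarised score purely because of its scale, whereas normalisation makes the two candidates comparable; this is the kind of effect the abstract's normalisation comparison is probing.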

    Efficient History Matching of a High Dimensional Individual-Based HIV Transmission Model

    Get PDF
    History matching is a model (pre-)calibration method that has been applied to computer models from a wide range of scientific disciplines. In this work we apply history matching to an individual-based epidemiological model of HIV that has 96 input and 50 output parameters, a model of much larger scale than others that have been calibrated before using this or similar methods. Apart from demonstrating that history matching can analyze models of this complexity, a central contribution of this work is that the history match is carried out using linear regression, a statistical tool that is elementary and easier to implement than the Gaussian process-based emulators that have previously been used. Furthermore, we address a practical difficulty with history matching, namely the sampling of tiny non-implausible spaces, by introducing a sampling algorithm adjusted to the specific needs of this method. The effectiveness and simplicity of the history matching method presented here show that it is a useful tool for the calibration of computationally expensive, high-dimensional, individual-based models.
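    A minimal sketch of one history-matching wave with a linear regression emulator is given below. The two-input simulator function is a hypothetical stand-in for the 96-input HIV model, only a single output is matched, and the observation variance, design size and the conventional implausibility cutoff of 3 are illustrative assumptions rather than settings from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cheap stand-in for the expensive individual-based simulator.
def simulator(x):
    return 3.0 * x[:, 0] - 2.0 * x[:, 1] + 0.5 * x[:, 0] * x[:, 1]

observed_z, obs_var = 1.0, 0.05**2        # target output and observation variance

# Wave 1: fit a linear regression emulator to a small design of simulator runs.
X_train = rng.uniform(-1, 1, size=(40, 2))
y_train = simulator(X_train)
Phi = np.column_stack([np.ones(len(X_train)), X_train])       # linear basis
beta, res, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)
emul_var = res[0] / (len(y_train) - Phi.shape[1])              # crude emulator variance

# Implausibility I(x) = |z - E[f(x)]| / sqrt(Var_emulator + Var_obs);
# candidate inputs with I(x) > 3 are ruled out.
X_cand = rng.uniform(-1, 1, size=(100_000, 2))
pred = np.column_stack([np.ones(len(X_cand)), X_cand]) @ beta
implaus = np.abs(observed_z - pred) / np.sqrt(emul_var + obs_var)
non_implausible = X_cand[implaus <= 3.0]
print(f"Non-implausible fraction after wave 1: {len(non_implausible) / len(X_cand):.3f}")
```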

    Bayesian history matching of complex infectious disease models using emulation: A tutorial and a case study on HIV in Uganda

    Get PDF
    Advances in scientific computing have allowed the development of complex models that are routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real-world data is greatly hindered both by large numbers of input and output parameters and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study in which we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was times smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs.
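    For reference, the discard step in history matching typically uses an implausibility measure of the following standard form (the notation below is generic rather than copied from the paper; the v terms denote the emulator, observation and model-discrepancy variances):

```latex
I_i(x) \;=\; \frac{\left| z_i - \mathrm{E}\!\left[f_i(x)\right] \right|}
                  {\sqrt{\mathrm{Var}\!\left[f_i(x)\right] + v_i^{\mathrm{obs}} + v_i^{\mathrm{disc}}}},
\qquad
I(x) \;=\; \max_i I_i(x),
```

    where z_i is the i-th observed output and f_i(x) the corresponding simulator output; input points with I(x) above a cutoff (commonly 3) are deemed implausible and discarded in the next wave.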

    Gaussian process manifold interpolation for probabilistic atrial activation maps and uncertain conduction velocity

    Get PDF
    In patients with atrial fibrillation, local activation time (LAT) maps are routinely used for characterizing patient pathophysiology. The gradient of LAT maps can be used to calculate conduction velocity (CV), which directly relates to material conductivity and may provide an important measure of atrial substrate properties. Including uncertainty in CV calculations would help with interpreting the reliability of these measurements. Here, we build upon a recent insight into reduced-rank Gaussian processes (GPs) to perform probabilistic interpolation of uncertain LAT directly on human atrial manifolds. Our Gaussian process manifold interpolation (GPMI) method accounts for the topology of the atrium, and allows for calculation of statistics for predicted CV. We demonstrate our method on two clinical cases, and perform validation against a simulated ground truth. CV uncertainty depends on data density, wave propagation direction and CV magnitude. GPMI is suitable for probabilistic interpolation of other uncertain quantities on non-Euclidean manifolds. This article is part of the theme issue ‘Uncertainty quantification in cardiac and cardiovascular modelling and simulation’.
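    The reduced-rank construction described above can be sketched with a graph Laplacian eigenbasis standing in for the manifold's harmonics. The code below is not the authors' GPMI implementation: it uses a toy ring graph instead of a triangulated atrial surface, a heat-kernel-style spectral weighting instead of the paper's covariance choice, and synthetic LAT-like observations; all hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "manifold": a closed ring of n mesh nodes (the real method works on
# a triangulated atrial surface; this toy graph keeps the sketch short).
n = 200
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
L = np.diag(W.sum(axis=1)) - W                 # graph Laplacian

# Reduced-rank basis: the leading Laplacian eigenvectors play the role of the
# manifold's "Fourier" basis functions.
evals, evecs = np.linalg.eigh(L)
m = 30                                         # number of basis functions kept
Phi, lam = evecs[:, :m], evals[:m]

# Heat-kernel-style spectral weights (an illustrative prior covariance choice).
ell, sig_f, sig_n = 4.0, 1.0, 0.1
S = sig_f**2 * np.exp(-0.5 * ell**2 * lam)     # prior variance of each weight

# Noisy LAT-like observations at a few nodes of a smooth synthetic field.
obs_idx = rng.choice(n, size=25, replace=False)
true_field = np.sin(2 * np.pi * np.arange(n) / n)
y = true_field[obs_idx] + sig_n * rng.normal(size=obs_idx.size)

# Weight-space GP regression: w ~ N(0, diag(S)), y = Phi_obs @ w + noise.
Phi_obs = Phi[obs_idx]
A = Phi_obs.T @ Phi_obs / sig_n**2 + np.diag(1.0 / S)
w_cov = np.linalg.inv(A)
w_mean = w_cov @ Phi_obs.T @ y / sig_n**2

f_mean = Phi @ w_mean                          # interpolated field on all nodes
f_sd = np.sqrt(np.einsum('ij,jk,ik->i', Phi, w_cov, Phi))
print(f"max predictive sd: {f_sd.max():.3f}, "
      f"rmse vs truth: {np.sqrt(np.mean((f_mean - true_field)**2)):.3f}")
```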

    Sequential design of computer experiments for the estimation of a probability of failure

    Full text link
    This paper deals with the problem of estimating the volume of the excursion set of a function f: \mathbb{R}^d \to \mathbb{R} above a given threshold, under a probability measure on \mathbb{R}^d that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classical Monte Carlo methods ought to be avoided. One of the main contributions of this article is to derive SUR (stepwise uncertainty reduction) strategies from a Bayesian decision-theoretic formulation of the problem of estimating a probability of failure. These sequential strategies use a Gaussian process model of f and aim at performing evaluations of f as efficiently as possible to infer the value of the probability of failure. We compare these strategies to other strategies also based on a Gaussian process model for estimating a probability of failure. Comment: This is an author-generated postprint version. The published version is available at http://www.springerlink.co
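    A hedged sketch of the overall setting is given below: a Gaussian process surrogate is fitted to a cheap stand-in function f, the probability of failure is estimated by averaging the excursion probability over samples from the (assumed known) input distribution, and new evaluations are chosen sequentially. The acquisition rule used here (largest misclassification probability at the threshold) is a simple stand-in rather than one of the paper's SUR criteria, and all numerical settings are illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(5)

# Toy stand-in for the expensive simulator, and the failure threshold.
def f(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

threshold = 1.2

# The probability measure on the input space is assumed known: here N(0, I_2).
X_mc = rng.normal(size=(20_000, 2))

# Small initial design of expensive evaluations.
X = rng.normal(size=(10, 2))
y = f(X)

def fit_gp(X, y):
    return GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6,
                                    normalize_y=True).fit(X, y)

for _ in range(15):                               # sequential design loop
    gp = fit_gp(X, y)
    mean, sd = gp.predict(X_mc, return_std=True)
    sd = np.maximum(sd, 1e-12)
    # Evaluate next where the excursion/no-excursion classification is most
    # uncertain (a simple stand-in for the paper's SUR criteria).
    misclass = norm.cdf(-np.abs(mean - threshold) / sd)
    x_next = X_mc[np.argmax(misclass)][None, :]
    X, y = np.vstack([X, x_next]), np.append(y, f(x_next))

# Final estimate: average excursion probability under the fitted GP.
mean, sd = fit_gp(X, y).predict(X_mc, return_std=True)
p_fail = norm.cdf((mean - threshold) / np.maximum(sd, 1e-12)).mean()
print(f"estimated probability of failure after {len(y)} runs: {p_fail:.4f}")
```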