
    A scoping review of metamodeling applications and opportunities for advanced health economic analyses

    Introduction: Metamodels, also known as meta-models, surrogate models, or emulators, are used in several fields of research to mitigate runtime issues when analyzing computationally demanding simulation models. This study introduces metamodeling and presents the results of a review of metamodeling applications in health economics. Areas covered: A scoping review was performed to identify studies that applied metamodeling methods in a health economic context. After search and selection, 13 publications were found to employ metamodeling methods in health economics. Metamodels were used to perform value of information analysis (n = 5, 38%), deterministic sensitivity analysis (n = 4, 31%), model calibration (n = 1, 8%), probabilistic sensitivity analysis (n = 1, 8%), or optimization (n = 1, 8%). One study used a metamodel to extrapolate a simulation model to other countries (n = 1, 8%). The applied metamodeling techniques varied considerably between studies, with linear regression being the most frequently applied (n = 7, 54%). Expert commentary: Although metamodeling has great potential to enable computationally demanding analyses of health economic models, its use in health economics is still in its infancy, as illustrated by the limited number of applications and the relatively simple metamodeling methods applied. Comprehensive guidance specific to health economics is needed to provide modelers with the information and tools needed to utilize the full potential of metamodels.
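    The linear-regression metamodeling that this review found most common can be sketched in a few lines. The "expensive" simulator, its inputs, and all numbers below are hypothetical stand-ins, not taken from any of the reviewed studies:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical expensive simulator: net monetary benefit as a mildly
# nonlinear function of two uncertain health-economic inputs.
def simulator(effect, cost):
    return 5000.0 * effect - 1.2 * cost + 300.0 * effect * cost**0.5

# 1) Evaluate the simulator on a small design of experiments.
X = rng.uniform([0.1, 100.0], [0.9, 900.0], size=(50, 2))
y = np.array([simulator(e, c) for e, c in X])

# 2) Fit a linear regression metamodel (intercept + main effects).
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 3) The cheap metamodel now stands in for the simulator in analyses
#    that need many evaluations (VOI, PSA, calibration, ...).
def metamodel(effect, cost):
    return coef[0] + coef[1] * effect + coef[2] * cost

r2 = 1 - np.sum((y - A @ coef) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"in-sample R^2 = {r2:.3f}")
```

    Once fitted, each metamodel call is a dot product, so analyses requiring tens of thousands of model runs become tractable.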

    Sensitivity analysis of expensive black-box systems using metamodeling

    Simulations are becoming ever more common as a tool for designing complex products. Sensitivity analysis techniques can be applied to these simulations to gain insight, or to reduce the complexity of the problem at hand. However, these simulators are often expensive to evaluate, and sensitivity analysis typically requires a large number of evaluations. Metamodeling has been successfully applied in the past to reduce the number of required evaluations for design tasks such as optimization and design space exploration. In this paper, we propose a novel sensitivity analysis algorithm for variance- and derivative-based indices using sequential sampling and metamodeling. Several stopping criteria are proposed and investigated to keep the total number of evaluations minimal. The results show that both variance- and derivative-based indices can be accurately computed with a minimal number of evaluations using fast metamodels and FLOLA-Voronoi or density sequential sampling algorithms. Comment: proceedings of the Winter Simulation Conference 201
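    To illustrate the variance-based side of this, here is a minimal pick-freeze (Saltelli-style) estimator of first-order Sobol indices. In the paper's setting the many evaluations would run against a fitted metamodel rather than the expensive simulator; the additive test function below is a hypothetical stand-in for such a cheap surrogate:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical cheap surrogate of an expensive black box with three
# inputs on [0, 1]; x0 dominates, x2 is nearly inert.
def surrogate(x):
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 0.1 * x[:, 2]

def first_order_sobol(model, d, n):
    """Saltelli 2010 pick-freeze estimator of first-order indices."""
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = model(A), model(B)
    var = fA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # freeze all columns except i
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

S = first_order_sobol(surrogate, d=3, n=20_000)
print(np.round(S, 3))
```

    For this additive function the exact indices are proportional to the squared coefficients, so S should come out near [0.80, 0.20, 0.00]; the 60,000 surrogate calls replace what would otherwise be 60,000 expensive simulator runs.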

    Metamodel variability analysis combining bootstrapping and validation techniques

    Research on metamodel-based optimization has received considerable and growing interest in recent years, and has found successful applications in solving computationally expensive problems. The joint use of computer simulation experiments and metamodels introduces a source of uncertainty that we refer to as metamodel variability. To analyze and quantify this variability, we apply bootstrapping to residuals derived as prediction errors computed from cross-validation. The proposed method can be used with different types of metamodels, especially when limited knowledge of the parameters' distribution is available or when the computational budget is limited. Our preliminary experiments, based on the robust version of the EOQ model, show encouraging results.
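    The combination of cross-validation residuals and bootstrapping can be sketched as follows. This is a generic illustration with a toy linear metamodel and made-up data, not the paper's EOQ experiment:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy design and noisy response standing in for simulation output.
X = rng.uniform(0, 1, size=(30, 2))
y = 3.0 * X[:, 0] - X[:, 1] + rng.normal(0, 0.1, 30)

def fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

# 1) Leave-one-out cross-validation prediction errors.
res = np.empty(len(y))
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    c = fit(X[mask], y[mask])
    res[i] = y[i] - predict(c, X[i:i + 1])[0]

# 2) Bootstrap those residuals around the point prediction to
#    quantify metamodel variability at a new input.
coef = fit(X, y)
x_new = np.array([[0.5, 0.5]])
boot = predict(coef, x_new)[0] + rng.choice(res, size=2000, replace=True)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% metamodel interval at x_new: [{lo:.2f}, {hi:.2f}]")
```

    The width of the resulting interval is a direct, distribution-free read-out of metamodel variability at that input, which is what makes the approach attractive under a limited computational budget.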

    Comparing Strategies to Prevent Stroke and Ischemic Heart Disease in the Tunisian Population: Markov Modeling Approach Using a Comprehensive Sensitivity Analysis Algorithm.

    Background. Mathematical models offer the potential to analyze and compare the effectiveness of very different interventions to prevent future cardiovascular disease. We developed a comprehensive Markov model to assess the impact of three interventions to reduce ischemic heart disease (IHD) and stroke deaths: (i) improved medical treatments in the acute phase, (ii) secondary prevention by increasing the uptake of statins, and (iii) primary prevention using health promotion to reduce dietary salt consumption. Methods. We developed and validated a Markov model for the Tunisian population aged 35–94 years over a 20-year time horizon. We compared the impact of specific treatments for stroke, lifestyle changes, and primary prevention on both IHD and stroke deaths. We then undertook extensive sensitivity analyses using both a probabilistic multivariate approach and simple linear regression (metamodeling). Results. The model forecast a dramatic rise in mortality, with 111,134 IHD and stroke deaths (95% CI 106,567 to 115,048) predicted in 2025 in Tunisia. Salt reduction offered the most powerful preventive intervention, potentially reducing IHD and stroke deaths by 27% (−30,240 [−30,580 to −29,900]), compared with 1% for medical strategies and 3% for secondary prevention. The metamodeling highlighted that the initial development of a minor stroke substantially increased the subsequent probability of a fatal stroke or IHD death. Conclusions. The primary prevention of cardiovascular disease via a reduction in dietary salt consumption appeared much more effective than secondary or tertiary prevention approaches. Our simple but comprehensive model offers a potentially attractive methodological approach that might now be extended and replicated in other contexts and populations.
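    The mechanics of such a Markov cohort model are compact. The sketch below uses three states and invented annual transition probabilities (none of them are the paper's Tunisian estimates) to show how a lower stroke incidence, as from salt reduction, propagates to fewer cumulative deaths:

```python
import numpy as np

def run_cohort(p_stroke, cycles=20):
    """Three-state Markov cohort: [well, post-stroke, dead].

    All transition probabilities are hypothetical; rows sum to 1.
    Returns the fraction of the cohort dead after `cycles` years.
    """
    P = np.array([
        [1 - p_stroke - 0.01, p_stroke, 0.01],  # well
        [0.0,                 0.90,     0.10],  # post-stroke
        [0.0,                 0.0,      1.00],  # dead (absorbing)
    ])
    state = np.array([1.0, 0.0, 0.0])  # whole cohort starts well
    for _ in range(cycles):
        state = state @ P  # one annual cycle
    return state[2]

baseline = run_cohort(p_stroke=0.05)
salt_cut = run_cohort(p_stroke=0.035)  # 30% lower stroke incidence
print(f"deaths averted: {100 * (baseline - salt_cut):.1f}% of cohort")
```

    Sensitivity analysis then amounts to rerunning this loop with sampled transition probabilities, and metamodeling to regressing the output (deaths) on those sampled inputs.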

    Portolan: a Model-Driven Cartography Framework

    Processing large amounts of data to extract useful information is an essential task within companies. To help with this task, visualization techniques are commonly used because of their capacity to present data in synthesized views that are easier to understand and manage. However, achieving the right visualization display for a data set is a complex cartography process that involves several transformation steps to adapt the (domain) data to the (visualization) data format expected by visualization tools. To maximize the benefits of visualization we propose Portolan, a generic model-driven cartography framework that facilitates the discovery of the data to visualize, the specification of view definitions for that data, and the transformations that bridge the gap with the visualization tools. Our approach has been implemented on top of the Eclipse EMF modeling framework and validated on three different use cases.
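    The core cartography step, a transformation from domain data to the format a visualization tool expects, can be illustrated independently of Portolan's actual EMF-based machinery. The sketch below (all names and data are illustrative, not Portolan's API) maps hypothetical module-dependency records into Graphviz DOT:

```python
# Hypothetical domain data: (module, depends-on) pairs.
deps = [("billing", "db"), ("billing", "auth"), ("web", "billing")]

def to_graphviz(edges):
    """Transform domain edges into the DOT format a graph viewer expects."""
    nodes = sorted({n for e in edges for n in e})
    lines = ["digraph deps {"]
    lines += [f'  "{n}";' for n in nodes]
    lines += [f'  "{a}" -> "{b}";' for a, b in edges]
    lines.append("}")
    return "\n".join(lines)

print(to_graphviz(deps))
```

    A model-driven framework generalizes exactly this pattern: the view definition declares which domain elements become nodes and edges, and the transformation to each target tool's format is generated rather than hand-written.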