6 research outputs found

    Global sensitivity analysis of stochastic computer models with joint metamodels

    Global sensitivity analysis, which quantifies the influence of uncertain input variables on the variability of numerical model responses, has so far been applied mainly to deterministic computer codes; deterministic means here that the same set of input values always gives the same output value. This paper proposes a global sensitivity analysis methodology for stochastic computer codes, for which the result of each code run is itself random. The framework of joint modeling of the mean and dispersion of heteroscedastic data is used. To deal with the complexity of computer experiment outputs, nonparametric joint models are discussed and a new Gaussian process-based joint model is proposed. The relevance of these models is analyzed in two case studies. Results show that the joint modeling approach yields accurate sensitivity index estimators even when heteroscedasticity is strong.
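The mean/dispersion joint-modeling idea in this abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the stochastic simulator is an invented toy function, and simple polynomial smoothers stand in for the paper's Gaussian-process joint model.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_code(x, n_rep=200):
    # toy stochastic simulator: mean sin(2*pi*x), noise sd grows with x
    return np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1 + 0.5 * x, size=n_rep)

X = np.linspace(0.0, 1.0, 21)
reps = np.array([stochastic_code(x) for x in X])   # replicated runs per design point
mean_y = reps.mean(axis=1)                         # empirical mean response
log_disp = np.log(reps.var(axis=1))                # empirical log-dispersion

# Joint model: one smoother for the mean component, one for the log-dispersion
# component (polynomials used here in place of Gaussian processes).
mean_fit = np.polynomial.Polynomial.fit(X, mean_y, deg=5)
disp_fit = np.polynomial.Polynomial.fit(X, log_disp, deg=2)

x0 = 0.25
print("predicted mean at 0.25:", mean_fit(x0))             # truth: sin(pi/2) = 1
print("predicted noise variance at 0.25:", np.exp(disp_fit(x0)))
```

Once both components are fitted, sensitivity indices can be derived from each: the mean model drives the usual variance-based indices, while the dispersion model captures how inputs control the output's intrinsic randomness.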

    Challenges of Modeling Outcomes for Surgical Infections: A Word of Caution.

    Background: We developed a novel analytic tool for colorectal deep organ/space surgical site infection (C-OSI) prediction utilizing both institutional and extra-institutional American College of Surgeons-National Surgical Quality Improvement Program (ACS-NSQIP) data. Methods: Elective colorectal resections (2006-2014) were included. The primary end point was C-OSI rate. A Bayesian-Probit regression model with multiple imputation (BPMI) via Dirichlet process handled missing data. The baseline model for comparison was a multivariable logistic regression model (generalized linear model; GLM) with indicator parameters for missing data and stepwise variable selection. Out-of-sample performance was evaluated with receiver operating characteristic (ROC) analysis of 10-fold cross-validated samples. Results: Among 2,376 resections, the C-OSI rate was 4.6% (n = 108). The BPMI model identified 57 of these patients (56% sensitivity) when set at a threshold yielding 80% specificity (approximately a 20% false-alarm rate). The BPMI model produced an area under the curve (AUC) of 0.78 via 10-fold cross-validation, demonstrating high predictive accuracy. In contrast, the traditional GLM approach produced an AUC of 0.71 and a corresponding sensitivity of 0.47 at 80% specificity, both statistically significant differences. In addition, when the model was built utilizing extra-institutional data via inclusion of all (non-Mayo Clinic) patients in ACS-NSQIP, C-OSI prediction was less accurate, with an AUC of 0.74 and sensitivity of 0.47 (i.e., a 19% relative performance decrease) when applied to patients at our institution. Conclusions: Although the statistical methodology associated with the BPMI model provides advantages over conventional handling of missing data, the tool should be built with data specific to the individual institution to optimize performance.
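The 10-fold cross-validated ROC evaluation described in this abstract can be sketched as follows. Since the ACS-NSQIP data are not public, this minimal example uses synthetic imbalanced data (mimicking a rare outcome such as the 4.6% C-OSI rate) and a plain logistic model, i.e., only the GLM side of the paper's comparison.

```python
# Sketch of 10-fold cross-validated AUC for a baseline logistic model,
# on synthetic data invented for this example (not the paper's dataset).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# imbalanced binary outcome: roughly 5% positives
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.95, 0.05], random_state=0)

# stratified folds keep the rare-outcome rate stable across splits
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
aucs = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                       cv=cv, scoring="roc_auc")
print("10-fold CV AUC: %.3f +/- %.3f" % (aucs.mean(), aucs.std()))
```

Stratified folds matter here: with only ~5% positives, unstratified splits can leave a fold with too few events for a stable ROC estimate.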

    A Review on Global Sensitivity Analysis Methods

    This chapter reviews, within a complete methodological framework, various global sensitivity analysis methods for model output. Numerous statistical and probabilistic tools (regression, smoothing, tests, statistical learning, Monte Carlo, etc.) aim at determining the model input variables that contribute most to a quantity of interest depending on the model output. This quantity can be, for instance, the variance of an output variable. Three kinds of methods are distinguished: screening (coarse sorting of the most influential inputs among a large number), measures of importance (quantitative sensitivity indices), and deep exploration of the model behaviour (measuring the effects of inputs over their entire range of variation). A progressive application methodology is illustrated on a pedagogical example. A synthesis is given that places each method along several axes, mainly the cost in number of model evaluations, the model complexity, and the nature of the information provided.
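One of the quantitative "measures of importance" surveyed in such reviews is the first-order Sobol index, which can be estimated by a pick-freeze Monte Carlo scheme. Below is a minimal sketch on the Ishigami function, a standard sensitivity-analysis benchmark chosen here for illustration (it is not taken from the chapter).

```python
import numpy as np

def ishigami(x):
    # standard GSA benchmark; analytic first-order indices: S1~0.31, S2~0.44, S3=0
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(0)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))   # two independent input samples
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = ishigami(A), ishigami(B)
total_var = np.concatenate([yA, yB]).var()

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # "pick" column i from B, "freeze" the rest
    # Saltelli-style estimator of the first-order index S_i
    S.append(np.mean(yB * (ishigami(ABi) - yA)) / total_var)

print("first-order Sobol indices:", np.round(S, 3))
```

The cost axis mentioned in the abstract is visible here: this scheme needs n*(d+2) model evaluations, which is why metamodel-based shortcuts (as in the first result above) matter for expensive codes.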