
    Model parameter estimation and uncertainty analysis: a report of the ISPOR-SMDM modeling good research practices task force working group - 6

    A model’s purpose is to inform medical decisions and health care resource allocation. Modelers employ quantitative methods to structure the clinical, epidemiological, and economic evidence base and gain qualitative insight to assist decision makers in making better decisions. From a policy perspective, the value of a model-based analysis lies not simply in its ability to generate a precise point estimate for a specific outcome but also in the systematic examination and responsible reporting of the uncertainty surrounding that outcome and the ultimate decision being addressed. Several concepts relating to uncertainty in decision modeling are explored. Stochastic (first-order) uncertainty is distinguished from parameter (second-order) uncertainty and from heterogeneity, with structural uncertainty relating to the model itself forming a further level of uncertainty to consider. The article argues that the estimation of point estimates and of uncertainty in parameters is part of a single process, and traces the link from parameter uncertainty through decision uncertainty to value-of-information analysis. The article also makes extensive recommendations on the reporting of uncertainty, covering both deterministic sensitivity analysis techniques and probabilistic methods. Expected value of perfect information is argued to be the most appropriate presentational technique, alongside cost-effectiveness acceptability curves, for representing decision uncertainty from probabilistic analysis.
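The recommended outputs of a probabilistic analysis can be sketched numerically. The toy Python example below is only an illustration: the distributions, net-monetary-benefit figures, and the `psa_draws` helper are invented stand-ins for a real probabilistic sensitivity analysis. It computes one point of a cost-effectiveness acceptability curve and the expected value of perfect information from simulated parameter draws:

```python
import random

random.seed(0)

# Hypothetical probabilistic sensitivity analysis: each draw yields a net
# monetary benefit (NMB) for two strategies under sampled parameter values.
def psa_draws(n=10_000):
    draws = []
    for _ in range(n):
        nmb_a = random.gauss(mu=100.0, sigma=40.0)  # strategy A (assumed)
        nmb_b = random.gauss(mu=110.0, sigma=60.0)  # strategy B (assumed)
        draws.append((nmb_a, nmb_b))
    return draws

draws = psa_draws()

# One CEAC point: the probability that B is cost-effective (highest NMB).
p_b_best = sum(b > a for a, b in draws) / len(draws)

# EVPI: expected value of picking the best strategy in each draw, minus the
# value of the strategy that is best on average.
ev_perfect = sum(max(a, b) for a, b in draws) / len(draws)
ev_current = max(sum(a for a, _ in draws), sum(b for _, b in draws)) / len(draws)
evpi = ev_perfect - ev_current
```

Because EVPI averages the per-draw best strategy before subtracting the value of the on-average best one, it is never negative, and it shrinks toward zero as parameter uncertainty shrinks.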

    Evaluation of touch trigger probe measurement uncertainty using FEA

    Evaluation of measurement uncertainty is an essential subject in dimensional measurement. It has also become a dominant issue for coordinate measuring machines (CMMs), even though their performance is well accepted by many users. CMM probes, especially the commonly used touch trigger probes, have been identified as a key error source, largely due to pre-travel variations. These probe errors produce large measurement uncertainty in CMM measurement. Various methods have been introduced to estimate measurement uncertainty, but they tend to be time consuming and require a large amount of experimental data for analyzing the uncertainty. This paper presents a method for evaluating CMM probe uncertainty using FEA modeling. It starts with an investigation of probe behavior, recording stylus displacement under varying triggering force. Those displacement results are then analyzed with a sensitivity analysis technique to estimate the uncertainty of the recorded results.
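The sensitivity-analysis step can be illustrated without an actual FEA model. In the sketch below a linear stylus-compliance model stands in for the finite-element displacement results; the compliance value, nominal force, and force spread are assumptions for illustration, not figures from the paper:

```python
import math
import random

random.seed(1)

# Toy stand-in for the FEA step: stylus tip displacement (um) as a linear
# function of triggering force (mN), using an assumed compliance.
def displacement_um(force_mn, compliance=0.8):
    return compliance * force_mn

# Sensitivity-analysis step: vary the triggering force around its nominal
# value and propagate the spread to the displacement results.
nominal_force = 10.0  # mN, assumed
force_sd = 0.5        # mN, assumed repeatability of the trigger point
forces = [random.gauss(nominal_force, force_sd) for _ in range(5000)]
disps = [displacement_um(f) for f in forces]

mean_d = sum(disps) / len(disps)
std_d = math.sqrt(sum((d - mean_d) ** 2 for d in disps) / (len(disps) - 1))

# For a linear model the propagated uncertainty is simply |dD/dF| * u(F),
# so the Monte Carlo estimate should match the sensitivity coefficient.
analytic = 0.8 * force_sd
```

The agreement between `std_d` and `analytic` is the point of the exercise: for a nonlinear FEA response the Monte Carlo spread and the first-order sensitivity estimate would differ, which is what the sensitivity analysis quantifies.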

    Uncertainty-Aware Principal Component Analysis

    We present a technique to perform dimensionality reduction on data that is subject to uncertainty. Our method is a generalization of traditional principal component analysis (PCA) to multivariate probability distributions. In comparison to non-linear methods, linear dimensionality reduction techniques have the advantage that the characteristics of such probability distributions remain intact after projection. We derive a representation of the PCA sample covariance matrix that respects potential uncertainty in each of the inputs, building the mathematical foundation of our new method: uncertainty-aware PCA. In addition to the accuracy and performance gained by our approach over sampling-based strategies, our formulation allows us to perform sensitivity analysis with regard to the uncertainty in the data. For this, we propose factor traces as a novel visualization that makes it possible to better understand the influence of uncertainty on the chosen principal components. We provide multiple examples of our technique using real-world datasets. As a special case, we show how to propagate multivariate normal distributions through PCA in closed form. Furthermore, we discuss extensions and limitations of our approach.
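The closed-form special case mentioned at the end admits a compact sketch: a linear projection W maps N(mu, Sigma) to N(W^T mu, W^T Sigma W). The example below, with invented data and per-point covariances, applies this after an ordinary PCA of the input means. It is a simplified illustration of the propagation step only, not the paper's full uncertainty-aware formulation, which also folds the input uncertainty into the sample covariance itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: centers of 200 uncertain 3-D inputs, plus an (assumed)
# isotropic uncertainty covariance attached to each input.
means = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.3])
covs = np.array([0.1 * np.eye(3) for _ in means])

# Ordinary PCA on the means: eigendecomposition of the sample covariance.
centered = means - means.mean(axis=0)
sample_cov = centered.T @ centered / (len(means) - 1)
eigvals, eigvecs = np.linalg.eigh(sample_cov)
W = eigvecs[:, ::-1][:, :2]  # top-2 principal directions (orthonormal)

# Closed-form propagation of each Gaussian input through the projection:
# N(mu, Sigma) maps to N(W^T mu, W^T Sigma W).
proj_means = means @ W
proj_covs = np.einsum('ij,njk,kl->nil', W.T, covs, W)
```

Because W has orthonormal columns, an isotropic input covariance stays isotropic after projection; anisotropic input covariances would be rotated and scaled by the projection, which is exactly the information a sampling-based approach can only approximate.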

    Methods to Determine Node Centrality and Clustering in Graphs with Uncertain Structure

    Much of the past work in network analysis has focused on analyzing discrete graphs, where binary edges represent the "presence" or "absence" of a relationship. Since traditional network measures (e.g., betweenness centrality) utilize a discrete link structure, complex systems must be transformed to this representation in order to investigate network properties. However, in many domains there may be uncertainty about the relationship structure, and any uncertainty information would be lost in translation to a discrete representation. Uncertainty may arise in domains where there is moderating link information that cannot be easily observed, i.e., links become inactive over time but may not be dropped, or observed links may not always correspond to a valid relationship. In order to represent and reason with these types of uncertainty, we move beyond the discrete graph framework and develop social network measures based on a probabilistic graph representation. More specifically, we develop measures of path length, betweenness centrality, and clustering coefficient: one set based on sampling and one based on probabilistic paths. We evaluate our methods on three real-world networks from Enron, Facebook, and DBLP, showing that our proposed methods more accurately capture salient effects without being susceptible to local noise, and that the resulting analysis produces a better understanding of the graph structure and the uncertainty resulting from its change over time. (Comment: Longer version of a paper appearing in the Fifth International AAAI Conference on Weblogs and Social Media; 9 pages, 4 figures.)
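The sampling-based variant of these measures can be sketched directly: draw concrete graphs by including each edge independently with its probability, compute the discrete measure on each draw, and average. The toy example below (edge probabilities invented) estimates expected shortest-path length this way; the paper's betweenness and clustering measures follow the same Monte Carlo pattern:

```python
import random
from collections import deque

random.seed(0)

# A small probabilistic graph: each undirected edge exists independently
# with the given probability (toy numbers, chosen for illustration).
prob_edges = {('a', 'b'): 0.9, ('b', 'c'): 0.8, ('a', 'c'): 0.3, ('c', 'd'): 0.95}

def sample_graph():
    """Draw one discrete graph instance from the probabilistic graph."""
    adj = {}
    for (u, v), p in prob_edges.items():
        if random.random() < p:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
    return adj

def bfs_dist(adj, src, dst):
    """Shortest-path length by BFS; None if dst is unreachable."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

# Sampling-based measure: expected a->d distance over sampled instances,
# averaged over the samples in which a path actually exists.
dists = [bfs_dist(sample_graph(), 'a', 'd') for _ in range(5000)]
reachable = [d for d in dists if d is not None]
expected_dist = sum(reachable) / len(reachable)
```

Note the design choice the paper's measures must also confront: disconnected samples are excluded here, but the fraction of samples in which the pair is reachable is itself useful uncertainty information and can be reported alongside the expectation.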

    Accounting for Calibration Uncertainties in X-ray Analysis: Effective Areas in Spectral Fitting

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here we present general statistical methods that incorporate calibration uncertainties into the spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties. (Comment: 61 pages double spaced, 8 figures, accepted for publication in Ap)
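The multiple-imputation idea can be sketched with a deliberately trivial fit: draw several plausible effective-area realizations, repeat the fit under each, and pool the results with Rubin's combining rules so that calibration scatter inflates the final variance. All numbers below are invented, and the one-line "fit" stands in for what would be a full spectral fit in a real analysis:

```python
import random
import statistics

random.seed(0)

# Toy setup: observed counts depend on source flux times an effective
# area whose calibration is uncertain (all values are illustrative).
true_area = 100.0        # nominal effective area, arbitrary units
observed_counts = 5000.0 # pretend measurement, exposure folded in

# Step 1: draw M plausible calibration (effective-area) realizations.
M = 50
areas = [random.gauss(true_area, 5.0) for _ in range(M)]

# Step 2: "fit" the flux under each imputed calibration; here the fit is
# trivial (flux = counts / area) with an assumed statistical variance.
fits = [observed_counts / a for a in areas]
stat_var = 0.5 ** 2      # assumed within-fit (statistical) variance

# Step 3: combine with Rubin's multiple-imputation rules. The between-
# imputation variance carries the calibration uncertainty into the total.
flux_hat = statistics.mean(fits)
between_var = statistics.variance(fits)
total_var = stat_var + (1 + 1 / M) * between_var
```

The pooled `total_var` is strictly larger than the purely statistical variance whenever the imputed calibrations disagree, which is precisely the error-bar underestimation the abstract warns about.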

    How Inflation Targeters (Can) Deal with Uncertainty

    The paper argues that a well-designed methodology for dealing with uncertainty improves the quality of interest-rate decisions taken by inflation targeters. A well-planned methodology is also more easily communicated to the general public, and the resulting greater transparency makes inflation targeting more efficient. It is therefore relevant for an inflation targeter to consult with or consider information from other inflation targeters, researchers, and relevant decision makers when designing or improving its methodology. The paper also summarizes the results of a recent survey on methods for dealing with uncertainty among inflation targeters. The results are presented in a framework designed in line with decision analysis. The paper summarizes which methods are commonly used by inflation targeters and what lessons can be learnt from economic research and from decision makers.
    Keywords: inflation targeting, uncertainty, decision analysis, robustness analysis