
    Uncertainty Propagation and Feature Selection for Loss Estimation in Performance-based Earthquake Engineering

    This report presents a new methodology, called moment matching, for propagating the uncertainties in estimating the repair costs of a building due to future earthquake excitation, as required, for example, when assessing a design in performance-based earthquake engineering. Besides excitation uncertainties, other uncertain model variables are considered, including uncertainties in the structural model parameters and in the capacity and repair costs of structural and non-structural components. Using the first few moments of these uncertain variables, moment matching requires only a few well-chosen point estimates to propagate the uncertainties and estimate the first few moments of the repair costs with high accuracy. The use of moment matching to estimate the exceedance probability of the repair costs is also addressed. Two buildings are chosen as illustrative examples to demonstrate the use of moment matching: a hypothetical three-story shear building and a real seven-story hotel building. These examples illustrate that the moment-matching approach is quite general; for example, it can be applied to any decision variable in performance-based earthquake engineering. For both examples, the assembly-based vulnerability approach is employed to calculate repair costs. It is shown that the moment-matching technique is much more accurate than the well-known First-Order Second-Moment (FOSM) approach when propagating the first two moments, while the computational cost is of the same order. The repair-cost moments and exceedance probabilities estimated by moment matching are also compared with those obtained by Monte Carlo simulation; as long as the order of the moment matching is sufficient, the agreement is satisfactory. Furthermore, the amount of computation for moment matching scales only linearly with the number of uncertain input variables. Finally, a procedure for feature selection is presented and illustrated for the second example. It shows that the most important uncertain input variables among the many influencing the uncertainty in future repair costs are, in order of importance, ground-motion spectral acceleration, component capacity, ground-motion details, and unit repair costs.
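    To make the idea concrete, the Python sketch below propagates the first two moments of a toy repair-cost function using a 2n+1 sigma-point rule (an unscented-transform-style point set chosen here only for illustration, not necessarily the report's exact moment-matching scheme) and compares the result against FOSM and Monte Carlo. The response function, means, and standard deviations are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy response standing in for a repair-cost model; the assembly-based
# vulnerability calculation in the report is far more detailed (hypothetical).
def repair_cost(x):
    sa, capacity, unit_cost = x            # illustrative uncertain inputs
    return unit_cost * np.exp(0.8 * sa) / (1.0 + capacity)

mu  = np.array([1.0, 2.0, 5.0])            # input means (made-up values)
sig = np.array([0.3, 0.4, 1.0])            # input standard deviations
n   = len(mu)

# --- FOSM: first-order Taylor expansion about the mean ---
eps  = 1e-5
grad = np.array([(repair_cost(mu + eps * e) - repair_cost(mu - eps * e)) / (2 * eps)
                 for e in np.eye(n)])
fosm_mean, fosm_std = repair_cost(mu), float(np.sqrt(np.sum((grad * sig) ** 2)))

# --- Moment matching via 2n+1 sigma points: model runs grow linearly with n ---
pts = [mu] + [mu + s * np.sqrt(n) * sig[i] * e
              for i, e in enumerate(np.eye(n)) for s in (+1.0, -1.0)]
wts = np.array([0.0] + [0.5 / n] * (2 * n))   # weights matching mean and variance
vals = np.array([repair_cost(p) for p in pts])
mm_mean = float(wts @ vals)
mm_std  = float(np.sqrt(wts @ (vals - mm_mean) ** 2))

# --- Monte Carlo reference (orders of magnitude more model evaluations) ---
samples = rng.normal(mu, sig, size=(200_000, n))
mc = repair_cost(samples.T)
print(f"FOSM         : mean={fosm_mean:.3f}  std={fosm_std:.3f}")
print(f"moment match : mean={mm_mean:.3f}  std={mm_std:.3f}")
print(f"Monte Carlo  : mean={mc.mean():.3f}  std={mc.std():.3f}")
```

    The sigma-point evaluation uses 2n + 1 model runs, which is why this style of propagation scales linearly with the number of uncertain inputs.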

    Revealing Relationships among Relevant Climate Variables with Information Theory

    A primary objective of the NASA Earth-Sun Exploration Technology Office is to understand the observed Earth climate variability, thus enabling the determination and prediction of the climate's response to both natural and human-induced forcing. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, a variety of information-theoretic quantities such as mutual information, which can be used to identify relationships among climate variables, and transfer entropy, which indicates the possibility of causal interactions. Our tools estimate these quantities along with their associated error bars, the latter of which is critical for describing the degree of uncertainty in the estimates. This work is based upon optimal binning techniques that we have developed for piecewise-constant, histogram-style models of the underlying density functions. Two useful side benefits have already been discovered. The first allows a researcher to determine whether there exist sufficient data to estimate the underlying probability density. The second permits one to determine an acceptable degree of round-off when compressing data for efficient transfer and storage. We also demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions.
    Comment: 14 pages, 5 figures, Proceedings of the Earth-Sun System Technology Conference (ESTC 2005), Adelphi, M
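    As a rough illustration of the kind of quantity these tools compute, the following Python sketch estimates mutual information from a piecewise-constant (histogram) density model and attaches a bootstrap error bar. The bin count is fixed here rather than chosen by the paper's optimal-binning procedure, the bootstrap is only one way to get an error bar, and the synthetic series are stand-ins for real climate variables.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram (piecewise-constant) estimate of I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# Two synthetic series with a known linear relationship.
rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = 0.7 * x + 0.7 * rng.normal(size=5000)

# Bootstrap resampling gives a rough error bar on the estimate; the paper's
# tools derive uncertainties from the binning model itself.
boot = []
for _ in range(200):
    idx = rng.integers(0, len(x), len(x))
    boot.append(mutual_information(x[idx], y[idx]))
mi, err = mutual_information(x, y), np.std(boot)
print(f"I(X;Y) ~ {mi:.3f} +/- {err:.3f} bits")
```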

    Inflation uncertainty revisited: A proposal for robust measurement

    Any measure of unobserved inflation uncertainty relies on specific assumptions which are most likely not fulfilled completely. This calls into question whether an individual measure delivers a reliable signal. To reduce idiosyncratic measurement error, we propose using the common information contained in different measures derived from survey data, a variety of forecast models, and volatility models. We show that all measures are driven by a common component, which constitutes an indicator for inflation uncertainty. Moreover, the idiosyncratic component of survey disagreement contains systematic measurement error during economic downturns. Finally, we study the Friedman-Ball hypothesis. Using the indicator, we find that higher inflation is followed by higher uncertainty, whereas the individual measures yield contradictory results. We also document that, after an inflationary shock, uncertainty decreases in the first two months, which is traceable to the energy component in CPI inflation.
    Keywords: inflation uncertainty, inflation, survey data, stochastic volatility, GARCH, principal component analysis
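    The common-component idea can be sketched in a few lines of Python: stack several uncertainty measures, standardize them, and take the first principal component as the indicator. The data below are synthetic and purely illustrative; in the paper the measures come from surveys, forecast models, and volatility models.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for K monthly inflation-uncertainty measures (e.g. survey
# disagreement, model forecast variances, GARCH/SV volatilities) over T months.
T, K = 240, 6
common = np.cumsum(rng.normal(0.0, 0.1, T))          # latent uncertainty factor
loadings = rng.uniform(0.5, 1.5, K)
measures = np.outer(common, loadings) + rng.normal(0.0, 0.3, (T, K))

# The indicator is the first principal component of the standardized measures.
Z = (measures - measures.mean(axis=0)) / measures.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
indicator = Z @ Vt[0]                                # common uncertainty indicator
print(f"variance share of the common component: {s[0]**2 / np.sum(s**2):.2f}")
```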

    Reduced perplexity: Uncertainty measures without entropy

    Conference paper presented at Recent Advances in Info-Metrics, Washington, DC, 2014. Under review for a book chapter in "Recent innovations in info-metrics: a cross-disciplinary perspective on information and information processing" by Oxford University Press.
    A simple, intuitive approach to the assessment of probabilistic inferences is introduced. The Shannon information metrics are translated to the probability domain. The translation shows that the negative logarithmic score and the geometric mean are equivalent measures of the accuracy of a probabilistic inference. Thus there is both a quantitative reduction in perplexity as good inference algorithms reduce the uncertainty and a qualitative reduction due to the increased clarity between the original set of inferences and their average, the geometric mean. Further insight is provided by showing that the Rényi and Tsallis entropy functions translated to the probability domain are both the weighted generalized mean of the distribution. The generalized mean of probabilistic inferences forms a risk profile of the performance. The arithmetic mean is used to measure the decisiveness, while the -2/3 mean is used to measure the robustness.
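    Below is a small Python sketch of this translation to the probability domain, using a hypothetical vector of probabilities a forecaster assigned to the outcomes that actually occurred: the geometric mean coincides with the exponentiated negative log loss, while the arithmetic (r = 1) and r = -2/3 generalized means give the decisiveness and robustness ends of the risk profile.

```python
import numpy as np

def generalized_mean(p, r):
    """Power mean of the probabilities assigned to the observed outcomes."""
    p = np.asarray(p, dtype=float)
    if r == 0.0:                            # the limit r -> 0 is the geometric mean
        return np.exp(np.mean(np.log(p)))
    return np.mean(p ** r) ** (1.0 / r)

# Hypothetical probabilities a forecaster assigned to the outcomes that occurred.
p = np.array([0.9, 0.8, 0.6, 0.3, 0.75])

geo = generalized_mean(p, 0.0)
nll = -np.mean(np.log(p))                   # average negative logarithmic score
print(f"geometric mean         : {geo:.3f}")
print(f"exp(-avg log loss)     : {np.exp(-nll):.3f}   (identical by construction)")
print(f"decisiveness (r = 1)   : {generalized_mean(p, 1.0):.3f}")
print(f"robustness  (r = -2/3) : {generalized_mean(p, -2.0 / 3.0):.3f}")
```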