
    Mitigating Cognitive Biases in Risk Identification: Practitioner Checklist for the Aerospace Sector

    This research contributes an operational checklist for mitigating cognitive biases in the aerospace-sector risk management process. The Risk Identification and Evaluation Bias Reduction Checklist includes steps for grounding risk identification and evaluation activities in past project experience through historical data, and emphasizes incorporating multiple methods and perspectives to guard against optimism and a view focused on a single project instantiation. The authors developed a survey to elicit subject matter expert (SME) judgment on the value of the checklist to support its use in government and industry as a risk management tool. The survey also provided insights on bias mitigation strategies and lessons learned. The checklist addresses a deficiency in the literature by providing operational bias-reduction steps for risk management practitioners in the aerospace sector.

    Reliability Assessment for COTS Components in Space Flight Applications

    Systems built for space flight applications usually demand a very high degree of performance and a very high level of accuracy. Hence, design engineers are often prone to selecting state-of-the-art technologies for inclusion in their system designs. Shrinking budgets also necessitate the use of COTS (Commercial Off-The-Shelf) components, which are construed as being less expensive. The performance and accuracy requirements for space flight applications are much more stringent than those for commercial applications, and the quantity of systems designed and developed for space applications is much smaller than that produced for commercial applications. Given a set of requirements, are these COTS components reliable? This paper presents a model for assessing the reliability of COTS components in space applications and the associated effect on system reliability. We illustrate the method with a real application.
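    As a minimal sketch of the kind of assessment the abstract describes (not the paper's actual model), the reliability of a series system of COTS parts can be computed under a constant-failure-rate (exponential) assumption. The `derating` factor is a hypothetical knob standing in for the adjustment from commercial-grade failure rates to the harsher space environment:

```python
import math

def component_reliability(failure_rate_per_hour, mission_hours):
    """Reliability of one component under a constant-failure-rate
    (exponential) model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate_per_hour * mission_hours)

def series_system_reliability(failure_rates, mission_hours, derating=1.0):
    """System reliability when every component is required (series
    structure): the product of the component reliabilities. The
    `derating` factor (> 1 penalizes) is a hypothetical adjustment of
    commercial-grade rates for the space environment."""
    r = 1.0
    for lam in failure_rates:
        r *= component_reliability(lam * derating, mission_hours)
    return r

# Three COTS parts with illustrative failure rates (failures/hour),
# a 1000-hour mission, and a 2x environmental derating penalty.
rates = [1e-6, 5e-7, 2e-6]
print(series_system_reliability(rates, 1000.0, derating=2.0))  # ≈ exp(-0.007)
```

    Under the series assumption, one weak COTS part dominates: the system reliability can never exceed that of its least reliable component.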

    A bayesian approach to inference for monotone failure rates

    In reliability theory, the notion of monotone failure rates plays a central role. When prior information indicates that such monotonicity is meaningful, it must be incorporated into the prior distribution whenever inference about the failure rates is to be made. In this paper we show how this can be done in a straightforward and intuitively pleasing manner. The time interval is partitioned into subintervals of equal width, and the number of failures and censorings in each interval is recorded. By defining a Dirichlet as the joint prior distribution for the forward or backward differences of the conditional probabilities of survival in each interval, we find that the monotonicity is preserved in the posterior estimate of the failure rates. A posterior estimate of the survival function can also be obtained. We illustrate our method by applying it to some real-life medical data.
    Keywords: Bayesian nonparametric estimation; increasing failure rate; decreasing failure rate; Dirichlet distribution
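    A toy sketch of the construction described above, under assumed notation: placing a Dirichlet distribution over nonnegative hazard increments makes the interval failure rates nondecreasing by construction, and the survival function follows as a running product of conditional survival probabilities. This illustrates only the monotonicity-preserving parametrization, not the paper's full posterior analysis:

```python
import random

def sample_dirichlet(alphas, rng):
    """Draw one Dirichlet sample by normalizing independent Gamma draws."""
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]

def monotone_hazards_and_survival(alphas, rng):
    """Sample hazard *increments* from a Dirichlet; cumulative sums of
    nonnegative increments give a nondecreasing (IFR) hazard sequence,
    and the survival function is the running product of the
    conditional survival probabilities 1 - q_i."""
    increments = sample_dirichlet(alphas, rng)
    hazards, survival, q, s = [], [], 0.0, 1.0
    for d in increments:
        q += d            # nondecreasing by construction
        s *= (1.0 - q)    # S_i = prod_{j<=i} (1 - q_j)
        hazards.append(q)
        survival.append(s)
    return hazards, survival

rng = random.Random(0)
h, s = monotone_hazards_and_survival([1.0] * 5, rng)
assert all(a <= b for a, b in zip(h, h[1:]))   # hazard is monotone
assert all(a >= b for a, b in zip(s, s[1:]))   # survival nonincreasing
```

    Because any Dirichlet draw has nonnegative coordinates, every sample from the prior (and hence from the posterior) yields a monotone hazard, which is the property the abstract highlights.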

    Bayes Estimate and Inference for Entropy and Information Index of Fit

    Kullback-Leibler information is widely used for developing indices of distributional fit. The most celebrated of such indices is Akaike’s AIC, which is derived as an estimate of the minimum Kullback-Leibler information between the unknown data-generating distribution and a parametric model. In the derivation of AIC, the entropy of the data-generating distribution is bypassed because it is free of the parameters. Consequently, AIC-type measures provide criteria for model comparison only and do not provide diagnostic information about the model fit. A nonparametric estimate of the entropy of the data-generating distribution is needed for assessing the model fit. Several entropy estimates are available and have been used for frequentist inference about information fit indices. A few entropy-based fit indices have been suggested for Bayesian inference. This paper develops a class of entropy estimates and provides a procedure for Bayesian inference on the entropy and a fit index. For the continuous case, we define a quantized entropy that approximates and converges to the entropy integral. The quantized entropy includes some well-known measures of sample entropy and the existing Bayes entropy estimates as its special cases. For inference about the fit, we use the candidate model as the expected distribution in the Dirichlet process prior and derive the posterior mean of the quantized entropy as the Bayes estimate. The maximum entropy characterization of the candidate model is then used to derive the prior and posterior distributions for the Kullback-Leibler information index of fit. The consistency of the proposed Bayes estimates for the entropy and for the information index is shown. As by-products, the procedure produces priors and posteriors for the model parameters and the moments.
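    The quantized-entropy idea can be sketched as follows: bin a sample into cells of equal width, form cell probabilities (optionally as posterior means under a symmetric Dirichlet prior, a simplified stand-in for the paper's Dirichlet-process construction), and shift the discrete entropy by the log of the bin width to approximate the entropy integral. The function name and parameters here are illustrative, not the paper's:

```python
import math
from collections import Counter

def quantized_entropy(sample, bin_width, prior_weight=0.0, prior_cells=None):
    """Plug-in (and optionally Dirichlet-smoothed) estimate of the
    quantized entropy. With prior_weight > 0, cell probabilities are
    posterior means under a symmetric Dirichlet prior spread over
    `prior_cells` cells -- a simplified sketch, not the paper's exact
    estimator."""
    counts = Counter(int(x // bin_width) for x in sample)
    n = len(sample)
    k = prior_cells if prior_cells else len(counts)
    a = prior_weight / k if prior_weight else 0.0
    total = n + prior_weight
    h = 0.0
    for c in counts.values():
        p = (c + a) / total                 # posterior-mean cell probability
        h -= p * math.log(p)
    if prior_weight:                        # account for empty prior cells
        empty = k - len(counts)
        if empty > 0 and a > 0:
            p = a / total
            h -= empty * p * math.log(p)
    return h + math.log(bin_width)          # shift toward the entropy integral

# A sample uniform over 10 unit-width cells has quantized entropy log(10).
print(quantized_entropy(list(range(10)) * 10, 1.0))  # ≈ log(10)
```

    The plug-in case (`prior_weight=0`) recovers the familiar histogram entropy estimate, which is one of the "well-known measures of sample entropy" the abstract says the quantized entropy subsumes.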