
    A framework for geometric quantification and forecasting of cost uncertainty for aerospace innovations

    Quantification and forecasting of cost uncertainty for aerospace innovations is challenged by small-data conditions, which arise from few measurement points, little prior experience, unknown history, low data quality, and deep uncertainty. A literature review suggests that no existing frameworks specifically address cost estimation under such conditions. To give contemporary cost-estimating techniques an innovative perspective on these challenges, a framework based on the principles of spatial geometry is described. The framework consists of a method for visualising cost uncertainty and a dependency model for quantifying and forecasting it. Cost uncertainty is defined as manifested and unintended future cost variance with a probability of 100% but an unknown quantity; innovative starting conditions are considered to exist when no verified and accurate cost model is available. The shape of the data serves as an organising principle, and the geometric symmetry of cost-variance point clouds is used to quantify cost uncertainty. The results of the investigation suggest that the uncertainty of a cost estimate at any future point in time may be determined by the geometric symmetry of the cost-variance data, in point-cloud form, at the time of estimation. Recommendations for future research include using the framework to determine the "most likely values" of estimates in Monte Carlo simulations and generalising the dependency model introduced. Future work is also recommended to reduce the noted limitations of the framework.
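The abstract does not give the paper's actual symmetry metric, but the idea of scoring a point cloud by its geometric symmetry can be sketched with a simple, hypothetical measure: reflect the cloud through its centroid and check how closely the reflection matches the original. The function name and scoring rule here are illustrative assumptions, not the framework's definition.

```python
import numpy as np

def symmetry_score(points):
    """Illustrative (not the paper's) symmetry measure for a 2-D point
    cloud: reflect every point through the centroid and report the mean
    nearest-neighbour distance to the original cloud (0 = perfectly
    point-symmetric; larger = more asymmetric)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    reflected = 2.0 * centroid - pts          # point reflection through centroid
    # distance from each reflected point to its nearest original point
    d = np.linalg.norm(reflected[:, None, :] - pts[None, :, :], axis=2)
    return d.min(axis=1).mean()

# a point-symmetric cloud (corners of a square) scores exactly 0
print(symmetry_score([(0, 0), (1, 0), (0, 1), (1, 1)]))  # 0.0
# an asymmetric cloud scores > 0
print(symmetry_score([(0, 0), (1, 0), (3, 2)]))
```

Under the framework's premise, a cost-variance cloud with a lower score (more symmetric) would correspond to lower estimate uncertainty.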

    Uncertainty Quantification for Hyperspectral Image Denoising Frameworks based on Low-rank Matrix Approximation

    Sliding-window low-rank matrix approximation (LRMA) is a technique widely used for hyperspectral image (HSI) denoising and completion. However, uncertainty quantification for the restored HSI has not been addressed to date. Accurate uncertainty quantification of the denoised HSI benefits applications such as multi-source or multi-scale data fusion, data assimilation, and product uncertainty quantification, since these applications require an accurate description of the statistical distributions of the input data. We therefore propose a prior-free, closed-form, element-wise uncertainty quantification method for LRMA-based HSI restoration. Our closed-form algorithm overcomes the HSI patch-mixing problem caused by the sliding-window strategy used in the conventional LRMA process. The proposed approach requires only the uncertainty of the observed HSI and delivers the uncertainty result relatively quickly, with computational complexity similar to that of the LRMA technique. We conduct extensive experiments to validate the estimation accuracy of the proposed closed-form uncertainty approach. The method is robust to at least 10% random impulse noise at the cost of 10-20% additional processing time compared to LRMA. The experiments indicate that the proposed closed-form uncertainty quantification method is more applicable to real-world applications than the baseline Monte Carlo test, which is computationally expensive. The code is available in the attachment and will be released after acceptance of this paper. Comment: Accepted for publication in IEEE Transactions on Geoscience and Remote Sensing (TGRS).
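The closed-form uncertainty propagation itself is the paper's contribution and is not reproduced here, but the LRMA building block it wraps — denoising by truncating the SVD of a (window) matrix to low rank — can be sketched in a few lines. The synthetic rank-3 example below is an assumption for illustration, not the paper's experimental setup.

```python
import numpy as np

def low_rank_approx(X, rank):
    """Truncated-SVD low-rank approximation, the basic step of
    sliding-window LRMA denoising (illustrative sketch only)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank] * s[:rank] @ Vt[:rank, :]  # keep leading components

rng = np.random.default_rng(0)
clean = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))  # rank-3 signal
noisy = clean + 0.1 * rng.standard_normal(clean.shape)               # add noise
denoised = low_rank_approx(noisy, rank=3)

# truncation discards the noise outside the rank-3 signal subspace
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))  # True
```

In a full sliding-window pipeline this step runs per patch, and the paper's method then propagates the observed HSI's uncertainty through exactly this decomposition in closed form.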

    Efficient Uncertainty Quantification and Variance-Based Sensitivity Analysis in Epidemic Modelling Using Polynomial Chaos

    Epidemic modelling of disease spread plays an important role in understanding dynamics and providing forecasts for informed analysis and decision-making. It is therefore crucial to quantify the effects of uncertainty in the modelling and in model-based predictions in order to communicate results and their limitations trustworthily. We propose efficient uncertainty quantification in compartmental epidemic models using the generalized Polynomial Chaos (gPC) framework. This framework uses a polynomial basis tailored to the underlying distribution of the parameter uncertainty and propagates that uncertainty forward through the mathematical model via efficient sampling to quantify its effect on the output. By evaluating the model at a small number of selected points, gPC provides illuminating statistics and sensitivity analysis at low computational cost. Through two case studies based on Danish data for the spread of Covid-19, we demonstrate the applicability of the technique. The test cases consider epidemic peak-time estimation and the dynamics between superspreading and partial lockdown measures. The computational results show the efficiency and feasibility of gPC-based uncertainty quantification and highlight the relevance of computational uncertainty quantification in epidemic modelling. Peer reviewed
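The "small number of selected points" idea can be illustrated with the quadrature side of the gPC machinery: for a Gaussian-distributed parameter, probabilists' Gauss-Hermite nodes give output statistics from just a handful of model runs. The SIR model, parameter values, and function names below are assumptions for illustration, not the paper's Danish case studies.

```python
import numpy as np

def sir_peak_infected(beta, gamma=0.1, i0=1e-3, days=300, dt=0.1):
    """Euler-integrated SIR model; returns the peak infected fraction.
    (Hypothetical toy model, not the paper's compartmental model.)"""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s += dt * ds
        i += dt * di
        peak = max(peak, i)
    return peak

# uncertain contact rate: beta ~ N(0.3, 0.05^2); 5 quadrature nodes suffice
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights /= weights.sum()                       # normalise to probability weights
samples = np.array([sir_peak_infected(0.3 + 0.05 * x) for x in nodes])
mean = weights @ samples                       # output mean from 5 model runs
var = weights @ (samples - mean) ** 2          # output variance
print(f"peak infected: mean={mean:.3f}, std={np.sqrt(var):.3f}")
```

Five model evaluations replace the thousands a plain Monte Carlo estimate would need; the same nodes also yield the gPC expansion coefficients, from which Sobol-type variance-based sensitivities follow.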

    SEEDS: Emulation of Weather Forecast Ensembles with Diffusion Models

    Uncertainty quantification is crucial to decision-making. A prominent example is probabilistic forecasting in numerical weather prediction. The dominant approach to representing uncertainty in weather forecasting is to generate an ensemble of forecasts. This is done by running many physics-based simulations under different conditions, which is a computationally costly process. We propose to amortize the computational cost by emulating these forecasts with deep generative diffusion models learned from historical data. The learned models are highly scalable with respect to high-performance computing accelerators and can sample hundreds to tens of thousands of realistic weather forecasts at low cost. When designed to emulate operational ensemble forecasts, the generated ensembles are similar to physics-based ensembles in important statistical properties and predictive skill. When designed to correct biases present in the operational forecasting system, the generated ensembles show improved probabilistic forecast metrics. They are more reliable and forecast probabilities of extreme weather events more accurately. While this work demonstrates the utility of the methodology by focusing on weather forecasting, the generative artificial intelligence methodology can be extended for uncertainty quantification in climate modeling, where we believe the generation of very large ensembles of climate projections will play an increasingly important role in climate risk assessment.
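The diffusion model itself is beyond a short sketch, but the "probabilistic forecast metrics" used to compare emulated and physics-based ensembles can be illustrated with the standard empirical CRPS (continuous ranked probability score). This is the usual scoring rule for ensemble forecasts, not code from the SEEDS paper.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one scalar forecast: mean |member - obs| minus
    half the mean pairwise spread of the ensemble. Lower is better.
    (Standard metric, illustrative; not the paper's implementation.)"""
    x = np.asarray(members, dtype=float)
    accuracy = np.abs(x - obs).mean()                     # distance to truth
    spread = 0.5 * np.abs(x[:, None] - x[None, :]).mean() # ensemble sharpness credit
    return accuracy - spread

# a well-centred ensemble scores lower (better) than a biased one
obs = 20.0
print(crps_ensemble([19, 20, 21], obs))   # small
print(crps_ensemble([24, 25, 26], obs))   # large (biased high)
```

Averaged over many forecast cases, a bias-corrected generated ensemble showing a lower mean CRPS than the operational ensemble is exactly the kind of improvement the abstract reports.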

    Multifidelity Monte Carlo estimation for large-scale uncertainty propagation

    One important task of uncertainty quantification is propagating input uncertainties through a system of interest to quantify their effects on the system outputs. However, numerical methods for uncertainty propagation are often based on Monte Carlo estimation, which can require a large number of simulations of the model describing the system response to obtain estimates of acceptable accuracy. Thus, if the model is computationally expensive to evaluate, Monte-Carlo-based uncertainty propagation can quickly become computationally intractable. We demonstrate that multifidelity methods can significantly speed up uncertainty propagation by leveraging low-cost low-fidelity models, and establish accuracy guarantees through occasional recourse to the expensive high-fidelity model. We focus on the multifidelity Monte Carlo method, a multifidelity approach that optimally distributes work among the models such that the mean-squared error of the multifidelity estimator is minimized for a given computational budget. The multifidelity Monte Carlo method is applicable to general types of low-fidelity models, including projection-based reduced models, data-fit surrogates, response surfaces, and simplified-physics models. We apply the method to a coupled aero-structural analysis of a wing and to a flutter problem with a high-aspect-ratio wing. The low-fidelity models are data-fit surrogate models derived with standard procedures available in common software environments such as Matlab and numpy/scipy. Our results demonstrate speedups of orders of magnitude compared to using the high-fidelity model alone. United States. Air Force. Office of Scientific Research. Multidisciplinary University Research Initiative (Award FA9550-15-1-0038)
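The core of multifidelity Monte Carlo is a control-variate combination: a few high-fidelity evaluations anchor the estimate, and many cheap low-fidelity evaluations reduce its variance. The toy model pair and the fixed 50/5000 budget split below are assumptions for illustration; the paper's method chooses the split optimally for a given budget.

```python
import numpy as np

rng = np.random.default_rng(1)

def f_high(z):               # stand-in for the expensive high-fidelity model
    return np.sin(z) + 0.1 * z**2

def f_low(z):                # cheap, correlated low-fidelity surrogate
    return np.sin(z)

m, n = 50, 5000              # illustrative budget split (not the optimal one)
z_hi = rng.normal(size=m)    # inputs evaluated by BOTH models
z_lo = rng.normal(size=n)    # extra inputs for the cheap model only

y_hi = f_high(z_hi)
y_lo_paired = f_low(z_hi)
# control-variate weight from the paired samples
alpha = np.cov(y_hi, y_lo_paired)[0, 1] / np.var(y_lo_paired, ddof=1)
# MFMC estimator: high-fidelity mean, corrected by the low-fidelity discrepancy
est = y_hi.mean() + alpha * (f_low(z_lo).mean() - y_lo_paired.mean())
print(f"MFMC estimate of E[f_high]: {est:.3f}")   # true value is 0.1
```

Because `f_low` absorbs most of the output variance, the 50 expensive runs mainly estimate the small residual, which is where the orders-of-magnitude speedups come from.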