
    How are emergent constraints quantifying uncertainty and what do they leave behind?

    The use of emergent constraints to quantify uncertainty for key policy-relevant quantities such as Equilibrium Climate Sensitivity (ECS) has become increasingly widespread in recent years. Many researchers, however, claim that emergent constraints are inappropriate, or even that they under-report uncertainty. In this paper we contribute to this discussion by examining the emergent-constraints methodology in terms of its underpinning statistical assumptions. We argue that the existing frameworks are based on indefensible assumptions, then show how weakening them leads to a more transparent Bayesian framework wherein hitherto ignored sources of uncertainty, such as how reality might differ from models, can be quantified. We present a guided framework for the quantification of additional uncertainties that is linked to the confidence we can have in the underpinning physical arguments for using linear constraints. We provide a software tool implementing our general framework for emergent constraints and use it to illustrate the framework on a number of recent emergent constraints for ECS. We find that the robustness of any constraint to additional uncertainties depends strongly on the confidence we can have in the underpinning physics, allowing the debate over the validity of a particular constraint to be framed around the underlying physical arguments rather than statistical assumptions.
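
    As an illustration of the kind of calculation the abstract describes, the sketch below propagates observational error and an explicit model-reality discrepancy term through a linear constraint fitted across models. The function name constrain_ecs, the ordinary-least-squares fit, and the Gaussian discrepancy term are assumptions for illustration only; they are not the paper's full framework or its software tool.

        import numpy as np

        def constrain_ecs(x_models, y_models, x_obs, obs_sd, discrepancy_sd,
                          n_draws=100_000, rng=None):
            # Monte Carlo sketch of a linear emergent constraint with an added
            # model-reality discrepancy term (illustrative only).
            rng = np.random.default_rng() if rng is None else rng
            # least-squares fit of ECS against the observable across climate models
            slope, intercept = np.polyfit(x_models, y_models, 1)
            resid_sd = np.std(y_models - (intercept + slope * x_models), ddof=2)
            # sample the observable, the regression noise, and the discrepancy
            x_draws = rng.normal(x_obs, obs_sd, n_draws)
            y_draws = (intercept + slope * x_draws
                       + rng.normal(0.0, resid_sd, n_draws)
                       + rng.normal(0.0, discrepancy_sd, n_draws))
            # 5th, 50th and 95th percentiles of the constrained ECS distribution
            return np.percentile(y_draws, [5, 50, 95])

    Setting discrepancy_sd to zero recovers a naive constraint, while increasing it widens the interval, mirroring the point that robustness depends on confidence in the underpinning physics.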

    Data Assimilation for a Geological Process Model Using the Ensemble Kalman Filter

    We consider the problem of conditioning a geological process-based computer simulation, which produces basin models by simulating transport and deposition of sediments, to data. Emphasising uncertainty quantification, we frame this as a Bayesian inverse problem, and propose to characterise the posterior probability distribution of the geological quantities of interest using a variant of the ensemble Kalman filter, an estimation method which linearly and sequentially conditions realisations of the system state to data. A test case involving synthetic data is used to assess the performance of the proposed estimation method and to compare it with similar approaches. We further apply the method to a more realistic test case involving real well data from the Colville foreland basin, North Slope, Alaska.
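
    For orientation, here is a minimal sketch of one stochastic (perturbed-observations) ensemble Kalman filter analysis step with a linear observation operator. The function enkf_analysis and its interface are illustrative assumptions and do not reproduce the paper's geological process model or its specific filter variant.

        import numpy as np

        def enkf_analysis(X, y, H, R, rng=None):
            # X: (n_state, n_ens) forecast ensemble
            # y: (n_obs,) observed data
            # H: (n_obs, n_state) linear observation operator
            # R: (n_obs, n_obs) observation-error covariance
            rng = np.random.default_rng() if rng is None else rng
            n_ens = X.shape[1]
            # ensemble anomalies and sample covariances in observation space
            A = X - X.mean(axis=1, keepdims=True)
            HA = H @ A
            P_yy = (HA @ HA.T) / (n_ens - 1) + R
            P_xy = (A @ HA.T) / (n_ens - 1)
            K = P_xy @ np.linalg.inv(P_yy)  # Kalman gain
            # perturb the observations so the analysis ensemble keeps the correct spread
            Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
            # linearly condition each realisation on the data
            return X + K @ (Y - H @ X)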

    Bayesian learning of models for estimating uncertainty in alert systems: application to air traffic conflict avoidance

    Alert systems detect critical events which can happen in the short term. Uncertainties in data and in the models used for detection cause alert errors. In the case of air traffic control systems such as Short-Term Conflict Alert (STCA), uncertainty increases errors in alerts of separation loss. Statistical methods that are based on analytical assumptions can provide biased estimates of uncertainties. More accurate analysis can be achieved by using Bayesian Model Averaging, which provides estimates of the posterior probability distribution of a prediction. We propose a new approach to estimating prediction uncertainty, based on the observation that the uncertainty can be quantified by the variance of predicted outcomes. In our approach, predictions for which the variance of the posterior probabilities exceeds a given threshold are flagged as uncertain. To verify our approach, we calculate the probability of an alert based on extrapolation of the closest point of approach. Using Heathrow airport flight data, we find that alerts are often generated under differing conditions, and variations in these conditions lead to alert detection errors. Achieving 82.1% accuracy in modelling the STCA system, which is a necessary condition for evaluating prediction uncertainty, we find that the proposed method is capable of reducing the uncertain component. Comparison with a bootstrap aggregation method demonstrates a significant reduction of uncertainty in predictions. Realistic estimates of uncertainties will open up new approaches to improving the performance of alert systems.
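
    The variance-thresholding step described above reduces to a few lines: given posterior draws of the alert probability for each case, flag as uncertain those cases whose across-draw variance exceeds a chosen threshold. The function flag_uncertain_alerts and the assumed array layout are hypothetical and stand in for the paper's Bayesian Model Averaging machinery.

        import numpy as np

        def flag_uncertain_alerts(posterior_probs, threshold):
            # posterior_probs: (n_draws, n_cases) alert probabilities, one row per
            # posterior draw (e.g. per averaged model), one column per aircraft pair
            mean_prob = posterior_probs.mean(axis=0)   # averaged alert probability
            var_prob = posterior_probs.var(axis=0)     # spread across posterior draws
            uncertain = var_prob > threshold           # cases assigned to the uncertain component
            return mean_prob, uncertain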

    Dropout Distillation for Efficiently Estimating Model Confidence

    We propose an efficient way to output better calibrated uncertainty scores from neural networks. The Distilled Dropout Network (DDN) makes standard (non-Bayesian) neural networks more introspective by adding a new training loss which prevents them from being overconfident. Our method is more efficient than Bayesian neural networks or model ensembles, which, despite providing more reliable uncertainty scores, are more cumbersome to train and slower to test. We evaluate DDN on the task of image classification on the CIFAR-10 dataset and show that our calibration results are competitive even when compared to 100 Monte Carlo samples from a dropout network, while also increasing classification accuracy. We also propose better calibration within the state-of-the-art Faster R-CNN object detection framework and show, using the COCO dataset, that DDN helps train better calibrated object detectors.
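
    A minimal sketch of the two ingredients the abstract implies, assuming a PyTorch classifier: Monte Carlo dropout passes that produce averaged soft targets, and a divergence loss for distilling them into a deterministic student. The helper names and the choice of a KL-divergence loss are assumptions for illustration, not the paper's exact DDN training objective.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        def enable_mc_dropout(model: nn.Module) -> None:
            # keep the network in eval mode but switch dropout layers back on
            model.eval()
            for m in model.modules():
                if isinstance(m, nn.Dropout):
                    m.train()

        @torch.no_grad()
        def mc_dropout_soft_targets(model: nn.Module, x: torch.Tensor,
                                    n_samples: int = 100) -> torch.Tensor:
            # average softmax outputs over stochastic forward passes
            enable_mc_dropout(model)
            probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
            return probs.mean(dim=0)

        def distillation_loss(student_logits: torch.Tensor,
                              soft_targets: torch.Tensor) -> torch.Tensor:
            # KL divergence between the deterministic student and the MC-dropout teacher
            return F.kl_div(F.log_softmax(student_logits, dim=-1),
                            soft_targets, reduction="batchmean")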

    A Bayesian framework for verification and recalibration of ensemble forecasts: How uncertain is NAO predictability?

    Predictability estimates of ensemble prediction systems are uncertain due to limited numbers of past forecasts and observations. To account for such uncertainty, this paper proposes a Bayesian inferential framework that provides a simple 6-parameter representation of ensemble forecasting systems and the corresponding observations. The framework is probabilistic, and thus allows for quantifying uncertainty in predictability measures such as correlation skill and signal-to-noise ratios. It also provides a natural way to produce recalibrated probabilistic predictions from uncalibrated ensemble forecasts. The framework is used to address important questions concerning the skill of winter hindcasts of the North Atlantic Oscillation for 1992-2011 issued by the Met Office GloSea5 climate prediction system. Although there is much uncertainty in the correlation between ensemble mean and observations, there is strong evidence of skill: the 95% credible interval of the correlation coefficient, [0.19, 0.68], does not overlap zero. There is also strong evidence that the forecasts are not exchangeable with the observations: with over 99% certainty, the signal-to-noise ratio of the forecasts is smaller than the signal-to-noise ratio of the observations, which suggests that raw forecasts should not be taken as representative scenarios of the observations. Forecast recalibration is thus required, which can be coherently addressed within the proposed framework.
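
    To make the signal-plus-noise structure behind such a framework concrete, the sketch below simulates hindcasts and observations that share a common predictable signal and reports the ensemble-mean correlation together with the two signal-to-noise ratios. The function simulate_signal_plus_noise and its default parameter values are illustrative and are not the paper's 6-parameter Bayesian model.

        import numpy as np

        def simulate_signal_plus_noise(n_years=20, n_members=24, beta=1.0,
                                       sigma_signal=1.0, sigma_obs=1.0,
                                       sigma_ens=3.0, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            s = rng.normal(0.0, sigma_signal, n_years)            # shared predictable signal
            obs = beta * s + rng.normal(0.0, sigma_obs, n_years)  # observations
            ens = s[:, None] + rng.normal(0.0, sigma_ens, (n_years, n_members))  # members
            corr = np.corrcoef(ens.mean(axis=1), obs)[0, 1]       # correlation skill estimate
            snr_forecast = sigma_signal / sigma_ens               # forecast signal-to-noise ratio
            snr_obs = beta * sigma_signal / sigma_obs             # observed signal-to-noise ratio
            return corr, snr_forecast, snr_obs

    With the default values chosen here, the forecast signal-to-noise ratio is smaller than the observed one, mirroring the abstract's finding that raw forecasts need recalibration before being read as representative scenarios.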