
    The probability of rapid climate change

    Get PDF
    If you look at a map of the air temperature at the surface of the Earth, you will see that north-west Europe, including the UK, is warmer than Alaska, which is at the same latitude but on the Pacific rather than the Atlantic Ocean. At school you were probably told that this was because of the Gulf Stream. However, there is a very similar current in the Pacific, the Kuroshio, which takes warm water north past Japan and then out into the Pacific. Peter Challenor asks: what is the unique feature of the Atlantic that keeps us warm, and could it change in the next few years?

    Relationship between remotely-sensed signatures of the ocean and subsurface structure

    Full text link
    This is the first progress report under a Joint Research Council/Ministry of Defence research project on the relationship between remotely-sensed ocean signatures and subsurface structure and dynamics. It comprises an inventory of datasets and progress reports on sub-projects utilising in situ, altimetric and infrared data.

    Towards the validation of traceable climate model hierarchies

    Get PDF
    This is the final version of the article, available from OUP via the DOI in this record. Background: It is common practice to use a simple model to explain the mechanisms or processes that occur in a much more complex, complete and computationally expensive model. Many such examples can be found in climate change research. Objective: This paper uses two illustrative examples to show how we can quantitatively relate the mechanisms or processes observed in a simple climate model to similar mechanisms in a more complex one. Method: A simple model can only explain a more complex solution's mechanisms if outcomes are tested over a broad range of inputs. By carefully sampling the full set of inputs for both the simple and complex models, we can statistically compare the process or mechanistic outcomes between them by examining similarities or differences in the relationship between the inputs and outputs. The method can reject an incorrect simple model. Results: The examples are, first, analytic and numerical solutions to the heat equation and, second, the 1948 Stommel model of horizontal ocean circulation and a more complex quasi-geostrophic ocean model. We quantitatively state how similar the simple model's mechanisms are to the mechanisms in the more complex representation. In addition, when a simple solution may be correct, we give the percentage of the variance of the complex model's outcomes that is explained by the simple response, along with an uncertainty estimate. Conclusion: We successfully tested a methodology for robustly quantifying how the physics encapsulated by a simple model of a process may exhibit itself in another, more complex formulation. Suggestions are given as a guide for use of the methodology with more complex and realistic models. Funding: NSF (0851065)
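A minimal sketch of the variance-explained idea from this abstract, using two invented toy models (a linear "simple" model and a "complex" model with extra nonlinear detail); the model forms, input range and sample size are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simple and complex models of one shared process: the complex
# model adds nonlinear detail that the simple model does not capture.
def simple_model(x):
    return 2.0 * x

def complex_model(x):
    return 2.0 * x + 0.1 * np.sin(20.0 * x)

# Sample the shared input space broadly and run both models.
x = rng.uniform(0.0, 1.0, 5000)
y_simple = simple_model(x)
y_complex = complex_model(x)

# Fraction of the complex model's output variance explained by a linear
# fit on the simple model's response (an R^2-style summary).
slope, intercept = np.polyfit(y_simple, y_complex, 1)
residual = y_complex - (slope * y_simple + intercept)
r2 = 1.0 - residual.var() / y_complex.var()
```

A high r2 would indicate the simple model captures most of the complex model's behaviour over the sampled inputs, while a low r2 would reject it as an explanation.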

    Emulating computer models with step-discontinuous outputs using Gaussian processes

    Get PDF
    In many real-world applications, we are interested in approximating functions that are analytically unknown. An emulator provides a "fast" approximation of such functions relying on a limited number of evaluations. Gaussian processes (GPs) are commonplace emulators due to properties such as their ability to quantify uncertainty. GPs are essentially developed to emulate smooth, continuous functions. However, the assumptions of continuity and smoothness are unwarranted in many situations; for example, in computer models whose systems of equations exhibit bifurcations or tipping points, the outputs can be discontinuous. This paper examines the capacity of GPs for emulating step-discontinuous functions using two approaches. The first approach is based on choosing covariance functions/kernels, namely the neural network and Gibbs kernels, that are most appropriate for modelling discontinuities. The predictive performance of these two kernels is illustrated using several examples. The results show that they have superior performance to standard covariance functions, such as the Matérn family, in capturing sharp jumps. The second approach is to transform the input space such that, in the new space, a GP with a standard kernel is able to predict the function well. A parametric transformation function is used whose parameters are estimated by maximum likelihood. Engineering and Physical Sciences Research Council (EPSRC)
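A minimal sketch of the baseline setting this abstract describes, in plain NumPy: GP regression on a step function with an ordinary squared-exponential (RBF) kernel rather than the neural network or Gibbs kernels studied in the paper. The step location, lengthscale and jitter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.2, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-6):
    # Standard GP posterior mean; a small jitter keeps K invertible.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Step-discontinuous "simulator": 0 below x = 0.5, 1 above.
x = np.linspace(0.0, 1.0, 20)
y = (x >= 0.5).astype(float)

# Predict away from the jump; a smooth kernel handles these points well,
# while near x = 0.5 it would smear the sharp transition.
mu = gp_posterior_mean(x, y, np.array([0.2, 0.8]))
```

The smoothing failure near the jump is precisely what the paper's discontinuity-aware kernels and input transformations are designed to avoid.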

    Statistical aspects of the design of fixed structures: report to SIPM

    Full text link

    Comparison of surrogate-based uncertainty quantification methods for computationally expensive simulators

    Get PDF
    This version: arXiv:1511.00926v4 [math.ST], available from ArXiv.org via the link in this record. Polynomial chaos and Gaussian process emulation are methods for surrogate-based uncertainty quantification, and have been developed independently in their respective communities over the last 25 years. Despite tackling similar problems in the field, to our knowledge there has yet to be a critical comparison of the two approaches in the literature. We begin by providing a detailed description of the polynomial chaos and Gaussian process approaches for building a surrogate model of a black-box function. The accuracy of each surrogate method is then tested and compared for two simulators used in industry: a land-surface model (adJULES) and a launch vehicle controller (VEGACONTROL). We analyse surrogates built on experimental designs of various sizes and types to investigate their performance in a range of modelling scenarios. Specifically, polynomial chaos and Gaussian process surrogates are built on Sobol sequence and tensor grid designs. Their accuracy is measured by their ability to estimate the mean, standard deviation, exceedance probabilities and probability density function of the simulator output, as well as by a root mean square error metric based on an independent validation design. We find that one method does not unanimously outperform the other, but advantages can be gained in some cases, such that the preferred method depends on the modelling goals of the practitioner. Our conclusions are likely to depend somewhat on the modelling choices for the surrogates as well as the design strategy. We hope that this work will spark future comparisons of the two methods in their more advanced formulations and for different sampling strategies.
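A minimal sketch of the polynomial chaos side of this comparison, for a deliberately trivial "simulator" f(x) = x² with a standard normal input, where the mean and variance fall out of the expansion coefficients. The toy function and quadrature order are illustrative assumptions; neither of the paper's simulators is used.

```python
import math
import numpy as np

def pce_coefficients(f, degree, n_quad=40):
    # Polynomial chaos with probabilists' Hermite polynomials He_k for a
    # standard normal input X: c_k = E[f(X) He_k(X)] / k!
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)
    w = w / np.sqrt(2.0 * np.pi)   # normalise weights to the N(0,1) density
    fx = f(x)
    coeffs = []
    for k in range(degree + 1):
        he_k = np.polynomial.hermite_e.hermeval(x, [0.0] * k + [1.0])
        coeffs.append(np.sum(w * fx * he_k) / math.factorial(k))
    return np.array(coeffs)

# Toy "simulator": f(x) = x^2 with x ~ N(0, 1); exact mean 1, variance 2.
c = pce_coefficients(lambda x: x ** 2, degree=4)
pce_mean = c[0]                    # E[f(X)] is the zeroth coefficient
pce_var = sum(math.factorial(k) * c[k] ** 2 for k in range(1, 5))
```

Once the coefficients are known, the output mean, variance and density estimates the paper compares come almost for free, which is one of the attractions of the polynomial chaos route.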

    Modeling Envisat RA-2 waveforms in the coastal zone: case-study of calm water contamination

    Get PDF
    Radar altimeters have so far had limited use in the coastal zone, the area with most societal impact. This is due both to a lack of, or insufficient accuracy in, the necessary corrections and to more complicated altimeter signals. This paper examines waveform data from the Envisat RA-2 as it passes regularly over Pianosa (a 10 km² island in the NW Mediterranean). Forty-six repeat passes were analysed, with most showing a reduction in signal upon passing over the island, with weak early returns corresponding to reflections from land. Intriguingly, one third of cases showed an anomalously bright hyperbolic feature. This feature may be due to extremely calm waters in the Golfo della Botte (on the northern side of the island), but the cause of its intermittency is not clear. The modelling of waveforms in such a complex land/sea environment demonstrates the potential for sea surface height retrievals much closer to the coast than is achieved by routine processing. The long-term development of altimetric records in the coastal zone will not only improve the calibration of altimetric data with coastal tide gauges, but also greatly enhance the study of storm surges and other coastal phenomena.

    On the reliability of the Autosub autonomous underwater vehicle

    Get PDF
    As autonomous underwater vehicles (AUVs) enter operational service, an assessment of their reliability is timely. Using the Autosub AUV as an example, several design issues affecting reliability are discussed, followed by an analysis of recorded faults. Perhaps contrary to expectations, failures rarely involved the autonomous nature of the vehicle. Rather, faults were typical of those that occur with any complex item of marine electromechanical equipment. A statistical analysis showed that the failure rate decreased with distance travelled, an indicator that an AUV underway, submerged, is at less risk of a fault developing than during other phases of a mission.

    Correcting a bias in a climate model with an augmented emulator

    Get PDF
    This is the final version, available from Copernicus Publications via the DOI in this record. A key challenge in developing flagship climate model configurations is the process of setting uncertain input parameters at values that lead to credible climate simulations. Setting these parameters traditionally relies heavily on insights from those involved in parameterisation of the underlying climate processes. Given the many degrees of freedom and the computational expense involved in evaluating such a selection, this can be imperfect, leaving open questions about whether any subsequent simulated biases result from mis-set parameters or from wider structural model errors (such as missing or partially parameterised processes). Here, we present a complementary approach to identifying plausible climate model parameters: a method of bias correcting subcomponents of a climate model using a Gaussian process emulator that allows credible values of model input parameters to be found even in the presence of a significant model bias. A previous study (McNeall et al., 2016) found that a climate model had to be run using land surface input parameter values from very different, almost non-overlapping, parts of parameter space to satisfactorily simulate the Amazon and other forests respectively. As the forest fraction of modelled non-Amazon forests was broadly correct at the default parameter settings while the Amazon's was too low, that study suggested that the problem most likely lay in the model's treatment of non-plant processes in the Amazon region. This might be due to modelling errors such as missing deep rooting in the Amazon in the land surface component of the climate model, to a warm-dry bias in the Amazon climate of the model, or to a combination of both. In this study, we bias correct the climate of the Amazon in the climate model from McNeall et al. (2016) using an "augmented" Gaussian process emulator, where temperature and precipitation, variables usually regarded as model outputs, are treated as model inputs alongside land surface input parameters. A sensitivity analysis finds that the forest fraction is nearly as sensitive to climate variables as it is to changes in its land surface parameter values. Bias correcting the climate in the Amazon region using the emulator corrects the forest fraction to tolerable levels in the Amazon at many candidate land surface input parameter values, including the default ones, and increases the valid input space shared with the other forests. We need not invoke a structural model error in the land surface model beyond having too dry and hot a climate in the Amazon region. The augmented emulator allows bias correction of an ensemble of climate model runs and reduces the risk of choosing poor parameter values because of an error in a subcomponent of the model. We discuss the potential of the augmented emulator to act as a translational layer between model subcomponents, simplifying the process of model tuning when there are compensating errors and helping model developers discover and prioritise model errors to target. Alan Turing Institute
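A minimal sketch of the "augmented inputs" idea from this abstract: climate variables are stacked alongside a land-surface parameter as surrogate inputs, and coefficient magnitudes on standardised inputs give a crude sensitivity comparison. The toy output function, coefficients and a linear surrogate standing in for the Gaussian process emulator are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy version of the augmented design: forest fraction as a
# function of one land-surface parameter p plus two "climate" inputs, a
# temperature anomaly t and a precipitation anomaly r (all invented).
n = 2000
p = rng.uniform(0.0, 1.0, n)
t = rng.normal(0.0, 1.0, n)
r = rng.normal(0.0, 1.0, n)
forest = 0.5 + 0.2 * p - 0.15 * t + 0.18 * r + 0.01 * rng.normal(size=n)

# Fit a linear surrogate on standardised augmented inputs; absolute
# coefficient sizes give a crude sensitivity measure for each input.
X = np.column_stack([p, t, r])
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
design = np.column_stack([np.ones(n), X_std])
beta, *_ = np.linalg.lstsq(design, forest, rcond=None)
sensitivity = np.abs(beta[1:])   # order: [p, t, r]
```

In this toy setup the climate inputs carry at least as much weight as the parameter, mirroring the abstract's finding that forest fraction is nearly as sensitive to climate variables as to land surface parameters.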

    Understanding uncertainty in a SWAN wave model using a Bayesian emulator

    Get PDF
    This is the author accepted manuscript. The final version is available from EWTEC via the link in this record. Numerical simulation is used widely in the marine renewable energy sector. Wave and flow models are used to understand and predict the conditions experienced at offshore energy sites. Like all numerical simulations, wave models have uncertainties in their output, caused both by uncertainty about the various input data (which may themselves be model outputs) and by uncertainty about how well the model simulates the real world. Understanding these uncertainties is important in order to have confidence in the model's accuracy. Classical Monte Carlo uncertainty analysis requires a large number of model runs, which is impractical for large, complex models whose computational run time is more than a few seconds. By substituting a much more computationally efficient mathematical model, known as an emulator, for the complex simulation, processing time can be decreased to a level where uncertainty analysis can be undertaken. A simple 'toy' wave model has been produced using SWAN. By applying a Bayesian methodology to output from a small number of correctly designed model runs, a mathematical emulator is constructed to provide a statistical approximation of output from the model. Importantly, this emulator provides not just an approximation of the output but a full probability distribution describing how close the emulator output is to the model. As this emulator provides results in a fraction of a second (compared to several seconds for the toy simulator and considerably longer for actual wave models), it can be run the many thousands of times required for a Monte Carlo analysis. This paper describes the methodology used to construct an emulator of a simulation and presents methods and results from using the emulator to undertake uncertainty quantification. The methods described here can be scaled up and employed on large wave models, flow models or any deterministic numerical simulator. European Regional Development Fund (ERDF)
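A minimal sketch of the train-cheap-then-Monte-Carlo workflow this abstract describes. SWAN is not used; a hypothetical one-input wave-height function stands in for the slow simulator, and a cubic polynomial fit stands in for the mean of a Bayesian Gaussian process emulator (a real emulator would also carry predictive uncertainty).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a slow wave simulator: significant wave height
# as a smooth function of a single uncertain wind-speed input (invented).
def simulator(wind_speed):
    return 0.05 * wind_speed ** 1.5

# Train a cheap surrogate on a small designed set of "expensive" runs.
design = np.linspace(6.0, 14.0, 8)
surrogate = np.polynomial.Polynomial.fit(design, simulator(design), deg=3)

# Monte Carlo uncertainty analysis through the cheap surrogate: many
# thousands of evaluations that would be infeasible with the simulator.
wind = rng.normal(10.0, 1.0, 100_000)   # uncertain input distribution
height = surrogate(wind)
mean_h, sd_h = height.mean(), height.std()
```

The same pattern scales to multi-input wave and flow models: only the handful of design runs touch the expensive simulator, while the Monte Carlo loop runs entirely on the surrogate.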