9 research outputs found
Heteroskedastic Gaussian processes for simulation experiments
An increasing number of time-consuming simulators exhibit a complex noise structure that depends on the inputs. To conduct studies with limited budgets of evaluations, new surrogate methods are required to model the mean and variance fields simultaneously. To this end, we present recent advances in Gaussian process modeling with input-dependent noise. First, we describe a simple yet efficient joint modeling framework that relies on replication for both speed and accuracy. Then we tackle the issue of leveraging replication and exploration in a sequential manner for various goals, such as obtaining a globally accurate model, optimization, contour finding, and active subspace estimation. We illustrate these methods on applications from epidemiology and inventory management.
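The replication idea in this abstract can be illustrated with a minimal sketch (not the authors' framework; the toy simulator, constants, and use of scikit-learn are all illustrative assumptions): estimate the noise variance at each design point from replicates, then hand the GP a per-point noise level so it downweights noisy regions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stochastic simulator with input-dependent noise sd = 0.1 + 0.4*x.
rng = np.random.default_rng(0)
def simulator(x):
    return np.sin(2 * np.pi * x) + rng.normal(scale=0.1 + 0.4 * x)

# Replicate each design point to estimate the local noise variance.
X = np.linspace(0, 1, 10)
reps = 20
Y = np.array([[simulator(x) for _ in range(reps)] for x in X])
y_mean = Y.mean(axis=1)
y_var = Y.var(axis=1, ddof=1) / reps  # variance of the batch mean

# Passing an array to alpha gives the GP a per-point noise level,
# so replication buys both speed (10 fitted points, 200 runs) and accuracy.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=y_var)
gp.fit(X.reshape(-1, 1), y_mean)
mu, sd = gp.predict(np.array([[0.25]]), return_std=True)
```

Fitting to the 10 batch means rather than all 200 raw runs is what keeps the GP cheap; the replicate variances preserve the heteroskedastic information.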
Sequential Design for Gaussian Process Surrogates in Noisy Level Set Estimation
We consider the problem of learning the level set for which a noisy black-box function exceeds a given threshold. To efficiently reconstruct the level set, we investigate Gaussian process (GP) metamodels and sequential design frameworks. Our focus is on strongly stochastic samplers, in particular with heavy-tailed simulation noise and low signal-to-noise ratio. We introduce four GP-based metamodels for level set estimation that are robust to noise misspecification and evaluate their performance. In conjunction with these metamodels, we develop several acquisition functions for guiding the sequential experimental designs, extending existing stepwise uncertainty reduction criteria to the stochastic contour-finding context. This also motivates our development of (approximate) updating formulas to efficiently compute such acquisition functions for the proposed metamodels. To expedite sequential design in stochastic experiments, we also develop adaptive batching designs, which are natural extensions of sequential design heuristics with the benefit of replication growing as response features are learned, inputs concentrate, and the metamodeling overhead rises. We develop four novel schemes that simultaneously or sequentially determine the sequential design inputs and the respective number of replicates. Our schemes are benchmarked using synthetic examples and an application in quantitative finance (Bermudan option pricing).
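A common contour-finding acquisition of the kind this abstract extends can be sketched as follows (a minimal illustration, not the paper's criteria; the toy function, threshold, and grid are assumptions): score each candidate by its probability of sign misclassification under the GP posterior and sample where that probability is highest.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Toy noisy black-box; the target level set is {x : f(x) > T}.
rng = np.random.default_rng(1)
f = lambda x: np.sin(3 * x)
T = 0.0

X = rng.uniform(0, 2, size=(8, 1))
y = f(X).ravel() + rng.normal(scale=0.2, size=8)
gp = GaussianProcessRegressor(alpha=0.04).fit(X, y)  # alpha = noise variance

# Acquisition: posterior probability of misclassifying the sign of f(x) - T.
# It peaks near the estimated contour, where mu is close to T relative to sd.
grid = np.linspace(0, 2, 200).reshape(-1, 1)
mu, sd = gp.predict(grid, return_std=True)
p_misclass = norm.cdf(-np.abs(mu - T) / np.maximum(sd, 1e-9))
x_next = grid[np.argmax(p_misclass)]
```

Stepwise uncertainty reduction criteria go one step further, scoring a candidate by how much it is expected to shrink this misclassification uncertainty after the update, which is why cheap posterior-updating formulas matter.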
Future proofing a building design using history matching inspired level-set techniques
How can one design a building that will be sufficiently protected against overheating and sufficiently energy efficient, whilst considering the expected increases in temperature due to climate change? We successfully address this question, greatly reducing a large set of initial candidate building designs down to a small set of acceptable buildings. We do this using a complex computer model, statistical models of said computer model (emulators), and a modification to the history matching calibration technique. This modification tackles the problem of level-set estimation (rather than calibration), where the goal is to find input settings which lead to the simulated output being below some threshold. The entire procedure allows us to present a practitioner with a set of acceptable building designs, with the final design chosen based on other requirements (subjective or otherwise). Funded by the Engineering and Physical Sciences Research Council (EPSRC).
Adaptive Batching for Gaussian Process Surrogates with Application in Noisy Level Set Estimation
We develop adaptive replicated designs for Gaussian process metamodels of stochastic experiments. Adaptive batching is a natural extension of sequential design heuristics, with the benefit of replication growing as response features are learned, inputs concentrate, and the metamodeling overhead rises. Motivated by the problem of learning the level set of the mean simulator response, we develop five novel schemes: Multi-Level Batching (MLB), Ratchet Batching (RB), Adaptive Batched Stepwise Uncertainty Reduction (ABSUR), Adaptive Design with Stepwise Allocation (ADSA), and Deterministic Design with Stepwise Allocation (DDSA). Our algorithms simultaneously (MLB, RB, and ABSUR) or sequentially (ADSA and DDSA) determine the sequential design inputs and the respective number of replicates. Illustrations using synthetic examples and an application in quantitative finance (Bermudan option pricing via Regression Monte Carlo) show that adaptive batching brings significant computational speed-ups with minimal loss of modeling fidelity.
Evaluating Gaussian Process Metamodels and Sequential Designs for Noisy Level Set Estimation
We consider the problem of learning the level set for which a noisy black-box function exceeds a given threshold. To efficiently reconstruct the level set, we investigate Gaussian process (GP) metamodels. Our focus is on strongly stochastic samplers, in particular with heavy-tailed simulation noise and low signal-to-noise ratio. To guard against noise misspecification, we assess the performance of three variants: (i) GPs with Student-t observations; (ii) Student-t processes (TPs); and (iii) classification GPs modeling the sign of the response. In conjunction with these metamodels, we analyze several acquisition functions for guiding the sequential experimental designs, extending existing stepwise uncertainty reduction criteria to the stochastic contour-finding context. This also motivates our development of (approximate) updating formulas to efficiently compute such acquisition functions. Our schemes are benchmarked using a variety of synthetic experiments in 1--6 dimensions. We also consider an application of level set estimation for determining the optimal exercise policy of Bermudan options in finance.