Cluster-based Kriging approximation algorithms for complexity reduction
Kriging, or Gaussian process regression, is applied in many fields as a non-linear regression model and as a surrogate model in evolutionary computation. However, the computational and space complexity of Kriging, which are cubic and quadratic in the number of data points respectively, become a major bottleneck as data sets grow. In this paper, we propose a general methodology for complexity reduction, called cluster Kriging, in which the whole data set is partitioned into smaller clusters and multiple Kriging models are built on top of them. In addition, four Kriging approximation algorithms are proposed as candidate algorithms within the new framework. Each of these algorithms can be applied to much larger data sets while retaining the advantages and power of Kriging. The proposed algorithms are explained in detail and compared empirically against a broad set of existing state-of-the-art Kriging approximation methods on a well-defined testing framework. According to the empirical study, the proposed algorithms consistently outperform the existing algorithms. Moreover, practical suggestions are provided for using the proposed algorithms.
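The partition-then-model idea can be sketched with standard tools. This is an illustrative reconstruction of the general scheme, not the authors' implementation: the cluster count, kernel, and routing-by-nearest-centroid rule are all assumptions made here for the sketch.

```python
# Sketch of cluster Kriging: partition the data with k-means, fit an
# independent Gaussian-process (Kriging) model per cluster, and route
# each query to the model of its nearest cluster. Each GP then factorizes
# only its own cluster's covariance matrix instead of the full n x n one.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))          # toy 1-D data (assumption)
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(300)

k = 4  # number of clusters: a tuning choice, not prescribed by the paper
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
models = [
    GaussianProcessRegressor(kernel=RBF(), alpha=1e-2)
    .fit(X[km.labels_ == j], y[km.labels_ == j])
    for j in range(k)
]

def predict(X_new):
    # Answer each query with the GP of its nearest cluster centroid.
    labels = km.predict(X_new)
    out = np.empty(len(X_new))
    for j in range(k):
        mask = labels == j
        if mask.any():
            out[mask] = models[j].predict(X_new[mask])
    return out

X_test = np.linspace(-2.5, 2.5, 50).reshape(-1, 1)
err = np.max(np.abs(predict(X_test) - np.sin(X_test[:, 0])))
```

Since each of the k models trains on roughly n/k points, the cubic cost drops from O(n^3) to O(k (n/k)^3); more elaborate variants combine the cluster models instead of hard-routing queries.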
Quantifying uncertainties on excursion sets under a Gaussian random field prior
We focus on the problem of estimating and quantifying uncertainties on the
excursion set of a function under a limited evaluation budget. We adopt a
Bayesian approach where the objective function is assumed to be a realization
of a Gaussian random field. In this setting, the posterior distribution on the
objective function gives rise to a posterior distribution on excursion sets.
Several approaches exist to summarize the distribution of such sets based on
random closed set theory. While the recently proposed Vorob'ev approach
exploits analytical formulae, further notions of variability require Monte
Carlo estimators relying on Gaussian random field conditional simulations. In
the present work we propose a method to choose Monte Carlo simulation points
and obtain quasi-realizations of the conditional field at fine designs through
affine predictors. The points are chosen optimally in the sense that they
minimize the posterior expected distance in measure between the excursion set
and its reconstruction. The proposed method reduces the computational costs due
to Monte Carlo simulations and enables the computation of quasi-realizations on
fine designs in large dimensions. We apply this reconstruction approach to
obtain realizations of an excursion set on a fine grid which allow us to give a
new measure of uncertainty based on the distance transform of the excursion
set. Finally we present a safety engineering test case where the simulation
method is employed to compute a Monte Carlo estimate of a contour line.
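The pointwise ingredient behind a posterior distribution on excursion sets can be sketched on a toy problem. This is an illustrative simplification, not the paper's method: the test function, threshold, and the plug-in level 1/2 (the Vorob'ev median set, rather than the full Vorob'ev expectation or the distance-transform measure proposed above) are assumptions made here.

```python
# Toy sketch: condition a GP on a few evaluations of an objective f,
# compute the pointwise posterior probability that f exceeds a threshold t,
# and take the set of points with excursion probability >= 1/2 as a
# plug-in estimate of the excursion set {x : f(x) >= t}.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x)   # hypothetical objective (assumption)
t = 0.5                       # excursion threshold (assumption)

rng = np.random.default_rng(1)
X = rng.uniform(0, 2, size=(15, 1))   # limited evaluation budget
y = f(X[:, 0])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6).fit(X, y)

grid = np.linspace(0, 2, 400).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
# Pointwise posterior excursion probability P(f(x) >= t | data)
p_exc = norm.sf((t - mean) / np.maximum(std, 1e-12))
# Fraction of grid points where the plug-in set disagrees with the truth
mismatch = np.mean((p_exc >= 0.5) != (f(grid[:, 0]) >= t))
```

The Monte Carlo machinery in the paper goes beyond this pointwise view: it draws conditional simulations of the whole field (via affine predictors at well-chosen simulation points) to quantify variability of the set itself, not just marginal probabilities.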
Nonparametric Methods in Astronomy: Think, Regress, Observe -- Pick Any Three
Telescopes are much more expensive than astronomers, so it is essential to
minimize required sample sizes by using the most data-efficient statistical
methods possible. However, the most commonly used model-independent techniques
for finding the relationship between two variables in astronomy are flawed. In
the worst case they can lead without warning to subtly yet catastrophically
wrong results, and even in the best case they require more data than necessary.
Unfortunately, there is no single best technique for nonparametric regression.
Instead, we provide a guide for how astronomers can choose the best method for
their specific problem and provide a python library with both wrappers for the
most useful existing algorithms and implementations of two new algorithms
developed here.
Comment: 19 pages, PAS
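As a concrete example of the kind of model-independent technique under discussion, a classic kernel smoother can be written in a few lines. This particular estimator and its bandwidth are illustrative choices made here, not taken from the paper's library.

```python
# Nadaraya-Watson kernel regression: a standard nonparametric estimator
# of E[y | x], formed as a locally weighted average of the observations.
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Gaussian-kernel smoother; bandwidth controls the bias/variance
    trade-off (too small: noisy, too large: over-smoothed)."""
    d = x_query[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 200))                 # toy data (assumption)
y = np.sqrt(x) + 0.02 * rng.standard_normal(200)
xq = np.linspace(0.1, 0.9, 9)
yq = nadaraya_watson(x, y, xq, bandwidth=0.05)
```

Even this simple smoother illustrates the abstract's point: its quality hinges on a tuning choice (the bandwidth) and on the design density, which is why no single nonparametric method is best for every problem.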
Optimisation of Mobile Communication Networks - OMCO NET
The mini conference “Optimisation of Mobile Communication Networks” focuses on advanced methods for search and optimisation applied to wireless communication networks. It is sponsored by the Research & Enterprise Fund of Southampton Solent University.
The conference strives to widen knowledge of advanced search methods capable of optimising wireless communication networks. The aim is to provide a forum for the exchange of recent knowledge, new ideas and trends in this progressive and challenging area. The conference will popularise new, successful approaches to resolving hard tasks such as minimisation of transmit power, and cooperative and optimal routing.
Understanding and Comparing Scalable Gaussian Process Regression for Big Data
As a non-parametric Bayesian model that produces informative predictive
distributions, the Gaussian process (GP) has been widely used in various
fields, such as regression, classification and optimization. The cubic
complexity of the standard GP, however, leads to poor scalability, which poses
challenges in the era of big data. Hence, various scalable GPs have been
developed in the literature to improve scalability while retaining desirable
prediction accuracy. This paper investigates the methodological
characteristics and performance of representative global and local scalable
GPs, including sparse approximations and local aggregations, from four main
perspectives: scalability, capability, controllability and robustness. The
numerical experiments on two toy examples and five real-world datasets with up
to 250K points offer the following findings. In terms of scalability, most of
the scalable GPs have a time complexity that is linear in the training size.
In terms of capability, the sparse approximations capture long-term spatial
correlations, while the local aggregations capture local patterns but suffer
from over-fitting in some scenarios. In terms of controllability, the
performance of sparse approximations can be improved by simply increasing the
inducing size, but this is not the case for local aggregations. In terms of
robustness, local aggregations are robust to various initializations of
hyperparameters due to the local attention mechanism. Finally, we highlight
that a proper hybrid of global and local scalable GPs may be a promising way
to improve both model capability and scalability for big data.
Comment: 25 pages, 15 figures, preprint submitted to KB
Surrogate Models and Mixtures of Experts in Aerodynamic Performance Prediction for Mission Analysis
Peer Reviewed
https://deepblue.lib.umich.edu/bitstream/2027.42/140436/1/6.2014-2301.pd