A QPTAS for the Base of the Number of Triangulations of a Planar Point Set
The number of triangulations of a planar $n$ point set is known to be
$c^n$, where the base $c$ lies between $2.43$ and $30$. The fastest known
algorithm for counting triangulations of a planar $n$ point set runs in
$O^*(2^n)$ time. The fastest known arbitrarily close approximation algorithm
for the base of the number of triangulations of a planar $n$ point set runs
in time subexponential in $n$. We present the first quasi-polynomial
approximation scheme for the base of the number of triangulations of a
planar point set.
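The paper's QPTAS is not reproduced here, but the object being counted is easy to illustrate in a special case: for points in convex position the number of triangulations is the Catalan number $C_{n-2}$, computable by a textbook dynamic program. A toy sketch (illustrative only, unrelated to the paper's algorithm):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def convex_triangulations(n: int) -> int:
    """Number of triangulations of a convex n-gon (n >= 2).

    Standard DP: fix the edge (0, n-1); the triangle containing it
    uses some vertex k, splitting the polygon into two smaller ones.
    The result equals the Catalan number C_{n-2}.
    """
    if n <= 3:
        return 1
    return sum(convex_triangulations(k + 1) * convex_triangulations(n - k)
               for k in range(1, n - 1))
```

For general (non-convex) position no such simple formula exists, which is what makes counting, let alone approximating the base, non-trivial.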
Estimating the Expected Value of Partial Perfect Information in Health Economic Evaluations using Integrated Nested Laplace Approximation
The Expected Value of Perfect Partial Information (EVPPI) is a
decision-theoretic measure of the "cost" of parametric uncertainty in decision
making used principally in health economic decision making. Despite this
decision-theoretic grounding, the uptake of EVPPI calculations in practice has
been slow. This is in part due to the prohibitive computational time required
to estimate the EVPPI via Monte Carlo simulations. However, recent developments
have demonstrated that the EVPPI can be estimated by non-parametric regression
methods, which have significantly decreased the computation time required to
approximate the EVPPI. Under certain circumstances, high-dimensional Gaussian
Process regression is suggested, but this can still be prohibitively expensive.
Applying fast computation methods developed in spatial statistics using
Integrated Nested Laplace Approximations (INLA) and projecting from a
high-dimensional into a low-dimensional input space allows us to decrease the
computation time for fitting these high-dimensional Gaussian Processes, often
substantially. We demonstrate that the EVPPI calculated using our method for
Gaussian Process regression is in line with the standard Gaussian Process
regression method and that despite the apparent methodological complexity of
this new method, R functions are available in the package BCEA to implement it
simply and efficiently.
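The regression idea behind fast EVPPI estimation can be sketched in a few lines. This is a toy, Strong-and-Oakley-style single-parameter version, not the paper's INLA/GP method: the two-decision model, its numbers, and the quadratic fit standing in for the non-parametric regression are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical probabilistic sensitivity analysis: one parameter of
# interest theta, two decisions whose net benefits depend on theta.
n = 5000
theta = rng.normal(0.0, 1.0, n)
nb = np.stack([
    100 + 60 * theta + rng.normal(0, 40, n),   # decision 0
    120 + 20 * theta + rng.normal(0, 40, n),   # decision 1
])  # shape: (decisions, simulations)

# Regress each decision's net benefit on theta; the quadratic fit
# is a stand-in for the non-parametric regression in the paper.
fitted = np.stack([
    np.polyval(np.polyfit(theta, nb_d, deg=2), theta) for nb_d in nb
])

# EVPPI estimate: E_theta[max_d E[NB | theta]] - max_d E[NB].
evppi = fitted.max(axis=0).mean() - fitted.mean(axis=1).max()
print(round(evppi, 2))
```

The Monte Carlo alternative requires a nested simulation per theta sample, which is the prohibitive cost the regression approach avoids.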
Faster SDP hierarchy solvers for local rounding algorithms
Convex relaxations based on different hierarchies of linear/semi-definite
programs have been used recently to devise approximation algorithms for various
optimization problems. The approximation guarantee of these algorithms improves
with the number of {\em rounds} $r$ in the hierarchy, though the complexity of
solving (or even writing down the solution for) the $r$'th level program grows
as $n^{\Omega(r)}$, where $n$ is the input size.
In this work, we observe that many of these algorithms are based on {\em
local} rounding procedures that only use a small part of the SDP solution (of
size $n^{O(1)}$ instead of $n^{\Omega(r)}$). We give an algorithm to
find the requisite portion in time polynomial in its size. The challenge in
achieving this is that the required portion of the solution is not fixed a
priori but depends on other parts of the solution, sometimes in a complicated
iterative manner.
Our solver leads to $2^{O(r)} n^{O(1)}$ time algorithms to obtain the same
guarantees in many cases as the earlier $n^{O(r)}$ time algorithms based on
$r$ rounds of the Lasserre hierarchy. In particular, guarantees based on $O(\log n)$ rounds can be realized in polynomial time.
We develop and describe our algorithm in a fairly general abstract framework.
The main technical tool in our work, which might be of independent interest in
convex optimization, is an efficient ellipsoid algorithm based separation
oracle for convex programs that can output a {\em certificate of infeasibility
with restricted support}. This is used in a recursive manner to find a sequence
of consistent points in nested convex bodies that "fools" local rounding
algorithms. Comment: 30 pages, 8 figures
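The $n^{O(r)}$ blow-up the abstract refers to is easy to make concrete. A back-of-the-envelope count, under the common convention that the level-$r$ relaxation of an $n$-variable 0/1 program has one moment variable per subset of at most $2r$ variables (an illustrative accounting, not this paper's solver):

```python
from math import comb

def lasserre_vars(n: int, r: int) -> int:
    """Moment variables in a level-r Lasserre relaxation of an
    n-variable 0/1 program, counting one variable per subset of
    at most 2r variables: sum_{i <= 2r} C(n, i) = n^{O(r)}.
    """
    return sum(comb(n, i) for i in range(2 * r + 1))

for r in (1, 2, 3):
    print(r, lasserre_vars(100, r))
```

Even at $n = 100$ the count grows by orders of magnitude per round, which is why solving only the locally needed portion of the solution matters.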
Efficient computation of partition of unity interpolants through a block-based searching technique
In this paper we propose a new efficient interpolation tool, extremely
suitable for large scattered data sets. The partition of unity method is used
and performed by blending Radial Basis Functions (RBFs) as local approximants
and using locally supported weight functions. In particular we present a new
space-partitioning data structure based on a partition of the underlying
generic domain in blocks. This approach allows us to examine only a reduced
number of blocks in the search process of the nearest neighbour points, leading
to an optimized searching routine. Complexity analysis and numerical
experiments in two- and three-dimensional interpolation support our findings.
Some applications to geometric modelling are also considered. Moreover, the
associated software package written in \textsc{Matlab} is here discussed and
made available to the scientific community.
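The block-based search idea can be sketched independently of the RBF machinery. A minimal 2D version (our own simplification, not the paper's \textsc{Matlab} package): hash points into square blocks whose side equals the search radius, so that all neighbours of a query point lie in its own block or one of the 8 adjacent blocks.

```python
from collections import defaultdict

def build_blocks(points, radius):
    """Hash each 2D point into a square block of side `radius`."""
    blocks = defaultdict(list)
    for i, (x, y) in enumerate(points):
        blocks[(int(x // radius), int(y // radius))].append(i)
    return blocks

def neighbours(points, blocks, radius, q):
    """Indices of points within `radius` of q, scanning only the
    query's block and its 8 adjacent blocks instead of all points."""
    qx, qy = q
    bx, by = int(qx // radius), int(qy // radius)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for i in blocks.get((bx + dx, by + dy), []):
                px, py = points[i]
                if (px - qx) ** 2 + (py - qy) ** 2 <= radius ** 2:
                    out.append(i)
    return out
```

For quasi-uniform data each block holds $O(1)$ points, so a query costs $O(1)$ after the linear-time build, which is the source of the optimized searching routine's speed-up.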
Ordered Preference Elicitation Strategies for Supporting Multi-Objective Decision Making
In multi-objective decision planning and learning, much attention is paid to
producing optimal solution sets that contain an optimal policy for every
possible user preference profile. We argue that the step that follows, i.e.,
determining which policy to execute by maximising the user's intrinsic utility
function over this (possibly infinite) set, is under-studied. This paper aims
to fill this gap. We build on previous work on Gaussian processes and pairwise
comparisons for preference modelling, extend it to the multi-objective decision
support scenario, and propose new ordered preference elicitation strategies
based on ranking and clustering. Our main contribution is an in-depth
evaluation of these strategies using computer and human-based experiments. We
show that our proposed elicitation strategies outperform the currently used
pairwise methods, and find that users prefer ranking most. Our experiments
further show that utilising monotonicity information in GPs by using a linear
prior mean at the start and virtual comparisons to the nadir and ideal points
increases performance. We demonstrate our decision support framework in a
real-world study on traffic regulation, conducted with the city of Amsterdam. Comment: AAMAS 2018, Source code at
https://github.com/lmzintgraf/gp_pref_elici
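One reason ranking queries beat pairwise ones is purely combinatorial: a single full ranking over $k$ items implies $k(k-1)/2$ pairwise preferences for the underlying pairwise-comparison model. A minimal sketch (the function is ours, not from the paper's codebase):

```python
from itertools import combinations

def ranking_to_pairs(ranking):
    """Expand a full ranking (best first) into the pairwise
    preferences it implies: each earlier item is preferred over
    each later one, giving k*(k-1)//2 (winner, loser) pairs that
    can all be fed to a pairwise preference model such as a GP.
    """
    return [(winner, loser) for winner, loser in combinations(ranking, 2)]
```

So one ranking query over 4 items yields 6 training comparisons, versus 1 from a single pairwise query, at a modest increase in user effort.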