Polygon Exploration with Time-Discrete Vision
With the advent of autonomous robots with two- and three-dimensional scanning
capabilities, classical visibility-based exploration methods from computational
geometry have gained in practical importance. However, real-life laser scanning
of useful accuracy does not allow the robot to scan continuously while in
motion; instead, it has to stop each time it surveys its environment. This
requirement was studied by Fekete, Klein and Nuechter for the subproblem of
looking around a corner, but until now has not been considered in an online
setting for whole polygonal regions.
We give the first algorithmic results for this important problem, which
combines stationary art-gallery-type aspects with watchman-type issues in
an online scenario: we demonstrate that even for orthoconvex polygons, a
competitive strategy can be achieved only for limited aspect ratio A (the ratio
of the maximum and minimum edge length of the polygon), i.e., for a given lower
bound on the size of an edge; we give a matching upper bound by providing an
O(log A)-competitive strategy for simple rectilinear polygons, under the
assumption that each edge of the polygon has to be fully visible from some scan
point.
Comment: 28 pages, 17 figures, 2 photographs, 3 tables, LaTeX. Updated some
details (title, figures and text) for final journal revision, including
explicit assumption of full edge visibility.
Lower Bounds for Shoreline Searching With 2 or More Robots
Searching for a line on the plane with n unit-speed robots is a classic
online problem that dates back to the 50's, and for which competitive ratio
upper bounds are known for every n ≥ 1. In this work we improve the best
lower bound known for n = 2 robots from 1.5993 to 3. Moreover, we prove that
the competitive ratio is at least √3 for n = 3 robots, and at least
1/cos(π/n) for n ≥ 4 robots. Our lower bounds match the best upper
bounds known for n ≥ 4, hence resolving these cases. To the best of our
knowledge, these are the first lower bounds proven for the cases n ≥ 3 of
this several-decades-old problem.
Comment: This is an updated version of the paper with the same title which
will appear in the proceedings of the 23rd International Conference on
Principles of Distributed Systems (OPODIS 2019), Neuchâtel, Switzerland,
December 17-19, 2019.
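The n ≥ 4 bound can be evaluated numerically. A minimal sketch, assuming the closed form 1/cos(π/n) (the standard competitive-ratio expression in the shoreline-search literature; the abstract's inline formulas did not survive extraction):

```python
import math

def lower_bound(n):
    # Assumed closed form for the competitive-ratio lower bound with
    # n >= 4 unit-speed robots searching for a shoreline.
    return 1.0 / math.cos(math.pi / n)

for n in (4, 5, 6, 10):
    print(f"n = {n}: competitive ratio >= {lower_bound(n):.4f}")
```

For n = 4 this gives √2 ≈ 1.4142, and the bound decreases toward 1 as n grows, reflecting that more robots make the search easier.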
Network-constrained models of liberalized electricity markets: the devil is in the details
Numerical models for electricity markets are frequently used to inform and support decisions. How robust are the results? Three research groups used the same realistic data set for generators, demand and transmission network as input for their numerical models. The results coincide when predicting competitive market outcomes. In the strategic case, in which large generators can exercise market power, the predicted prices differ significantly. The results are highly sensitive to assumptions about market design, the timing of the market and constraints on the rationality of generators; given the same assumptions, the results coincide. We provide a checklist to help users understand the implications of different modelling assumptions.
Batch Bayesian Optimization via Local Penalization
The popularity of Bayesian optimization methods for efficient exploration of
parameter spaces has led to a series of papers applying Gaussian processes as
surrogates in the optimization of functions. However, most proposed approaches
only allow the exploration of the parameter space to occur sequentially. Often,
it is desirable to propose batches of parameter values to explore
simultaneously. This is particularly the case when large parallel processing
facilities are available; these facilities could be computational or physical
facets of the process being optimized. For example, in biological experiments
many experimental setups allow several samples to be processed simultaneously.
Batch methods, however, require modeling of the interaction between the
evaluations in the batch, which can be expensive in complex scenarios. We
investigate a simple heuristic based on an estimate of the Lipschitz constant
that captures the most important aspect of this interaction (i.e., local
repulsion) at negligible computational overhead. The resulting algorithm
compares well, in running time, with much more elaborate alternatives. The
approach assumes that the function of interest, f, is Lipschitz continuous.
A wrap-loop around the acquisition function is used to collect batches of
points of a certain size, minimizing the non-parallelizable computational
effort. The speed-up of our method with respect to previous approaches is
significant in a set of computationally expensive experiments.
Comment: 11 pages, 10 figures.
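The wrap-loop with local penalization can be sketched as follows. This is a simplified, hedged illustration, not the authors' implementation: it uses a crude data-based Lipschitz estimate and a hard "exclusion ball" penalizer in place of the paper's GP-derived estimate and smooth probabilistic penalizer; all function names are chosen for the sketch.

```python
import numpy as np

def estimate_lipschitz(X, y):
    # Crude Lipschitz-constant estimate: the largest finite-difference
    # slope over all pairs of observed points (a stand-in for the
    # surrogate-based estimate used in the paper).
    L = 1e-8
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d = np.linalg.norm(X[i] - X[j])
            if d > 0:
                L = max(L, abs(y[i] - y[j]) / d)
    return L

def penalize(acq_vals, candidates, x_pending, L, best_y, mu_pending):
    # Hard variant of local penalization: a Lipschitz function starting
    # at mu_pending cannot reach best_y within radius r of x_pending,
    # so the acquisition is suppressed inside that ball.
    r = abs(mu_pending - best_y) / L
    d = np.linalg.norm(candidates - x_pending, axis=1)
    return np.where(d <= r, -np.inf, acq_vals)

def select_batch(acq, surrogate_mean, candidates, X, y, k):
    # Wrap-loop around the acquisition: greedily pick k batch points,
    # locally penalizing the acquisition after each pick so that the
    # next point is repelled from those already chosen.
    L = estimate_lipschitz(X, y)
    vals = np.asarray(acq(candidates), dtype=float)
    batch = []
    for _ in range(k):
        x_new = candidates[int(np.argmax(vals))]
        batch.append(x_new)
        vals = penalize(vals, candidates, x_new, L,
                        float(np.min(y)), surrogate_mean(x_new))
    return np.array(batch)
```

Because each pick suppresses the acquisition in its exclusion ball, the k batch points spread out (local repulsion) instead of clustering at the single acquisition maximum, and only the cheap penalization step is non-parallelizable.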