522 research outputs found
Range Queries on Uncertain Data
Given a set P of uncertain points on the real line, each represented by its one-dimensional probability density function, we consider the problem of building data structures on P to answer range queries of the following three types for any query interval I: (1) top-1 query: find the point in P that lies in I with the highest probability, (2) top-k query: given any integer k as part of the query, return the k points in P that lie in I with the highest probabilities, and (3) threshold query: given any threshold τ as part of the query, return all points of P that lie in I with probabilities at least τ. We present data structures for these range
queries with linear or nearly linear space and efficient query time.Comment: 26 pages. A preliminary version of this paper appeared in ISAAC 2014.
In this full version, we also present solutions to the most general case of
the problem (i.e., the histogram bounded case), which were left as open
problems in the preliminary version.
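A minimal sketch of the three query types, assuming each uncertain point's probability density is given as a histogram (the general case treated in the full version); the paper's data structures answer these queries with (near-)linear space and fast query time, whereas this baseline simply scans all points.

```python
# Naive baseline for the three query types, assuming each uncertain point's
# PDF is a histogram: a list of (left, right, density) pieces.
# This sketch only illustrates the query semantics, not the paper's structures.

def prob_in_interval(hist, a, b):
    """Probability that a histogram-distributed point falls in [a, b]."""
    total = 0.0
    for left, right, density in hist:
        lo, hi = max(left, a), min(right, b)
        if lo < hi:
            total += density * (hi - lo)
    return total

def top_k(points, a, b, k):
    """Return the k points with the highest probability of lying in [a, b]."""
    scored = [(prob_in_interval(h, a, b), i) for i, h in enumerate(points)]
    scored.sort(reverse=True)
    return scored[:k]

def threshold(points, a, b, tau):
    """Return all points whose probability of lying in [a, b] is at least tau."""
    return [i for i, h in enumerate(points)
            if prob_in_interval(h, a, b) >= tau]

# Example: two uncertain points with piecewise-constant densities.
points = [
    [(0.0, 1.0, 1.0)],                     # uniform on [0, 1]
    [(0.0, 2.0, 0.25), (2.0, 3.0, 0.5)],   # mixed histogram on [0, 3]
]
print(top_k(points, 0.5, 2.5, 1))          # top-1 query on interval [0.5, 2.5]
print(threshold(points, 0.5, 2.5, 0.5))    # threshold query with tau = 0.5
```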
Orthogonal Range Reporting and Rectangle Stabbing for Fat Rectangles
In this paper we study two geometric data structure problems in the special
case when input objects or queries are fat rectangles. We show that in this
case a significant improvement compared to the general case can be achieved.
We describe data structures that answer two- and three-dimensional orthogonal
range reporting queries in the case when the query range is a \emph{fat}
rectangle. Our two-dimensional data structure uses words and supports queries in time, where n is the number of points in the data structure, U is the size of the universe and k is the number of points
in the query range. Our three-dimensional data structure needs
words of space and answers queries in time. We also consider the rectangle stabbing problem on a set of
three-dimensional fat rectangles. Our data structure uses space and
answers stabbing queries in time.Comment: extended version of a WADS'19 paper
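For illustration, a brute-force baseline for orthogonal range reporting together with a fatness check under one common definition (aspect ratio bounded by a constant; an assumption here, since the abstract does not define fatness); the paper's structures replace the linear scan with far faster query procedures.

```python
# Brute-force baseline for 2D orthogonal range reporting, plus a helper that
# checks "fatness" under an assumed bounded-aspect-ratio definition.

def is_fat(x1, y1, x2, y2, alpha=2.0):
    """A rectangle is alpha-fat if its aspect ratio is at most alpha (assumed definition)."""
    w, h = x2 - x1, y2 - y1
    return max(w, h) <= alpha * min(w, h)

def range_report(points, x1, y1, x2, y2):
    """Report all points inside the axis-aligned query rectangle."""
    return [(x, y) for (x, y) in points if x1 <= x <= x2 and y1 <= y <= y2]

points = [(1, 1), (2, 5), (4, 3), (7, 2)]
query = (0, 0, 5, 4)
if is_fat(*query):
    print(range_report(points, *query))   # [(1, 1), (4, 3)]
```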
Virus Propagation in Multiple Profile Networks
Suppose we have a virus or one competing idea/product that propagates over a
multiple profile (e.g., social) network. Can we predict what proportion of the
network will actually get "infected" (e.g., spread the idea or buy the
competing product), when the nodes of the network appear to have different
sensitivity based on their profile? For example, if there are two profiles A and B in a network and the nodes of profile A and profile B are susceptible to a highly spreading virus with probabilities p_A and p_B respectively, what percentage of both profiles will actually get infected from
the virus at the end? To reverse the question, what are the necessary
conditions so that a predefined percentage of the network is infected? We
assume that nodes of different profiles can infect one another and we prove
that under realistic conditions, not only the weak profile (high sensitivity) but also the strong profile (low sensitivity) will get infected.
First, we focus on cliques with the goal to provide exact theoretical results
as well as to get some intuition as to how a virus affects such a multiple
profile network. Then, we move to the theoretical analysis of arbitrary
networks. We provide bounds on certain properties of the network based on the
probabilities of infection of each node in it when it reaches the steady state.
Finally, we provide extensive experimental results that verify our theoretical
results and at the same time provide more insight into the problem.
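A toy Monte Carlo sketch of a flu-like (SIS-style) spread on a two-profile clique; the profile names, infection and recovery probabilities, and the update rule are illustrative only and not the paper's exact propagation model.

```python
import random

# Toy simulation of an SIS-style spread on a two-profile network.
# Each node has a profile-specific infection probability; all parameters
# below are illustrative assumptions.

def simulate(adj, profile, beta, delta=0.2, steps=200, seed_node=0):
    """adj: adjacency lists; profile[v] in {'A','B'}; beta: infection prob per profile."""
    infected = {seed_node}
    for _ in range(steps):
        new_state = set(infected)
        for v in range(len(adj)):
            if v in infected:
                if random.random() < delta:          # recovery
                    new_state.discard(v)
            else:
                for u in adj[v]:
                    if u in infected and random.random() < beta[profile[v]]:
                        new_state.add(v)             # infection by an infected neighbour
                        break
        infected = new_state
    return infected

# Small clique of 6 nodes, half of each profile.
n = 6
adj = [[u for u in range(n) if u != v] for v in range(n)]
profile = ['A', 'A', 'A', 'B', 'B', 'B']
beta = {'A': 0.4, 'B': 0.1}                          # profile A is more susceptible
final = simulate(adj, profile, beta)
for p in ('A', 'B'):
    print(p, sum(1 for v in final if profile[v] == p) / 3)   # infected fraction per profile
```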
Dynamic tree shortcut with constant degree
LNCS v.9188: Computing and Combinatorics: 21st International Conference, COCOON 2015, Beijing, China, August 4-6, 2015, Proceedings
Given a rooted tree with n nodes, the tree shortcut problem is to add a set of shortcut edges to the tree such that the shortest path from each node to any of its ancestors is of length O(log n) and the degree increment of each node is constant. We consider in this paper the dynamic version of the problem, which supports node insertion and deletion. For insertion, a node can be inserted as a leaf node or an internal node by sub-dividing an existing edge. For deletion, a leaf node can be deleted, or an internal node can be merged with its single child. We propose an algorithm that maintains a set of shortcut edges in O(log n) time for an insertion or deletion. Postprint
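As a point of contrast, a sketch of the classic binary-lifting shortcut construction: it achieves O(log n) hops from any node to any ancestor but adds O(log n) shortcut edges per node, which is exactly the degree blow-up the paper's constant-degree (and dynamic) construction avoids.

```python
# Binary-lifting shortcuts: add an edge from each node to its 2^i-th ancestor.
# Gives O(log n) hops to any ancestor, but O(log n) extra edges per node.

def depth(parent, v):
    d = 0
    while parent[v] != -1:
        v = parent[v]
        d += 1
    return d

def build_shortcuts(parent):
    """parent[v] is v's parent, or -1 for the root. Returns the set of shortcut edges."""
    n = len(parent)
    up = [[] for _ in range(n)]            # up[v][i] = 2^i-th ancestor of v
    shortcuts = set()
    for v in sorted(range(n), key=lambda v: depth(parent, v)):
        if parent[v] == -1:
            continue
        up[v].append(parent[v])
        i = 0
        while len(up[up[v][i]]) > i:       # the 2^(i+1)-th ancestor of v exists
            up[v].append(up[up[v][i]][i])
            shortcuts.add((v, up[v][i + 1]))
            i += 1
    return shortcuts

# A path 0 -> 1 -> 2 -> ... -> 7 rooted at node 0.
parent = [-1, 0, 1, 2, 3, 4, 5, 6]
print(sorted(build_shortcuts(parent)))
```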
Query processing of spatial objects: Complexity versus Redundancy
The management of complex spatial objects in applications, such as geography and cartography,
imposes stringent new requirements on spatial database systems, in particular on efficient
query processing. As shown before, the performance of spatial query processing can be improved
by decomposing complex spatial objects into simple components. Up to now, only decomposition
techniques generating a linear number of very simple components, e.g. triangles or trapezoids, have
been considered. In this paper, we will investigate the natural trade-off between the complexity of
the components and the redundancy, i.e. the number of components, with respect to its effect on
efficient query processing. In particular, we present two new decomposition methods generating
a better balance between the complexity and the number of components than previously known
techniques. We compare these new decomposition methods to the traditional undecomposed representation
as well as to the well-known decomposition into convex polygons with respect to their
performance in spatial query processing. This comparison points out that for a wide range of query
selectivity the new decomposition techniques clearly outperform both the undecomposed representation
and the convex decomposition method. More important than the absolute gain in performance
by a factor of up to an order of magnitude is the robust performance of our new decomposition
techniques over the whole range of query selectivity.
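An illustrative filter-and-refine point query against an object stored as decomposed components (here convex rectangles with bounding boxes); the component shapes and the query type are chosen for simplicity and only demonstrate the complexity-versus-redundancy trade-off discussed above.

```python
# Filter-and-refine point query against a complex spatial object stored as a set
# of simpler components (here: convex polygons), each with a bounding box.
# More components mean more but cheaper refinement steps.

def in_bbox(pt, bbox):
    (x, y), (x1, y1, x2, y2) = pt, bbox
    return x1 <= x <= x2 and y1 <= y <= y2

def in_convex(pt, poly):
    """Point-in-convex-polygon test: pt must lie on one side of every edge."""
    x, y = pt
    sign = 0
    for (ax, ay), (bx, by) in zip(poly, poly[1:] + poly[:1]):
        cross = (bx - ax) * (y - ay) - (by - ay) * (x - ax)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False
    return True

def point_query(pt, components):
    """components: list of (bbox, convex_polygon). Filter by bbox, refine on the polygon."""
    return any(in_bbox(pt, bbox) and in_convex(pt, poly)
               for bbox, poly in components)

# An L-shaped object decomposed into two convex (rectangular) components.
components = [
    ((0, 0, 2, 1), [(0, 0), (2, 0), (2, 1), (0, 1)]),
    ((0, 1, 1, 3), [(0, 1), (1, 1), (1, 3), (0, 3)]),
]
print(point_query((1.5, 0.5), components))  # True
print(point_query((1.5, 2.0), components))  # False
```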
An efficient indexing scheme for multi-dimensional moving objects
We consider the problem of indexing a set of objects moving in d-dimensional space along linear trajectories. A simple disk-based indexing scheme is proposed to efficiently answer queries of the form: report all objects that will pass between two given points within a specified time interval. Our scheme is based on mapping the objects to a dual space, where queries about moving objects translate into polyhedral queries concerning their speeds and initial locations. We then present a simple method for answering such polyhedral queries, based on partitioning the space into disjoint regions and using a B-tree to index the points in each region. By appropriately selecting the boundaries of each region, we can guarantee an average search time that almost matches a known lower bound for the problem. Specifically, for a fixed d, if the coordinates of a given set of N points are statistically independent, the proposed technique answers polyhedral queries, on the average, in O((N/B)^(1-1/d) (log_B N)^(1/d) + K/B) I/Os using O(N/B) space, where B is the block size, and K is the number of reported points. Our approach is novel in that, while it provides a theoretical upper bound on the average query time, it avoids the use of complicated data structures, making it an effective candidate for practical applications. © Springer-Verlag Berlin Heidelberg 2003
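A one-dimensional sketch of the dual-space view: an object moving as x(t) = x0 + v·t corresponds to the dual point (v, x0), and the query below is answered by direct interval intersection rather than the paper's partitioned, B-tree-indexed dual space.

```python
# One-dimensional illustration of the dual-space reduction: the query
# "report objects passing through [a, b] during [t1, t2]" is answered here by
# intersecting the object's position interval with [a, b].

def passes_through(x0, v, a, b, t1, t2):
    """Does x(t) = x0 + v*t enter [a, b] for some t in [t1, t2]?"""
    lo, hi = x0 + v * t1, x0 + v * t2
    if lo > hi:
        lo, hi = hi, lo
    return max(lo, a) <= min(hi, b)     # position interval meets [a, b]

objects = [(0.0, 1.0), (10.0, -2.0), (5.0, 0.0)]   # (x0, v) pairs
query = (3.0, 6.0, 1.0, 2.0)                        # a, b, t1, t2
print([i for i, (x0, v) in enumerate(objects) if passes_through(x0, v, *query)])  # [1, 2]
```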
Searching edges in the overlap of two plane graphs
Consider a pair of plane straight-line graphs, whose edges are colored red
and blue, respectively, and let n be the total complexity of both graphs. We
present an O(n log n)-time, O(n)-space technique to preprocess such a pair of graphs that enables efficient searches among the red-blue intersections along
edges of one of the graphs. Our technique has a number of applications to
geometric problems. This includes: (1) a solution to the batched red-blue
search problem [Dehne et al. 2006] in O(n log n) queries to the oracle; (2) an
algorithm to compute the maximum vertical distance between a pair of 3D
polyhedral terrains one of which is convex in O(n log n) time, where n is the
total complexity of both terrains; (3) an algorithm to construct the Hausdorff
Voronoi diagram of a family of point clusters in the plane in O((n+m) log^3 n)
time and O(n+m) space, where n is the total number of points in all clusters
and m is the number of crossings between all clusters; (4) an algorithm to
construct the farthest-color Voronoi diagram of the corners of n axis-aligned
rectangles in O(n log^2 n) time; (5) an algorithm to solve the stabbing circle
problem for n parallel line segments in the plane in optimal O(n log n) time.
All these results are new or improve on the best known algorithms.Comment: 22 pages, 6 figures
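To make the searched objects concrete, a brute-force enumeration of the red-blue crossings along a single red edge; the paper's preprocessing supports such searches via an oracle without listing all crossings.

```python
# Brute-force listing of red-blue crossings along one red edge, ordered along it.
# This O(n^2)-style scan is only for illustration.

def seg_intersection(p, q, r, s):
    """Return the intersection point of segments pq and rs, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p, q, r, s
    d = (x2 - x1) * (y4 - y3) - (y2 - y1) * (x4 - x3)
    if d == 0:
        return None                       # parallel (collinear overlap ignored here)
    t = ((x3 - x1) * (y4 - y3) - (y3 - y1) * (x4 - x3)) / d
    u = ((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1)) / d
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def crossings_along(red_edge, blue_edges):
    """All intersections of one red edge with the blue edges, ordered along the red edge."""
    p, q = red_edge
    pts = [ip for e in blue_edges if (ip := seg_intersection(p, q, *e)) is not None]
    pts.sort(key=lambda ip: (ip[0] - p[0]) ** 2 + (ip[1] - p[1]) ** 2)
    return pts

red_edge = ((0, 0), (4, 0))
blue_edges = [((1, -1), (1, 1)), ((3, 1), (3, -1)), ((5, -1), (5, 1))]
print(crossings_along(red_edge, blue_edges))   # [(1.0, 0.0), (3.0, 0.0)]
```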
Noise Thresholds for Higher Dimensional Systems using the Discrete Wigner Function
For a quantum computer acting on d-dimensional systems, we analyze the
computational power of circuits wherein stabilizer operations are perfect and
we allow access to imperfect non-stabilizer states or operations. If the noise
rate affecting the non-stabilizer resource is sufficiently high, then these
states and operations can become simulable in the sense of the Gottesman-Knill
theorem, reducing the overall power of the circuit to no better than classical.
In this paper we find the depolarizing noise rate at which this happens, and
consequently the most robust non-stabilizer states and non-Clifford gates. In
doing so, we make use of the discrete Wigner function and derive facets of the
so-called qudit Clifford polytope, i.e., the inequalities defining the convex
hull of all qudit Clifford gates. Our results for robust states are provably
optimal. For robust gates we find a critical noise rate that, as dimension
increases, rapidly approaches the theoretical optimum of 100%. Some
connections with the question of qudit magic state distillation are discussed.Comment: 14 pages, 1 table; Minor changes vs. version
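A small sketch of the depolarizing noise model referred to above, applied to a qudit density matrix (ρ → (1−p)ρ + pI/d); the dimension and noise rate are illustrative, not the critical values derived in the paper.

```python
import numpy as np

# Depolarizing noise on a d-dimensional system: with probability p the state is
# replaced by the maximally mixed state I/d. The critical p is what the paper
# computes; the values below are only an illustration.

def depolarize(rho, p):
    """Apply depolarizing noise of rate p to a density matrix rho."""
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

d = 3                                    # a qutrit
psi = np.zeros(d); psi[0] = 1.0          # pure state |0><0|
rho = np.outer(psi, psi)
print(np.round(depolarize(rho, 0.5), 3)) # diagonal (0.667, 0.167, 0.167)
```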
Ear-clipping Based Algorithms of Generating High-quality Polygon Triangulation
A basic and an improved ear-clipping-based algorithm for triangulating simple polygons and polygons with holes are presented. In the basic version, the ear with the smallest interior angle is always selected to be clipped, in order to create fewer sliver triangles. To further reduce sliver triangles, an angle bound is set to determine whether a newly formed triangle has sharp angles, and edge swapping is applied when it does. To apply the two algorithms to polygons with holes, "Bridge" edges are created to transform a polygon with holes into a degenerate polygon that can be triangulated by the two algorithms. Applications show that the basic algorithm avoids creating sliver triangles and obtains better triangulations than the traditional ear clipping algorithm, and that the improved algorithm further reduces sliver triangles effectively. Both algorithms run in O(n^2) time and O(n) space.Comment: Proceedings of the 2012 International Conference on Information
Technology and Software Engineering Lecture Notes in Electrical Engineering
Volume 212, 2013, pp 979-98
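A minimal ear-clipping sketch that, following the basic variant described above, always clips the ear with the smallest interior angle; it assumes a simple polygon given in counter-clockwise order, with no holes and no collinear degeneracies.

```python
import math

# Minimal ear clipping for a simple CCW polygon (no holes, no degeneracies).
# Among all current ears, the one with the smallest interior angle is clipped first.

def cross(o, a, b):
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def in_triangle(p, a, b, c):
    return cross(a, b, p) >= 0 and cross(b, c, p) >= 0 and cross(c, a, p) >= 0

def angle(prev, cur, nxt):
    v1 = (prev[0]-cur[0], prev[1]-cur[1])
    v2 = (nxt[0]-cur[0], nxt[1]-cur[1])
    dot = v1[0]*v2[0] + v1[1]*v2[1]
    return math.acos(dot / (math.hypot(*v1) * math.hypot(*v2)))

def is_ear(poly, i):
    a, b, c = poly[i-1], poly[i], poly[(i+1) % len(poly)]
    if cross(a, b, c) <= 0:                      # reflex vertex, not an ear
        return False
    return not any(in_triangle(p, a, b, c)
                   for p in poly if p not in (a, b, c))

def triangulate(poly):
    poly, triangles = list(poly), []
    while len(poly) > 3:
        ears = [i for i in range(len(poly)) if is_ear(poly, i)]
        i = min(ears, key=lambda i: angle(poly[i-1], poly[i], poly[(i+1) % len(poly)]))
        triangles.append((poly[i-1], poly[i], poly[(i+1) % len(poly)]))
        poly.pop(i)
    triangles.append(tuple(poly))
    return triangles

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
print(triangulate(square))   # two triangles covering the square
```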
On colouring point visibility graphs
In this paper we show that it can be decided in polynomial time whether or
not the visibility graph of a given point set is 4-colourable, and such a
4-colouring, if it exists, can also be constructed in polynomial time. We show
that the problem of deciding whether the visibility graph of a point set is
5-colourable, is NP-complete. We give an example of a point visibility graph
that has chromatic number 6 while its clique number is only 4.
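A brute-force sketch that builds the visibility graph of a point set (two points are adjacent iff no third input point lies strictly between them) and greedily colours it; the greedy colouring only gives an upper bound and is not the polynomial-time 4-colouring procedure of the paper.

```python
from itertools import combinations

# Build a point visibility graph by brute force, then greedily colour it.

def collinear_between(p, a, b):
    """Is p strictly inside segment ab (and collinear with it)?"""
    cr = (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0])
    if cr != 0:
        return False
    dot = (p[0]-a[0])*(b[0]-a[0]) + (p[1]-a[1])*(b[1]-a[1])
    return 0 < dot < (b[0]-a[0])**2 + (b[1]-a[1])**2

def visibility_graph(points):
    edges = set()
    for a, b in combinations(range(len(points)), 2):
        blocked = any(collinear_between(points[c], points[a], points[b])
                      for c in range(len(points)) if c not in (a, b))
        if not blocked:
            edges.add((a, b))
    return edges

def greedy_colouring(n, edges):
    adj = [set() for _ in range(n)]
    for a, b in edges:
        adj[a].add(b); adj[b].add(a)
    colour = {}
    for v in range(n):
        used = {colour[u] for u in adj[v] if u in colour}
        colour[v] = next(c for c in range(n) if c not in used)
    return colour

points = [(0, 0), (1, 0), (2, 0), (0, 1)]   # (1,0) blocks (0,0)-(2,0)
edges = visibility_graph(points)
print(sorted(edges))
print(greedy_colouring(len(points), edges))
```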