    Constant-Factor Approximation for TSP with Disks

    We revisit the traveling salesman problem with neighborhoods (TSPN) and present the first constant-ratio approximation for disks in the plane: given a set of n disks in the plane, a TSP tour whose length is at most O(1) times the optimal can be computed in time polynomial in n. Our result is the first constant-ratio approximation for a class of planar convex bodies of arbitrary size and arbitrary intersections. To achieve an O(1)-approximation, we reduce the traveling salesman problem with disks, up to constant factors, to a minimum-weight hitting set problem in a geometric hypergraph. The connection between TSPN and hitting sets in geometric hypergraphs, established here, is likely to have future applications. Comment: 14 pages, 3 figures.
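
    The abstract gives no pseudocode, so the sketch below is only a naive baseline for TSP with disks, not the paper's constant-factor algorithm (its hitting-set reduction is not reproduced here): visit the disks in nearest-neighbor order of their centers, then snap each visit to the point of its disk closest to the previous stop. Disks are assumed to be (cx, cy, r) tuples; this heuristic carries no approximation guarantee.

        import math

        def naive_tspn_disks(disks):
            """Crude TSPN baseline: nearest-neighbor on centers, then snap."""
            # Order disks greedily by center-to-center distance.
            rest = list(disks)
            order = [rest.pop(0)]
            while rest:
                px, py, _ = order[-1]
                rest.sort(key=lambda d: math.hypot(d[0] - px, d[1] - py))
                order.append(rest.pop(0))
            # Replace each center by the disk point nearest the previous stop.
            tour = [order[0][:2]]
            for cx, cy, r in order[1:]:
                px, py = tour[-1]
                dx, dy = px - cx, py - cy
                dist = math.hypot(dx, dy)
                if dist <= r:
                    tour.append((px, py))  # previous stop already lies in the disk
                else:
                    tour.append((cx + dx * r / dist, cy + dy * r / dist))
            return tour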

    On trip planning queries in spatial databases

    In this paper we discuss a new type of query in spatial databases, called the Trip Planning Query (TPQ). Given a set of points P in space, where each point belongs to a category, and given two points s and e, TPQ asks for the best trip that starts at s, passes through exactly one point from each category, and ends at e. An example of a TPQ is when a user wants to visit a set of different places while minimizing the total travelling cost, e.g., what is the shortest travelling plan for me to visit an automobile shop, a CVS pharmacy outlet, and a Best Buy shop along my trip from A to B? The trip planning query is an extension of the well-known TSP problem and is therefore NP-hard. The difficulty of this query lies in the existence of multiple choices for each category. In this paper, we first study fast approximation algorithms for the trip planning query in a metric space, assuming that the data set fits in main memory, and give a theoretical analysis of their approximation bounds. Then, the trip planning query is examined for data sets that do not fit in main memory and must be stored on disk. For the disk-resident data, we consider two cases. In one case, we assume that the points are located in Euclidean space and indexed with an R-tree. In the other case, we consider the problem of points that lie on the edges of a spatial network (e.g., a road network), where the distance between two points is defined as the shortest distance over the network. Finally, we give an experimental evaluation of the proposed algorithms using synthetic data sets generated on real road networks.
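
    As a concrete illustration of the in-memory setting, here is a minimal greedy sketch of a TPQ heuristic, assuming Euclidean distances and one list of candidate points per category; it is a natural nearest-neighbor baseline of the kind the abstract analyzes, not necessarily the paper's exact algorithm.

        import math

        def greedy_tpq(s, e, candidates):
            """Greedy TPQ: repeatedly hop to the nearest point of any unvisited category.

            candidates: dict mapping category name -> list of (x, y) points.
            """
            trip, cur = [s], s
            remaining = dict(candidates)
            while remaining:
                cat, pt = min(
                    ((c, p) for c, pts in remaining.items() for p in pts),
                    key=lambda cp: math.dist(cur, cp[1]),
                )
                trip.append(pt)
                cur = pt
                del remaining[cat]  # each category is visited exactly once
            trip.append(e)
            return trip

        # Example: a trip from s to e through one shop of each hypothetical kind.
        # trip = greedy_tpq((0, 0), (10, 0),
        #                   {"pharmacy": [(2, 1), (8, 3)], "electronics": [(5, -1)]})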

    Pants Decomposition of the Punctured Plane

    A pants decomposition of an orientable surface S is a collection of simple cycles that partition S into pants, i.e., surfaces of genus zero with three boundary cycles. Given a set P of n points in the plane, we consider the problem of computing a pants decomposition, of minimum total length, of the surface S which is the plane minus P. We give a polynomial-time approximation scheme using Mitchell's guillotine rectilinear subdivisions. We give a quartic-time algorithm to compute the shortest pants decomposition of S when the cycles are restricted to be axis-aligned boxes, and a quadratic-time algorithm when all the points lie on a line; both exact algorithms use dynamic programming with Yao's speedup. Comment: 5 pages, 1 grayscale figure.
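
    The abstract does not state the recurrences, but "dynamic programming with Yao's speedup" refers to the classic quadrangle-inequality trick that restricts the split point of an interval DP to the range between the optima of its neighboring subproblems, cutting the cubic DP to quadratic time. The sketch below illustrates the technique on a standard interval problem (minimum-cost merging of adjacent weights), not on the pants-decomposition recurrence itself.

        def knuth_yao_merge(weights):
            """O(n^2) interval DP via Yao's speedup (illustrative, not the paper's DP).

            dp[i][j] = min cost to merge weights[i..j]; opt[i][j] = best split point,
            monotone by the quadrangle inequality: opt[i][j-1] <= opt[i][j] <= opt[i+1][j].
            """
            n = len(weights)
            if n == 0:
                return 0
            prefix = [0] * (n + 1)
            for i, w in enumerate(weights):
                prefix[i + 1] = prefix[i] + w
            dp = [[0] * n for _ in range(n)]
            opt = [[0] * n for _ in range(n)]
            for i in range(n):
                opt[i][i] = i
            for length in range(2, n + 1):
                for i in range(n - length + 1):
                    j = i + length - 1
                    dp[i][j] = float("inf")
                    # Yao's speedup: only scan splits between the neighbors' optima.
                    for k in range(opt[i][j - 1], min(opt[i + 1][j], j - 1) + 1):
                        cand = dp[i][k] + dp[k + 1][j] + prefix[j + 1] - prefix[i]
                        if cand < dp[i][j]:
                            dp[i][j], opt[i][j] = cand, k
            return dp[0][n - 1]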

    Certified Computation from Unreliable Datasets

    A wide range of learning tasks require human input in labeling massive data. The collected data, though, are usually of low quality and contain inaccuracies and errors. As a result, modern science and business face the problem of learning from unreliable data sets. In this work, we provide a generic approach that is based on verification of only a few records of the data set to guarantee high-quality learning outcomes for various optimization objectives. Our method identifies small sets of critical records and verifies their validity. We show that many problems only need poly(1/ε) verifications to ensure that the output of the computation is at most a factor of (1 ± ε) away from the truth. For any given instance, we provide an instance-optimal solution that verifies the minimum possible number of records to approximately certify correctness. Using this instance-optimal formulation of the problem, we prove our main result: "every function that satisfies some Lipschitz continuity condition can be certified with a small number of verifications". We show that the required Lipschitz continuity condition is satisfied even by some NP-complete problems, which illustrates the generality and importance of this theorem. In case this certification step fails, an invalid record will be identified. Removing these records and repeating until success guarantees that the result will be accurate and will depend only on the verified records. Surprisingly, as we show, for several computation tasks more efficient methods are possible. These methods always guarantee that the produced result is not affected by the invalid records, since any invalid record that affects the output will be detected and verified.
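
    A minimal sketch of the verify-and-repeat loop described above; the helper names (solve, critical_records, verify) are hypothetical placeholders rather than the paper's API: solve computes the output from the current records, critical_records returns the small set of records the certificate depends on, and verify is the expensive ground-truth check.

        def certified_compute(records, solve, critical_records, verify):
            """Repeat: solve, verify only the critical records, drop any invalid ones."""
            records = list(records)
            while True:
                output = solve(records)
                suspects = critical_records(records, output)  # small set, e.g. poly(1/eps) records
                invalid = [r for r in suspects if not verify(r)]
                if not invalid:
                    return output  # certified: the output depends only on verified records
                records = [r for r in records if r not in invalid]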