
    Applications of incidence bounds in point covering problems

    In the Line Cover problem a set of n points is given and the task is to cover the points using either the minimum number of lines or at most k lines. In Curve Cover, a generalization of Line Cover, the task is to cover the points using curves with d degrees of freedom. Another generalization is the Hyperplane Cover problem where points in d-dimensional space are to be covered by hyperplanes. All these problems have kernels of polynomial size, where the parameter is the minimum number of lines, curves, or hyperplanes needed. First we give a non-parameterized algorithm for both problems in O*(2^n) (where the O*(.) notation hides polynomial factors of n) time and polynomial space, beating a previous exponential-space result. Combining this with incidence bounds similar to the famous Szemeredi-Trotter bound, we present a Curve Cover algorithm with running time O*((Ck/log k)^((d-1)k)), where C is some constant. Our result improves the previous best times O*((k/1.35)^k) for Line Cover (where d=2), O*(k^(dk)) for general Curve Cover, as well as a few other bounds for covering points by parabolas or conics. We also present an algorithm for Hyperplane Cover in R^3 with running time O*((Ck^2/log^(1/5) k)^k), improving on the previous time of O*((k^2/1.3)^k).
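
    As context for these parameterized running times, the classic branching idea for Line Cover is easy to state: as long as more than k points remain, the line covering any fixed point can be assumed to pass through a second remaining point, so one can branch over all such candidate lines. Below is a minimal Python sketch of that baseline, not of the paper's algorithm; function names and the integer-coordinate assumption are illustrative, and the sketch omits the standard k^2-point kernel that bounds the branching factor by O(k^2) and yields the k^(O(k))-type running times the abstract improves on.

```python
def collinear(p, q, r):
    """True if the integer-coordinate points p, q, r lie on one line."""
    (x1, y1), (x2, y2), (x3, y3) = p, q, r
    return (x2 - x1) * (y3 - y1) == (y2 - y1) * (x3 - x1)

def line_cover(points, k):
    """Decide whether `points` can be covered by at most k lines.

    Classic exhaustive branching: if more than k points remain, the line
    covering points[0] may be assumed to contain a second remaining point,
    so it suffices to try every line through points[0] and another point.
    """
    points = list(points)
    if len(points) <= k:       # one private line per remaining point suffices
        return True
    if k == 0:
        return False
    p = points[0]
    for q in points[1:]:       # branch over the line through p and q
        rest = [r for r in points if not collinear(p, q, r)]
        if line_cover(rest, k - 1):
            return True
    return False

pts = [(0, 0), (1, 1), (2, 2), (0, 1), (0, 2), (5, 7)]
print(line_cover(pts, 2), line_cover(pts, 3))   # -> False True
```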

    A Bichromatic Incidence Bound and an Application

    We prove a new, tight upper bound on the number of incidences between points and hyperplanes in Euclidean d-space. Given n points, of which k are colored red, there are O_d(m^{2/3}k^{2/3}n^{(d-2)/3} + kn^{d-2} + m) incidences between the k red points and m hyperplanes spanned by all n points provided that m = \Omega(n^{d-2}). For the monochromatic case k = n, this was proved by Agarwal and Aronov. We use this incidence bound to prove that a set of n points, no more than n-k of which lie on any plane or two lines, spans \Omega(nk^2) planes. We also provide an infinite family of counterexamples to a conjecture of Purdy's on the number of hyperplanes spanned by a set of points in dimensions higher than 3, and present new conjectures not subject to the counterexample.
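
    A quick sanity check using only the statement above: specializing the bichromatic bound at k = n recovers the monochromatic bound of Agarwal and Aronov mentioned in the abstract. Writing I(P, H) for the number of incidences (our notation):

```latex
% Set k = n in O_d(m^{2/3} k^{2/3} n^{(d-2)/3} + k n^{d-2} + m):
I(P, H) = O_d\!\left(m^{2/3} n^{2/3} n^{(d-2)/3} + n \cdot n^{d-2} + m\right)
        = O_d\!\left(m^{2/3} n^{d/3} + n^{d-1} + m\right).
```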

    Incidence Geometries and the Pass Complexity of Semi-Streaming Set Cover

    Set cover, over a universe of size n, may be modelled as a data-streaming problem, where the m sets that comprise the instance are to be read one by one. A semi-streaming algorithm is allowed only O(n\,\mathrm{poly}\{\log n, \log m\}) space to process this stream. For each p \ge 1, we give a very simple deterministic algorithm that makes p passes over the input stream and returns an appropriately certified (p+1)n^{1/(p+1)}-approximation to the optimum set cover. More importantly, we proceed to show that this approximation factor is essentially tight, by showing that a factor better than 0.99\,n^{1/(p+1)}/(p+1)^2 is unachievable for a p-pass semi-streaming algorithm, even allowing randomisation. In particular, this implies that achieving a \Theta(\log n)-approximation requires \Omega(\log n/\log\log n) passes, which is tight up to the \log\log n factor. These results extend to a relaxation of the set cover problem where we are allowed to leave an \varepsilon fraction of the universe uncovered: the tight bounds on the best approximation factor achievable in p passes turn out to be \Theta_p(\min\{n^{1/(p+1)}, \varepsilon^{-1/p}\}). Our lower bounds are based on a construction of a family of high-rank incidence geometries, which may be thought of as vast generalisations of affine planes. This construction, based on algebraic techniques, appears flexible enough to find other applications and is therefore interesting in its own right.
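
    For intuition about where the n^{1/(p+1)} trade-off comes from, here is a hedged Python sketch of a multi-pass threshold greedy in this spirit. The specific thresholds, the final cleanup pass, and the streaming interface are illustrative assumptions, not the paper's certified algorithm.

```python
def multipass_threshold_cover(stream_sets, n, p):
    """Illustrative multi-pass threshold greedy for set cover.

    stream_sets: callable returning a fresh iterator over the input sets,
                 one call per pass (mimicking repeated passes over a stream).
    n:           universe size; the universe is assumed to be {0, ..., n-1}.
    p:           number of threshold passes.

    In pass j a set is taken if it covers at least n^(1 - j/(p+1)) still-
    uncovered elements; one extra pass covers the leftovers with any set
    containing them. Memory is dominated by the O(n)-size set `uncovered`
    plus the indices of the chosen sets.
    """
    uncovered = set(range(n))
    cover = []
    for j in range(1, p + 1):
        threshold = n ** (1.0 - j / (p + 1.0))
        for i, s in enumerate(stream_sets()):
            if len(uncovered & s) >= threshold:
                cover.append(i)
                uncovered -= s
    for i, s in enumerate(stream_sets()):   # cleanup pass for stragglers
        if uncovered & s:
            cover.append(i)
            uncovered -= s
        if not uncovered:
            break
    return cover

sets = [{0, 1, 2, 3}, {3, 4}, {5}, {4, 5}]
print(multipass_threshold_cover(lambda: iter(sets), n=6, p=2))   # -> [0, 3]
```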

    Dominating sets in projective planes

    We describe small dominating sets of the incidence graphs of finite projective planes by establishing a stability result which shows that dominating sets are strongly related to blocking and covering sets. Our main result states that if a dominating set in a projective plane of order q>81 is smaller than 2q+2[\sqrt{q}]+2 (i.e., twice the size of a Baer subplane), then it contains either all but possibly one point of a line or all but possibly one line through a point. Furthermore, we completely characterize dominating sets of size at most 2q+\sqrt{q}+1. In Desarguesian planes, we could rely on strong stability results on blocking sets to show that if a dominating set is sufficiently smaller than 3q, then it consists of the union of a blocking set and a covering set apart from a few points and lines.
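
    For scale, the threshold 2q+2[\sqrt{q}]+2 sits just above the classical dominating sets of size 2q+2: all points of one line together with all lines through one of its points dominate the incidence graph. The small check below verifies this construction on the Fano plane (q = 2); the labelling of points and lines is our own.

```python
# The Fano plane PG(2,2): 7 points, 7 lines, every line has q + 1 = 3 points.
points = range(7)
lines = [frozenset(l) for l in
         [{0, 1, 2}, {0, 3, 4}, {0, 5, 6}, {1, 3, 5}, {1, 4, 6}, {2, 3, 6}, {2, 4, 5}]]

# Dominating set of size 2q + 2 = 6 in the incidence graph:
# all points of the line {0, 1, 2} plus all lines through the point 0.
dom_points = set(lines[0])
dom_lines = {l for l in lines if 0 in l}

# Every line meets {0, 1, 2} (any two lines of a projective plane meet),
# and every point lies on some line through 0, so both sides are dominated.
lines_ok = all(l in dom_lines or l & dom_points for l in lines)
points_ok = all(p in dom_points or any(p in l for l in dom_lines) for p in points)
print(len(dom_points) + len(dom_lines), lines_ok and points_ok)   # -> 6 True
```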

    An Improved Point-Line Incidence Bound Over Arbitrary Fields

    We prove a new upper bound for the number of incidences between points and lines in a plane over an arbitrary field \mathbb{F}, a problem first considered by Bourgain, Katz and Tao. Specifically, we show that m points and n lines in \mathbb{F}^2, with m^{7/8}<n<m^{8/7}, determine at most O(m^{11/15}n^{11/15}) incidences (where, if \mathbb{F} has positive characteristic p, we assume m^{-2}n^{13}\ll p^{15}). This improves on the previous best known bound, due to Jones. To obtain our bound, we first prove an optimal point-line incidence bound on Cartesian products, using a reduction to a point-plane incidence bound of Rudnev. We then cover most of the point set with Cartesian products, and we bound the incidences on each product separately, using the bound just mentioned. We give several applications, to sum-product-type problems, an expander problem of Bourgain, the distinct distance problem and Beck's theorem.
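
    To place the exponent, a worked specialization at m = n, which lies inside the stated range since n^{7/8} < n < n^{8/7} for n > 1; the comparison with the real Szemeredi-Trotter exponent is ours:

```latex
% Balanced case m = n:
I(n \text{ points}, n \text{ lines}) = O\!\left(n^{11/15} \cdot n^{11/15}\right) = O\!\left(n^{22/15}\right),
% under the characteristic-p condition m^{-2} n^{13} \ll p^{15}, i.e. n^{11} \ll p^{15}.
% Over the reals, Szemeredi--Trotter gives exponent 4/3 = 20/15 < 22/15.
```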

    The composition of semi-finished inventories at a solid board plant

    A solid board factory produces rectangular sheets of cardboard in two different formats, namely large formats and small formats. The production process consists of two stages separated by an inventory point. In the first stage a cardboard machine produces the large formats. In the second stage a part of the large formats is cut into small formats by a separate rotary cut machine. Due to very large setup times, technical restrictions, and trim losses, the cardboard machine is not able to produce these small formats. The company follows two policies to satisfy customer demands for rotary cut format orders. Under the first policy, an ‘optimal’ large format (with respect to trim loss) is determined and produced on the cardboard machine for each customer order. Under the second policy, a stock of a restricted number of large formats is determined in such a way that the expected trim loss is minimal; a rotary cut format order then uses the most suitable standard large format from this stock. Currently, the dimensions of the standard large formats in the semi-finished inventory are chosen intuitively, with an emphasis on minimizing trim losses. From the trim loss perspective it is most efficient to produce each rotary cut format from a specific large format. On the other hand, if there is only one large format per caliper, the variety is minimal, but the trim loss might be unacceptably high. On average, the first policy results in a lower trim loss. In order to use the two machines efficiently and to meet customers' due dates, the company applies both policies. In this paper we concentrate on the second policy, taking into account the various objectives and restrictions of the company: the company wants to keep the number of different large formats small while keeping trim loss acceptable. The problem is formulated as a minimum clique covering problem with alternatives (MCCA), which is presumed to be NP-hard. We solve the problem using an appropriate heuristic, which is built into a decision support system. Based on a set of real data, the actual composition of the semi-finished inventories is determined. The paper concludes with computational experiments.
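
    As an illustration only (the company's actual heuristic, data and cutting constraints are not specified here), the selection of a restricted stock of large formats can be sketched as a greedy covering heuristic: repeatedly add the candidate large format that serves the most still-unassigned rotary cut formats within a trim-loss tolerance. The guillotine-grid trim-loss model and all names below are our assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Format:
    width: float
    height: float

def trim_loss(small: Format, large: Format) -> float:
    """Fraction of the large sheet wasted when cut into copies of `small`
    on a simple guillotine grid without rotation (illustrative model)."""
    copies = int(large.width // small.width) * int(large.height // small.height)
    if copies == 0:
        return 1.0
    return 1.0 - copies * small.width * small.height / (large.width * large.height)

def choose_stock_formats(orders, candidates, max_formats, max_loss):
    """Greedy heuristic: repeatedly pick the candidate large format serving
    the most still-unassigned orders with trim loss <= max_loss, until
    max_formats formats are chosen or no candidate serves a new order."""
    unassigned = set(orders)
    chosen, assignment = [], {o: None for o in orders}
    while unassigned and len(chosen) < max_formats:
        best, best_served = None, []
        for large in candidates:
            served = [o for o in unassigned if trim_loss(o, large) <= max_loss]
            if len(served) > len(best_served):
                best, best_served = large, served
        if not best_served:
            break
        chosen.append(best)
        for o in best_served:
            assignment[o] = best
            unassigned.discard(o)
    return chosen, assignment
```

    In clique-covering terms, a group of orders that can share one stock format corresponds, roughly, to a clique, and the "alternatives" reflect that an order may be compatible with several candidate formats; this mapping is our gloss on MCCA, not the paper's formal definition.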