
    Optimal Base Station Placement: A Stochastic Method Using Interference Gradient In Downlink Case

    In this paper, we study the optimal placement and optimal number of base stations added to an existing wireless data network through the interference gradient method. The proposed method considers a sub-region of the existing wireless data network, hereafter called the region of interest. In this region, the provider wants to increase the network coverage and the users' throughput. To this end, the provider needs to determine the optimal number of base stations to be added and their optimal placement. The proposed approach is based on the Delaunay triangulation of the region of interest and gradient descent within each triangle to compute the minimum-interference locations. We quantify the resulting increase in coverage and throughput.
    Comment: This work was presented at the 5th International ICST Conference on Performance Evaluation Methodologies and Tools (Valuetools 2011).
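    To make the per-triangle step concrete, here is a minimal Python (NumPy) sketch. It assumes a simple power-law path-loss model as a stand-in for the paper's interference function (the exponent `alpha`, transmit power `p_tx`, and the backtracking projection are our assumptions, not the authors' choices) and runs gradient descent constrained to one Delaunay triangle:

```python
import numpy as np

def interference(p, stations, p_tx=1.0, alpha=3.5, eps=1e-3):
    """Total received power at point p from all existing base stations,
    under an assumed power-law path-loss model."""
    d2 = np.sum((stations - p) ** 2, axis=1) + eps  # eps avoids division by zero
    return float(np.sum(p_tx * d2 ** (-alpha / 2)))

def interference_grad(p, stations, p_tx=1.0, alpha=3.5, eps=1e-3):
    """Gradient of the interference field above with respect to p."""
    diff = stations - p
    d2 = np.sum(diff ** 2, axis=1) + eps
    coef = p_tx * alpha * d2 ** (-alpha / 2 - 1)
    return coef @ diff  # = alpha * sum_i p_tx * d2_i^(-alpha/2-1) * (s_i - p)

def barycentric(p, tri):
    """Barycentric coordinates of p in triangle tri (3x2 array)."""
    a, b, c = tri
    l1, l2 = np.linalg.solve(np.column_stack((b - a, c - a)), p - a)
    return np.array([1.0 - l1 - l2, l1, l2])

def min_interference_in_triangle(tri, stations, lr=0.05, iters=300):
    """Gradient descent from the centroid; steps that would leave the
    triangle are halved until they stay inside (a crude projection)."""
    p = tri.mean(axis=0)
    for _ in range(iters):
        g = interference_grad(p, stations)
        step = lr
        while step > 1e-12:
            q = p - step * g
            if np.all(barycentric(q, tri) >= -1e-9):
                p = q
                break
            step *= 0.5
    return p, interference(p, stations)
```

    In practice one would build the triangulation with scipy.spatial.Delaunay over the existing base station positions, run this search in each triangle of the region of interest, and rank the candidate sites by their residual interference.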

    On a Linear Program for Minimum-Weight Triangulation

    Minimum-weight triangulation (MWT) is NP-hard. It has a polynomial-time constant-factor approximation algorithm, and a variety of effective polynomial-time heuristics that, for many instances, can find the exact MWT. Linear programs (LPs) for MWT are well studied, but previously no connection was known between any LP and any approximation algorithm or heuristic for MWT. Here we show the first such connections for an LP formulation due to Dantzig et al. (1985): (i) the integrality gap is bounded by a constant; (ii) given any instance, if the aforementioned heuristics find the MWT, then so does the LP.
    Comment: To appear in SICOMP. An extended abstract appeared in SODA 2014.
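    The triangle-based LP can be sketched compactly for points in convex position. The following Python (SciPy) snippet is an illustrative reconstruction in the spirit of the Dantzig et al. formulation, not the paper's exact LP: one variable per triangle, each hull edge covered exactly once, each diagonal covered equally from both sides. The objective charges each triangle its perimeter, which counts every interior edge exactly twice and every hull edge once, so the optimum still identifies the MWT:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def mwt_lp(points):
    """Illustrative LP for MWT on points in convex position, ordered along
    the hull (an assumed simplification; the paper handles general point
    sets). One variable x_t >= 0 per triangle t."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    tris = list(itertools.combinations(range(n), 3))
    dist = lambda a, b: float(np.linalg.norm(pts[a] - pts[b]))

    # objective: triangle perimeters (interior edges end up counted twice,
    # hull edges once, uniformly, so minimizing this is equivalent to MWT)
    c = [dist(i, j) + dist(j, k) + dist(i, k) for i, j, k in tris]

    rows, rhs = [], []
    for a, b in itertools.combinations(range(n), 2):
        hull_edge = (b == a + 1) or (a == 0 and b == n - 1)
        row = np.zeros(len(tris))
        for idx, t in enumerate(tris):
            if {a, b} <= set(t):
                v = (set(t) - {a, b}).pop()           # third vertex of t
                same_side = hull_edge or (a < v < b)  # side of diagonal ab
                row[idx] = 1.0 if same_side else -1.0
        rows.append(row)
        rhs.append(1.0 if hull_edge else 0.0)  # hull edge: covered once;
                                               # diagonal: both sides equal
    return linprog(c, A_eq=np.vstack(rows), b_eq=rhs, bounds=(0, None))

sol = mwt_lp([(0, 0), (2, 0), (2, 1), (0, 1)])  # a 2x1 rectangle
print(round(sol.fun, 3))  # hull perimeter + twice the chosen diagonal
```

    For general (non-convex) point sets the formulation must restrict to empty triangles and the side constraints become more delicate; the sketch is only meant to show the shape of the LP whose integrality gap the paper bounds.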

    Optimizing the double description method for normal surface enumeration

    Many key algorithms in 3-manifold topology involve the enumeration of normal surfaces, which is based upon the double description method for finding the vertices of a convex polytope. Typically we are only interested in a small subset of these vertices, thus opening the way for substantial optimization. Here we give an account of the vertex enumeration problem as it applies to normal surfaces, and present new optimizations that yield strong improvements in both running time and memory consumption. The resulting algorithms are tested using the freely available software package Regina.
    Comment: 27 pages, 12 figures; v2: removed the 3^n bound from Section 3.3, fixed the projective equation in Lemma 4.4, clarified "most triangulations" in the introduction to Section 5; v3: replaced -ise with -ize for Mathematics of Computation (note that this changes the title of the paper).
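    For readers unfamiliar with the underlying machinery, here is a barebones Python sketch of the core double description step: intersecting a generated cone with one half-space at a time. It deliberately skips the adjacency test, so the output may contain (many) redundant rays, which is precisely the combinatorial blow-up that serious implementations, and the optimizations in this paper, work to avoid:

```python
import numpy as np

def dd_generators(A, tol=1e-9):
    """Double description without minimality: intersect a generated cone
    with the half-spaces a.x >= 0, one at a time, to obtain a (possibly
    redundant) set of generators of {x : A x >= 0}."""
    A = np.asarray(A, dtype=float)
    d = A.shape[1]
    rays = list(np.vstack((np.eye(d), -np.eye(d))))  # generates all of R^d
    for a in A:
        vals = [float(a @ r) for r in rays]
        keep = [r for r, v in zip(rays, vals) if v >= -tol]
        # each positive/negative pair yields a ray on the hyperplane a.x = 0:
        # a @ (vp * rn - vn * rp) = vp*vn - vn*vp = 0, with nonnegative weights
        new = [vp * rn - vn * rp
               for rp, vp in zip(rays, vals) if vp > tol
               for rn, vn in zip(rays, vals) if vn < -tol]
        rays = keep + [r / np.linalg.norm(r)
                       for r in new if np.linalg.norm(r) > tol]
    return rays

# tiny check: the cone {x >= 0, y >= 0} in the plane has rays e1 and e2
for r in dd_generators(np.array([[1.0, 0.0], [0.0, 1.0]])):
    print(np.round(r, 3))
```

    Production codes keep the ray set minimal with a zero-set adjacency test between each positive/negative pair; the optimizations described in the paper go further by exploiting that only a small subset of the final vertices is of interest in the normal surface setting.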

    Drawing Planar Graphs with Few Geometric Primitives

    We define the \emph{visual complexity} of a plane graph drawing to be the number of basic geometric objects needed to represent all its edges. In particular, one object may represent multiple edges (e.g., one needs only one line segment to draw a path with an arbitrary number of edges). Let $n$ denote the number of vertices of a graph. We show that trees can be drawn with $3n/4$ straight-line segments on a polynomial grid, and with $n/2$ straight-line segments on a quasi-polynomial grid. Further, we present an algorithm for drawing planar 3-trees with $(8n-17)/3$ segments on an $O(n) \times O(n^2)$ grid. This algorithm can also be used with a small modification to draw maximal outerplanar graphs with $3n/2$ edges on an $O(n) \times O(n^2)$ grid. We also study the problem of drawing maximal planar graphs with circular arcs and provide an algorithm to draw such graphs using only $(5n-11)/3$ arcs. This is significantly smaller than the lower bound of $2n$ for line segments for a nontrivial graph class.
    Comment: Appeared in Proc. 43rd International Workshop on Graph-Theoretic Concepts in Computer Science (WG 2017).
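    As a small companion to the definition, the Python sketch below (function names are ours) counts the straight-line segments of a given drawing by merging, at each vertex, pairs of incident edges that continue in exactly opposite directions. Recall the classical fact that any straight-line drawing of a tree needs at least $\eta/2$ segments, where $\eta$ is the number of odd-degree vertices, and that this bound is achievable when the grid is unrestricted; the grid-size guarantees are what the paper adds:

```python
import numpy as np
from collections import defaultdict

def count_segments(pos, edges, tol=1e-9):
    """Visual complexity restricted to segments: each merge of two
    oppositely-directed edges at a shared vertex saves one object.
    (A fully collinear cycle would need extra care; this sketch
    targets trees and similar inputs.)"""
    pos = {v: np.asarray(p, dtype=float) for v, p in pos.items()}
    incident = defaultdict(list)
    for u, v in edges:
        incident[u].append(v)
        incident[v].append(u)
    merges = 0
    for v, nbrs in incident.items():
        dirs = [(pos[u] - pos[v]) / np.linalg.norm(pos[u] - pos[v])
                for u in nbrs]
        used = [False] * len(dirs)
        for i in range(len(dirs)):
            if used[i]:
                continue
            for j in range(i + 1, len(dirs)):
                # opposite unit directions: dot product is -1
                if not used[j] and np.dot(dirs[i], dirs[j]) < -1 + tol:
                    used[i] = used[j] = True
                    merges += 1
                    break
    return len(edges) - merges

# a path on 4 vertices drawn on one line is a single segment
pos = {0: (0, 0), 1: (1, 0), 2: (2, 0), 3: (3, 0)}
assert count_segments(pos, [(0, 1), (1, 2), (2, 3)]) == 1
```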

    \v{C}ech-Delaunay gradient flow and homology inference for self-maps

    We call a continuous self-map that reveals itself through a discrete set of point-value pairs a sampled dynamical system. Capturing the available information with chain maps on Delaunay complexes, we use persistent homology to quantify the evidence of recurrent behavior. We establish a sampling theorem to recover the eigenspace of the endomorphism on homology induced by the self-map. Using a combinatorial gradient flow arising from the discrete Morse theory for \v{C}ech and Delaunay complexes, we construct a chain map to transform the problem from the natural but expensive \v{C}ech complexes to the computationally efficient Delaunay triangulations. The fast chain map algorithm has applications beyond dynamical systems.
    Comment: 22 pages, 8 figures.
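    As a taste of the setup (not of the paper's algorithm), the following Python/SciPy sketch builds the combinatorial input: the Delaunay triangulation of the sample points together with a vertex-level surrogate of the self-map that sends each image point to its nearest sample. The nearest-sample map and the toy circle data are our assumptions for illustration:

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def sampled_system_complex(X, fX):
    """A sampled dynamical system is a set of point-value pairs (x, f(x)).
    Build the Delaunay complex on the samples and approximate f on the
    vertex set by snapping each image point to its nearest sample.
    The Cech-to-Delaunay chain map and the persistence of the induced
    endomorphism are the paper's contribution, far beyond this sketch."""
    X = np.asarray(X, dtype=float)
    tri = Delaunay(X)                    # Delaunay complex on the samples
    _, vertex_map = cKDTree(X).query(np.asarray(fX, dtype=float))
    return tri.simplices, vertex_map     # triangles + f restricted to vertices

# toy example: samples of a quarter rotation on a noisy circle
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((200, 2))
R = np.array([[0, -1], [1, 0]])          # rotation by 90 degrees
simplices, vmap = sampled_system_complex(X, X @ R.T)
```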

    Combining information seeking services into a meta supply chain of facts

    The World Wide Web has become a vital supplier of information that allows organizations to carry out such tasks as business intelligence, security monitoring, and risk assessment. From this perspective, having a quick and reliable supply of correct facts is often mission critical. Following design science guidelines, we have explored ways to recombine facts from multiple sources, each with possibly different levels of responsiveness and accuracy, into one robust supply chain. Inspired by prior research on keyword-based meta-search engines (e.g., metacrawler.com), we have adapted existing question answering algorithms for the task of analysis and triangulation of facts. We present a first prototype for a meta approach to fact seeking. Our meta engine sends a user's question to several fact seeking services that are publicly available on the Web (e.g., ask.com, brainboost.com, answerbus.com, NSIR, etc.) and analyzes the returned results jointly to identify and present to the user those that are most likely to be factually correct. The results of our evaluation on the standard test sets widely used in prior research support the following: 1) the added value of the meta approach: its performance surpasses the performance of each supplier; 2) the importance of using fact seeking services as suppliers to the meta engine rather than keyword-driven search portals; and 3) the resilience of the meta approach: eliminating a single service does not noticeably impact the overall performance. We show that these properties make the meta approach a more reliable supplier of facts than any of the currently available stand-alone services.
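    A minimal Python sketch of the fan-out-and-triangulate loop. The callables stand in for the live services named above (not assumed to still exist), and majority voting over normalized answer strings is a deliberately crude surrogate for the paper's answer-triangulation algorithms:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def meta_fact_seek(question, services, top_k=3):
    """Fan the question out to all supplier services in parallel, then
    triangulate: normalize candidate answers and let each supplier cast
    one vote per distinct answer it returned."""
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda s: s(question), services))
    votes = Counter()
    for answers in results:
        for a in set(ans.strip().lower() for ans in answers):
            votes[a] += 1          # one vote per supplier, not per mention
    return votes.most_common(top_k)

# hypothetical stand-ins for the prototype's live services
s1 = lambda q: ["Mount Everest", "K2"]
s2 = lambda q: ["mount everest"]
s3 = lambda q: ["Everest"]
print(meta_fact_seek("highest mountain on Earth?", [s1, s2, s3]))
# 'mount everest' wins, backed by two independent suppliers
```

    The parallel fan-out also illustrates the resilience property reported above: dropping any one supplier from the list changes the vote totals only marginally.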