
    Semi-Streaming Set Cover

    This paper studies the set cover problem under the semi-streaming model. The underlying set system is formalized in terms of a hypergraph G = (V, E) whose edges arrive one by one, and the goal is to construct an edge cover F ⊆ E with the objective of minimizing the cardinality (or cost in the weighted case) of F. We consider a parameterized relaxation of this problem, where given some 0 ≤ ϵ < 1, the goal is to construct an edge (1 − ϵ)-cover, namely, a subset of edges incident to all but an ϵ-fraction of the vertices (or of their benefit in the weighted case). The key limitation imposed on the algorithm is that its space is limited to (poly)logarithmically many bits per vertex. Our main result is an asymptotically tight trade-off between ϵ and the approximation ratio: we design a semi-streaming algorithm that, on input graph G, constructs a succinct data structure D such that for every 0 ≤ ϵ < 1, an edge (1 − ϵ)-cover that approximates the optimal edge (1 − ϵ)-cover within a factor of f(ϵ, n) can be extracted from D (efficiently and with no additional space requirements), where f(ϵ, n) = O(1/ϵ) if ϵ > 1/√n, and f(ϵ, n) = O(√n) otherwise. In particular, for the traditional set cover problem we obtain an O(√n)-approximation. This algorithm is proved to be best possible by establishing a family (parameterized by ϵ) of matching lower bounds. Comment: Full version of the extended abstract that will appear in Proceedings of ICALP 2014 track
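The (1 − ϵ)-cover notion can be illustrated with a simple offline greedy heuristic. This is a toy baseline only, not the paper's semi-streaming algorithm (which works in a single pass with limited space); the function and variable names are hypothetical.

```python
def greedy_partial_cover(n, edges, eps):
    """Offline greedy (1 - eps)-cover: repeatedly pick the hyperedge that
    covers the most still-uncovered vertices, stopping once at most an
    eps-fraction of the n vertices remains uncovered."""
    uncovered = set(range(n))
    target = eps * n  # number of vertices we are allowed to leave uncovered
    cover = []
    while len(uncovered) > target:
        best = max(edges, key=lambda e: len(uncovered & e))
        if not (uncovered & best):
            break  # remaining vertices cannot be covered by any edge
        cover.append(best)
        uncovered -= best
    return cover, uncovered

edges = [frozenset({0, 1, 2}), frozenset({2, 3}),
         frozenset({3, 4, 5}), frozenset({5})]
cover, left = greedy_partial_cover(6, edges, eps=0.2)
```

With eps = 0 this degenerates to the classical greedy set cover; the point of the paper is achieving comparable guarantees without storing the full edge set.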

    On Randomized Algorithms for Matching in the Online Preemptive Model

    We investigate the power of randomized algorithms for the maximum cardinality matching (MCM) and the maximum weight matching (MWM) problems in the online preemptive model. In this model, the edges of a graph are revealed one by one and the algorithm is required to always maintain a valid matching. On seeing an edge, the algorithm has to either accept or reject it. If accepted, the adjacent edges are discarded. The complexity of the problem is settled for deterministic algorithms, but almost nothing is known for randomized algorithms. A lower bound of 1.693 is known for MCM, with a trivial upper bound of 2. An upper bound of 5.356 is known for MWM. We initiate a systematic study of the problem in this paper, with the aim of isolating and understanding the difficulty. We begin with a primal-dual analysis of the deterministic algorithm due to McGregor. All deterministic lower bounds are on instances which are trees at every step. For this class of (unweighted) graphs, we present a randomized algorithm which is 28/15-competitive. The analysis is a considerable extension of the (simple) primal-dual analysis for the deterministic case. The key new technique is that the distribution of primal charge to dual variables depends on the "neighborhood" and needs to be done after having seen the entire input. The assignment is asymmetric, in that edges may assign different charges to the two endpoints. The proof also depends on a non-trivial structural statement about the performance of the algorithm on the input tree. The other main result of this paper is an extension of the deterministic lower bound of Varadaraja to a natural class of randomized algorithms that decide whether to accept a new edge using independent random choices.
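The deterministic preemptive rule analyzed in the abstract (due to McGregor) can be sketched as follows. The (1 + γ) replacement threshold is the standard form; the particular γ value below is illustrative only, not the constant that optimizes the competitive ratio.

```python
def online_preemptive_mwm(stream, gamma=0.5):
    """McGregor-style online preemptive matching sketch: accept a new edge
    only if its weight exceeds (1 + gamma) times the total weight of the
    currently matched edges it conflicts with; accepted edges evict their
    conflicting neighbors."""
    matching = {}  # endpoint vertex -> its currently matched edge (u, v, w)
    for u, v, w in stream:
        conflicts = {matching[x] for x in (u, v) if x in matching}
        if w > (1 + gamma) * sum(e[2] for e in conflicts):
            for cu, cv, cw in conflicts:
                matching.pop(cu, None)
                matching.pop(cv, None)
            matching[u] = matching[v] = (u, v, w)
    return set(matching.values())
```

On the path 0-1-2-3 with weights 1, 3, 1, the middle edge evicts the first and survives the third, illustrating why preemption helps against adversarial orders.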

    A Time-Space Tradeoff for Triangulations of Points in the Plane

    In this paper, we consider time-space trade-offs for reporting a triangulation of points in the plane. The goal is to minimize the amount of working space while keeping the total running time small. We present the first multi-pass algorithm for the problem that returns the edges of a triangulation with their adjacency information. This even improves the previously best known random-access algorithm.
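For intuition about what "reporting a triangulation" means, here is a minimal sketch for the easy special case of points in convex position, triangulated by fanning from one vertex. This toy ignores the paper's actual contribution (general point sets, multiple passes, sublinear working space, adjacency reporting); the names are hypothetical.

```python
def fan_triangulation(points):
    """Triangulate points given in convex position (in counterclockwise
    order) by fanning from the first vertex. Any triangulation of n points
    in convex position has exactly n - 2 triangles."""
    return [(points[0], points[i], points[i + 1])
            for i in range(1, len(points) - 1)]

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
triangles = fan_triangulation(square)
```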

    Develop and test fuel cell powered on-site integrated total energy system

    Test results are presented for a 24-cell, two-sq-ft (4 kW) stack. This stack is a precursor to a 25 kW stack that is a key milestone. Results are discussed in terms of cell performance, electrolyte management, thermal management, and reactant gas manifolding. The results obtained in preliminary testing of a 50 kW methanol processing subsystem are discussed. Subcontracting activities involving application analysis for fuel cell on-site integrated energy systems are updated.

    Linear Programming in the Semi-streaming Model with Application to the Maximum Matching Problem

    In this paper, we study linear programming based approaches to the maximum matching problem in the semi-streaming model. The semi-streaming model has gained attention as a model for processing massive graphs, as the importance of such graphs has increased. This is a model where edges are streamed in an adversarial order and we are allowed space proportional to the number of vertices in the graph. In recent years, there have been several new results in this semi-streaming model; however, broad techniques such as linear programming have not been adapted to it. We present several techniques to adapt and optimize linear programming based approaches in the semi-streaming model, with an application to the maximum matching problem. As a consequence, we improve (almost) all previous results on this problem, and also prove new results on interesting variants.
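For context, the classical non-LP baseline in this model is one-pass greedy maximal matching, which uses O(n) space and is a 2-approximation for maximum cardinality matching; LP-based techniques aim to beat this kind of guarantee. A minimal sketch:

```python
def one_pass_greedy_matching(edge_stream):
    """Single streaming pass: add an edge to the matching whenever both of
    its endpoints are still free. The result is a maximal matching, hence
    at least half the size of a maximum matching."""
    matched = set()   # vertices already used; O(n) space
    matching = []
    for u, v in edge_stream:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching
```

The adversarial order matters: on the path 0-1, 1-2, 2-3 streamed in that order greedy happens to be optimal, but streaming 1-2 first yields a matching half the optimal size.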

    Born-Regulated Gravity in Four Dimensions

    Previous work involving Born-regulated gravity theories in two dimensions is extended to four dimensions. The action we consider has two dimensionful parameters. Black hole solutions are studied for typical values of these parameters. For masses above a critical value determined in terms of these parameters, the event horizon persists. For masses below this critical value, the event horizon disappears, leaving a ``bare mass'', though of course no singularity. Comment: LaTeX, 15 pages, 2 figures

    Stochastic theory of log-periodic patterns

    We introduce an analytical model based on birth-death clustering processes to help understand the empirical log-periodic corrections to power-law scaling and the finite-time singularity reported in several domains, including rupture, earthquakes, world population, and financial systems. In our stochastic theory, log-periodicities are a consequence of transient clusters induced by an entropy-like term that may reflect the amount of cooperative information carried by the state of a large system of different species. The clustering completion rates for the system are assumed to be given by a simple linear death process. The singularity at t_0 is derived in terms of birth-death clustering coefficients. Comment: LaTeX, 1 ps figure - To appear in J. Phys. A: Math. Gen.
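The linear death process mentioned above can be simulated directly: with n clusters alive and per-cluster completion rate μ, the next completion arrives after an exponential waiting time with rate nμ, and on average n(t) decays like n₀ e^(−μt). A toy simulation, with all names hypothetical (the paper's full model also includes the birth and entropy-like terms, which are omitted here):

```python
import random

def simulate_linear_death(n0, mu, t_max, seed=0):
    """Gillespie-style simulation of a pure linear death process: while n
    clusters remain, draw the next completion time from Exp(n * mu).
    Returns the (time, population) trajectory up to t_max."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0:
        t += rng.expovariate(n * mu)
        if t > t_max:
            break
        n -= 1
        trajectory.append((t, n))
    return trajectory
```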

    The Vicious Cycle: Fundraising and Perceived Viability in U.S. Presidential Primaries

    Scholars of presidential primaries have long posited a dynamic positive feedback loop between fundraising and electoral success. Yet existing work on both directions of this feedback remains inconclusive and is often explicitly cross-sectional, ignoring the dynamic aspect of the hypothesis. Pairing high-frequency FEC data on contributions and expenditures with Iowa Electronic Markets data on perceived probability of victory, we examine the bidirectional feedback between contributions and viability. We find robust, significant positive feedback in both directions. This might suggest multiple equilibria: a candidate initially anointed as the front-runner could sustain that status solely through the fundraising advantage it confers, despite possessing no advantage in quality. However, simulations suggest the feedback loop cannot, by itself, sustain such an advantage. Given the observed durability of front-runners, it would thus seem that either some other feedback is at work and/or the process by which the initial front-runner is identified is informative of candidate quality.
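The "feedback alone cannot sustain advantage" finding has a simple mechanical analogue: if viability feeds contributions and contributions feed viability with combined gain below one, any initial advantage decays geometrically. The following is a hypothetical toy model, not the paper's actual simulation specification; all parameters are illustrative.

```python
def feedback_sim(v0, a=0.6, b=0.5, steps=20):
    """Toy two-equation feedback loop: contributions c_t = a * v_{t-1},
    viability v_t = b * c_t. With loop gain a * b < 1, an initial
    viability advantage v0 shrinks by the factor (a * b) each period."""
    v = v0
    path = [v]
    for _ in range(steps):
        c = a * v   # viability attracts contributions
        v = b * c   # contributions raise perceived viability
        path.append(v)
    return path
```

With a·b = 0.3, twenty periods reduce the initial advantage by roughly eleven orders of magnitude, so persistence of front-runners must come from somewhere outside the loop.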

    A Protocol for Generating Random Elements with their Probabilities

    We give an AM protocol that allows the verifier to sample elements x from a probability distribution P, which is held by the prover. If the prover is honest, the verifier outputs (x, P(x)) with probability close to P(x). In case the prover is dishonest, one may hope for the following guarantee: if the verifier outputs (x, p), then the probability that the verifier outputs x is close to p. Simple examples show that this cannot be achieved. Instead, we show that the following weaker condition holds (in a well-defined sense) on average: if (x, p) is output, then p is an upper bound on the probability that x is output. Our protocol yields a new transformation to turn interactive proofs where the verifier uses private random coins into proofs with public coins. The verifier has better running time compared to the well-known Goldwasser-Sipser transformation (STOC, 1986). For constant-round protocols, we only lose an arbitrarily small constant in soundness and completeness, while our public-coin verifier calls the private-coin verifier only once.
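The honest-prover guarantee is easy to state operationally: the verifier should emit the pair (x, P(x)) with probability P(x). A minimal sketch of that target behavior (not the AM protocol itself, which must also handle dishonest provers), with P given as a finite dictionary:

```python
import random

def honest_prover_sample(P, rng):
    """Sample x from the finite distribution P (element -> probability,
    summing to 1) by inverse-CDF walk, and return the pair (x, P(x)) --
    exactly the output the protocol achieves when the prover is honest."""
    r = rng.random()
    acc = 0.0
    for x, p in P.items():
        acc += p
        if r < acc:
            return (x, p)
    return (x, p)  # guard against floating-point slack in the CDF
```

Against a dishonest prover the paper shows the returned p can only be trusted, on average, as an upper bound on the true output probability of x.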

    A q-deformation of the Coulomb problem

    The algebra of observables of SO_q(3)-symmetric quantum mechanics is extended to include the inverse 1/R of the radial coordinate and used to obtain eigenvalues and eigenfunctions of a q-deformed Coulomb Hamiltonian.
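Spectra of q-deformed Hamiltonians are typically built from q-analogues of integers. The standard symmetric q-number below is illustrative background only; the abstract does not specify the exact form used in this paper's construction.

```python
def q_number(n, q):
    """Symmetric q-number [n]_q = (q**n - q**(-n)) / (q - q**(-1)).
    In the limit q -> 1 it reduces to the ordinary integer n, so
    q-deformed spectra recover their undeformed counterparts."""
    return (q**n - q**(-n)) / (q - q**(-1))
```

For example, replacing n² by [n]_q² in a hydrogen-like formula deforms the level spacing while leaving the q → 1 limit intact.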