
    Finding the Median (Obliviously) with Bounded Space

    We prove that any oblivious algorithm using space $S$ to find the median of a list of $n$ integers from $\{1,\ldots,2n\}$ requires time $\Omega(n \log\log_S n)$. This bound also applies to the problem of determining whether the median is odd or even. It is nearly optimal, since Chan, following Munro and Raman, has shown that there is a (randomized) selection algorithm using only $s$ registers, each of which can store an input value or an $O(\log n)$-bit counter, that makes only $O(\log\log_s n)$ passes over the input. The bound also implies a size lower bound for read-once branching programs computing the low-order bit of the median, and implies the analogue of $P \ne NP \cap coNP$ for length-$o(n \log\log n)$ oblivious branching programs.
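    The pass-bounded model above can be illustrated with a simplified sketch. The hypothetical routine below keeps only a candidate value range plus $s$ counters and shrinks the range by a factor of about $s$ per pass, giving $O(\log_s n)$ passes; this is weaker than the $O(\log\log_s n)$ bound of the cited randomized algorithm, but it shows how selection works with bounded space and sequential passes over the input.

```python
def select_multipass(stream_fn, n, k, s=4):
    """Find the k-th smallest (1-indexed) of n integers from {1,...,2n}
    using ~s counters per pass. Each pass shrinks the candidate value
    range by a factor of about s, so O(log_s n) passes suffice -- a
    simplified stand-in for the pass-efficient algorithms cited above.
    stream_fn() returns a fresh iterator over the input (one "pass")."""
    lo, hi = 1, 2 * n              # candidate value range for the answer
    rank = k                       # rank of the target within [lo, hi]
    while hi - lo + 1 > s:
        width = (hi - lo + 1 + s - 1) // s   # bucket width (ceil division)
        counts = [0] * s
        for x in stream_fn():                # one sequential pass
            if lo <= x <= hi:
                counts[(x - lo) // width] += 1
        for b, c in enumerate(counts):       # bucket holding the target rank
            if rank <= c:
                lo = lo + b * width
                hi = min(hi, lo + width - 1)
                break
            rank -= c
    # final pass: the range is tiny, so count each value exactly
    counts = [0] * (hi - lo + 1)
    for x in stream_fn():
        if lo <= x <= hi:
            counts[x - lo] += 1
    for v, c in enumerate(counts):
        if rank <= c:
            return lo + v
        rank -= c
```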

    The White-Box Adversarial Data Stream Model

    We study streaming algorithms in the white-box adversarial model, where the stream is chosen adaptively by an adversary who observes the entire internal state of the algorithm at each time step. We show that nontrivial algorithms are still possible. We first give a randomized algorithm for the $L_1$-heavy hitters problem that outperforms the optimal deterministic Misra-Gries algorithm on long streams. If the white-box adversary is computationally bounded, we use cryptographic techniques to reduce the memory of our $L_1$-heavy hitters algorithm even further and to design a number of additional algorithms for graph, string, and linear algebra problems. The existence of such algorithms is surprising, as the streaming algorithm does not even have a secret key in this model, i.e., its state is entirely known to the adversary. One algorithm we design estimates the number of distinct elements in a stream with insertions and deletions, achieving a multiplicative approximation in sublinear space; no deterministic algorithm can achieve this. We also give a general technique that translates any two-player deterministic communication lower bound into a lower bound for {\it randomized} algorithms robust to a white-box adversary. In particular, our results show that for all $p \ge 0$, there exists a constant $C_p > 1$ such that any $C_p$-approximation algorithm for $F_p$ moment estimation in insertion-only streams with a white-box adversary requires $\Omega(n)$ space for a universe of size $n$. Similarly, there is a constant $C > 1$ such that any $C$-approximation algorithm for matrix rank in an insertion-only stream requires $\Omega(n)$ space with a white-box adversary. Our algorithmic results based on cryptography thus show a separation between computationally bounded and unbounded adversaries. (Abstract shortened to meet arXiv limits.) Comment: PODS 202
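    The deterministic baseline mentioned above is the Misra-Gries summary; a minimal sketch of it, with its standard guarantee stated in the docstring:

```python
def misra_gries(stream, k):
    """Deterministic Misra-Gries summary with at most k-1 counters:
    every item occurring more than len(stream)/k times is guaranteed
    to survive, and each stored count underestimates the true count
    by at most len(stream)/k. This is the deterministic algorithm the
    white-box result above is compared against."""
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:                          # decrement-all step
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters
```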

    Source apportionment of carbonaceous chemical species to fossil fuel combustion, biomass burning and biogenic emissions by a coupled radiocarbon-levoglucosan marker method

    An intensive aerosol measurement and sample collection campaign was conducted in central Budapest over 2 weeks in a mild winter. The online instruments included an FDMS-TEOM, an RT-OC/EC analyser, a DMPS, gas pollutant analysers and meteorological sensors. The aerosol samples were collected on quartz fibre filters by a low-volume sampler using the tandem filter method. Elemental carbon (EC), organic carbon (OC), levoglucosan, mannosan, galactosan, arabitol and mannitol were determined, and radiocarbon analysis was performed on the aerosol samples. Median atmospheric concentrations of EC, OC and PM2.5 mass were 0.97, 4.9 and 25 µg m^-3, respectively. The EC and organic matter (1.6 × OC) accounted for 4.8% and 37%, respectively, of the PM2.5 mass. Fossil fuel (FF) combustion represented 36% of the total carbon (TC = EC + OC) in the PM2.5 size fraction. Biomass burning (BB) was a major source (40%) of the OC in the PM2.5 size fraction, and a substantial source (11%) for the PM10 mass. We propose and apply here a novel, straightforward, coupled radiocarbon-levoglucosan marker method for source apportionment of the major carbonaceous chemical species. The contributions of EC and OC from FF combustion (ECFF and OCFF) to the TC were 11.0% and 25%, respectively; EC and OC from BB (ECBB and OCBB) were responsible for 5.8% and 34%, respectively, of the TC, while OC from biogenic sources (OCBIO) made up 24% of the TC. The overall relative uncertainty of the OCBIO and OCBB contributions was assessed to be up to 30%, while the relative uncertainty for the other apportioned species is expected to be below 20%. Evaluation of the apportioned atmospheric concentrations revealed some of their important properties and relationships among them. ECFF and OCFF were associated with different FF combustion sources. Most ECFF was emitted by vehicular road traffic, while the contribution of non-vehicular sources, such as domestic and industrial heating or cooking using gas, oil or coal, to OCFF was substantial. The mean contribution of BB to EC particles was smaller by a factor of approximately 2 than that of road traffic. The main formation processes of OCFF, OCBB and OCBIO from volatile organic compounds were jointly influenced by a common factor, most likely atmospheric photochemistry, while primary organic emissions can also be important. Technological improvements and control measures for various BB appliances, together with efficient education and training of their users, in particular on the admissible fuel types, offer an important potential for improving the air quality in Budapest, and likely in other cities as well
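    The carbon balance behind the coupled radiocarbon-levoglucosan method can be sketched in a few lines. The levoglucosan-to-OCBB and levoglucosan-to-ECBB conversion factors below are illustrative literature-style placeholders, not the values fitted in the study above; the radiocarbon measurement enters only through the fossil fraction of TC.

```python
def apportion_tc(ec, oc, levoglucosan, f_fossil,
                 oc_bb_per_lev=7.35, ec_bb_per_lev=0.91):
    """Sketch of coupled radiocarbon-levoglucosan source apportionment.
    f_fossil: fossil fraction of total carbon, from 14C analysis.
    oc_bb_per_lev, ec_bb_per_lev: illustrative marker conversion
    factors (hypothetical here, not the study's fitted parameters).
    All concentrations must share one unit (e.g. ug m^-3)."""
    tc = ec + oc
    tc_ff = f_fossil * tc                   # fossil carbon, from radiocarbon
    oc_bb = oc_bb_per_lev * levoglucosan    # biomass-burning OC via marker
    ec_bb = ec_bb_per_lev * levoglucosan    # biomass-burning EC via marker
    ec_ff = ec - ec_bb                      # remaining EC assigned to FF
    oc_ff = tc_ff - ec_ff                   # fossil OC = fossil TC - fossil EC
    oc_bio = oc - oc_ff - oc_bb             # biogenic OC closes the balance
    return {"EC_FF": ec_ff, "OC_FF": oc_ff,
            "EC_BB": ec_bb, "OC_BB": oc_bb, "OC_BIO": oc_bio}
```

By construction the five apportioned species sum back to TC, which is the closure property the study's uncertainty assessment relies on.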

    The Number of Crossings in Multigraphs with No Empty Lens


    Picture-Hanging Puzzles

    We show how to hang a picture by wrapping rope around n nails, making a polynomial number of twists, such that the picture falls whenever any k out of the n nails get removed, and the picture remains hanging when fewer than k nails get removed. This construction makes for some fun mathematical magic performances. More generally, we characterize the Boolean functions describing when the picture falls, in terms of which nails get removed, as exactly the monotone Boolean functions. This construction requires an exponential number of twists in the worst case, but exponential complexity is almost always necessary for general functions. Comment: 18 pages, 8 figures, 11 puzzles. Journal version of FUN 2012 paper
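    Such puzzles are usually modeled as words in the free group on the nails: the picture falls exactly when the rope's word becomes trivial after deleting the removed nails' generators and freely reducing. A minimal sketch of that model, using the classic commutator solution to the 1-out-of-2 puzzle:

```python
def free_reduce(word):
    """Freely reduce a word in a free group; letters are nonzero ints,
    with the inverse of generator g written as -g."""
    out = []
    for g in word:
        if out and out[-1] == -g:
            out.pop()              # cancel adjacent g, g^-1
        else:
            out.append(g)
    return out

def remove_nail(word, i):
    """Pulling out nail i deletes every x_i and x_i^-1 from the rope
    word; the picture falls iff the result reduces to the identity."""
    return free_reduce([g for g in word if abs(g) != i])

# Commutator [x1, x2] = x1 x2 x1^-1 x2^-1: the rope hangs on two nails
# but falls when either single nail is removed (1-out-of-2 behaviour).
w = [1, 2, -1, -2]
```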

    The early evolution of the H-free process

    The H-free process, for some fixed graph H, is the random graph process defined by starting with an empty graph on n vertices and then adding edges one at a time, chosen uniformly at random subject to the constraint that no H subgraph is formed. Let G be the random maximal H-free graph obtained at the end of the process. When H is strictly 2-balanced, we show that for some c>0, with high probability as $n \to \infty$, the minimum degree in G is at least $cn^{1-(v_H-2)/(e_H-1)}(\log n)^{1/(e_H-1)}$. This gives new lower bounds for the Tur\'an numbers of certain bipartite graphs, such as the complete bipartite graphs $K_{r,r}$ with $r \ge 5$. When H is a complete graph $K_s$ with $s \ge 5$ we show that for some C>0, with high probability the independence number of G is at most $Cn^{2/(s+1)}(\log n)^{1-1/(e_H-1)}$. This gives new lower bounds for Ramsey numbers R(s,t) for fixed $s \ge 5$ and t large. We also obtain new bounds for the independence number of G for other graphs H, including the case when H is a cycle. Our proofs use the differential equations method for random graph processes to analyse the evolution of the process, and give further information about the structure of the graphs obtained, including asymptotic formulae for a broad class of subgraph extension variables. Comment: 36 pages
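    The process itself is easy to simulate. A standard equivalent formulation considers all vertex pairs in a uniformly random order and adds each edge unless it would create a copy of H; the sketch below does this for the simplest forbidden case H = K_3 (the triangle-free process).

```python
import random

def triangle_free_process(n, seed=0):
    """Simulate the K_3-free process: visit all vertex pairs in a
    uniformly random order and add each edge unless it would close a
    triangle. This permutation view is distributionally equivalent to
    repeatedly adding one uniformly random legal edge until the graph
    is maximal triangle-free."""
    rng = random.Random(seed)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    rng.shuffle(pairs)
    adj = [set() for _ in range(n)]
    edges = []
    for u, v in pairs:
        if adj[u] & adj[v]:        # a common neighbour would complete K_3
            continue
        adj[u].add(v); adj[v].add(u)
        edges.append((u, v))
    return edges, adj
```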

    Bounds for graph regularity and removal lemmas

    We show, for any positive integer k, that there exists a graph in which any equitable partition of its vertices into k parts has at least ck^2/\log^* k pairs of parts which are not \epsilon-regular, where c,\epsilon>0 are absolute constants. This bound is tight up to the constant c and addresses a question of Gowers on the number of irregular pairs in Szemer\'edi's regularity lemma. In order to gain some control over irregular pairs, another regularity lemma, known as the strong regularity lemma, was developed by Alon, Fischer, Krivelevich, and Szegedy. For this lemma, we prove a lower bound of wowzer-type, which is one level higher in the Ackermann hierarchy than the tower function, on the number of parts in the strong regularity lemma, essentially matching the upper bound. On the other hand, for the induced graph removal lemma, the standard application of the strong regularity lemma, we find a different proof which yields a tower-type bound. We also discuss bounds on several related regularity lemmas, including the weak regularity lemma of Frieze and Kannan and the recently established regular approximation theorem. In particular, we show that a weak partition with approximation parameter \epsilon may require as many as 2^{\Omega(\epsilon^{-2})} parts. This is tight up to the implied constant and solves a problem studied by Lov\'asz and Szegedy. Comment: 62 pages
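    The tower and wowzer functions from the Ackermann hierarchy referred to above can be pinned down in a few lines:

```python
def tower(k):
    """Tower function: tower(0) = 1, tower(k) = 2^tower(k-1),
    i.e. a tower of k twos."""
    t = 1
    for _ in range(k):
        t = 2 ** t
    return t

def wowzer(k):
    """Wowzer function, one level up in the Ackermann hierarchy:
    wowzer(0) = 1, wowzer(k) = tower(wowzer(k-1))."""
    w = 1
    for _ in range(k):
        w = tower(w)
    return w
```

Even wowzer(4) = tower(65536) is astronomically larger than any tower of fixed height, which is the gap between the strong regularity lemma's wowzer-type lower bound and the tower-type bound for the induced removal lemma.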

    Computational Indistinguishability between Quantum States and Its Cryptographic Application

    We introduce a computational problem of distinguishing between two specific quantum states as a new cryptographic problem to design a quantum cryptographic scheme that is "secure" against any polynomial-time quantum adversary. Our problem, QSCDff, is to distinguish between two types of random coset states with a hidden permutation over the symmetric group of finite degree. This naturally generalizes the commonly-used distinction problem between two probability distributions in computational cryptography. As our major contribution, we show that QSCDff has three properties of cryptographic interest: (i) QSCDff has a trapdoor; (ii) the average-case hardness of QSCDff coincides with its worst-case hardness; and (iii) QSCDff is computationally at least as hard as the graph automorphism problem in the worst case. These cryptographic properties enable us to construct a quantum public-key cryptosystem, which is likely to withstand any chosen plaintext attack of a polynomial-time quantum adversary. We further discuss a generalization of QSCDff, called QSCDcyc, and introduce a multi-bit encryption scheme that relies on similar cryptographic properties of QSCDcyc. Comment: 24 pages, 2 figures. We improved the presentation, and added more detailed proofs and follow-up of recent work
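    The classical hardness anchor here is the graph automorphism problem, for which no polynomial-time algorithm is known; for illustration only, a brute-force (factorial-time) check of whether a graph has any nontrivial automorphism:

```python
from itertools import permutations

def has_nontrivial_automorphism(n, edges):
    """Naive n!-time test for a nontrivial automorphism of a graph on
    vertices 0..n-1 -- the problem QSCDff is shown to be at least as
    hard as. A permutation p is an automorphism iff it maps edges to
    edges and non-edges to non-edges."""
    E = {frozenset(e) for e in edges}
    for p in permutations(range(n)):
        if p == tuple(range(n)):
            continue                       # skip the identity
        if all((frozenset((p[u], p[v])) in E) == (frozenset((u, v)) in E)
               for u in range(n) for v in range(u + 1, n)):
            return True
    return False
```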

    Graphs drawn with few crossings per edge

