Cryptography from Information Loss
© Marshall Ball, Elette Boyle, Akshay Degwekar, Apoorvaa Deshpande, Alon Rosen, Vinod Vaikuntanathan.

Reductions between problems, the mainstay of theoretical computer science, efficiently map an instance of one problem to an instance of another in such a way that solving the latter allows solving the former. The subject of this work is "lossy" reductions, where the reduction loses some information about the input instance. We show that such reductions, when they exist, have interesting and powerful consequences for lifting hardness into "useful" hardness, namely cryptography.

Our first, conceptual, contribution is a definition of lossy reductions in the language of mutual information. Roughly speaking, our definition says that a reduction C is t-lossy if, for any distribution X over its inputs, the mutual information satisfies I(X; C(X)) ≤ t. Our treatment generalizes a variety of seemingly related but distinct notions, such as worst-case to average-case reductions, randomized encodings (Ishai and Kushilevitz, FOCS 2000), homomorphic computations (Gentry, STOC 2009), and instance compression (Harnik and Naor, FOCS 2006).

We then proceed to show several consequences of lossy reductions:

1. We say that a language L has an f-reduction to a language L′ for a Boolean function f if there is a (randomized) polynomial-time algorithm C that takes an m-tuple of strings X = (x1, . . ., xm), with each xi ∈ {0, 1}^n, and outputs a string z such that, with high probability, L′(z) = f(L(x1), L(x2), . . ., L(xm)). Suppose a language L has an f-reduction C to L′ that is t-lossy.
Our first result is that one-way functions exist if L is worst-case hard and one of the following conditions holds:
- f is the OR function, t ≤ m/100, and L′ is the same as L;
- f is the Majority function, and t ≤ m/100;
- f is the OR function, t ≤ O(m log n), and the reduction has no error.

This improves on the implications that follow from combining (Drucker, FOCS 2012) with (Ostrovsky and Wigderson, ISTCS 1993), which yield only auxiliary-input one-way functions.

2. Our second result is about the stronger notion of t-compressing f-reductions, that is, reductions that output only t bits. We show that if there is an average-case hard language L that has a t-compressing Majority-reduction to some language for t = m/100, then there exist collision-resistant hash functions. This improves on the result of (Harnik and Naor, FOCS 2006), whose starting point is a cryptographic primitive (namely, one-way functions) rather than average-case hardness, and whose assumption is a compressing OR-reduction of SAT (which is now known to be false unless the polynomial hierarchy collapses).

Along the way, we define a non-standard one-sided notion of average-case hardness (the notion of hardness used in the second result above) that may be of independent interest.
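As a toy illustration of the mutual-information definition above (all names here are hypothetical, not from the paper): for a deterministic reduction C, I(X; C(X)) = H(C(X)), so the lossiness of a simple OR-style map can be computed exactly for a uniform input distribution.

```python
from collections import Counter
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_information(inputs, reduction):
    """I(X; C(X)) for X uniform over `inputs` and a deterministic reduction C.
    For deterministic C, I(X; C(X)) = H(C(X)) - H(C(X)|X) = H(C(X))."""
    n = len(inputs)
    out_counts = Counter(reduction(x) for x in inputs)
    return entropy({z: c / n for z, c in out_counts.items()})

# Toy "reduction": map an m-tuple of bits to their OR.
m = 4
inputs = list(product([0, 1], repeat=m))   # X uniform over {0,1}^m
C = lambda x: int(any(x))

t = mutual_information(inputs, C)
print(f"I(X; C(X)) = {t:.4f} bits")  # far below the m bits of input entropy
```

The point of the sketch is only that a reduction can reveal far fewer bits about its input than the input contains; the paper's reductions are of course over language membership, not raw bits.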
Limits to Non-Malleability
There have been many successes in constructing explicit non-malleable codes for various classes of tampering functions in recent years, and strong existential results are also known. In this work we ask the following question:
When can we rule out the existence of a non-malleable code for a tampering class F?
First, we start with some classes where positive results are well-known, and show that when these classes are extended in a natural way, non-malleable codes are no longer possible. Specifically, we show that no non-malleable codes exist for any of the following tampering classes:
- Functions that change d/2 symbols, where d is the distance of the code;
- Functions where each input symbol affects only a single output symbol;
- Functions where each of the n output bits is a function of n − log n input bits.
Furthermore, we rule out constructions of non-malleable codes for certain classes F via reductions, to the assumption that a distributional problem is hard for F, that make black-box use of the tampering functions in the proof. In particular, this yields concrete obstacles for the construction of efficient codes for NC, even assuming average-case variants of P ⊄ NC.
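The first impossibility above has a simple intuition: a code of distance d cannot resist tampering that changes on the order of d/2 symbols, because such tampering can steer decoding in a message-dependent way. A minimal sketch with a toy 3-repetition code (purely illustrative; names are hypothetical and this is not a non-malleable code construction):

```python
# Toy 3-repetition code over bits (distance d = 3).
def enc(m):
    return [m] * 3

def dec(c):
    return int(sum(c) >= 2)  # majority decoding

def tamper_experiment(message, tamper):
    """The standard tampering experiment: Dec(f(Enc(m))).
    Non-malleability asks that its outcome be simulatable without
    knowledge of `message` (up to outputting a special `same` symbol)."""
    return dec(tamper(enc(message)))

# A tampering function that changes 2 symbols (roughly d/2 of them):
# enough to flip majority decoding, so the outcome tracks the message.
flip_two = lambda c: [1 - c[0], 1 - c[1], c[2]]

print(tamper_experiment(0, flip_two))  # 1
print(tamper_experiment(1, flip_two))  # 0
```

Here the experiment's outcome is exactly the negation of the encoded message, a canonically malleable behavior that no simulator ignorant of the message can reproduce.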
Spartan Daily, March 19, 1975
Volume 64, Issue 27
The Gremlin Graph Traversal Machine and Language
Gremlin is a graph traversal machine and language designed, developed, and distributed by the Apache TinkerPop project. Gremlin, as a graph traversal machine, is composed of three interacting components: a graph, a traversal, and a set of traversers. The traversers move about the graph according to the instructions specified in the traversal, where the result of the computation is the ultimate locations of all halted traversers. A Gremlin machine can be executed over any supporting graph computing system such as an OLTP graph database and/or an OLAP graph processor. Gremlin, as a graph traversal language, is a functional language implemented in the user's native programming language and is used to define the traversal of a Gremlin machine. This article provides a mathematical description of Gremlin and details its automaton and functional properties. These properties enable Gremlin to naturally support imperative and declarative querying, host language agnosticism, user-defined domain specific languages, an extensible compiler/optimizer, single- and multi-machine execution models, hybrid depth- and breadth-first evaluation, as well as the existence of a Universal Gremlin Machine and its respective entailments.

Comment: To appear in the Proceedings of the 2015 ACM Database Programming Languages Conference
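The three-component machine described above (graph, traversal, traversers) can be caricatured in a few lines. This is a loose sketch with invented names, not the TinkerPop API: each step maps a traverser's location to zero or more new locations, and evaluation threads the set of traversers through the steps until the traversal is exhausted.

```python
# Hypothetical miniature of the traverser machine; not TinkerPop code.
graph = {
    "a": ["b", "c"],   # adjacency: vertex -> out-neighbors
    "b": ["c"],
    "c": [],
}

def out():
    """Step: move each traverser to every out-neighbor (flat-map semantics)."""
    return lambda loc: graph.get(loc, [])

def has(pred):
    """Step: keep a traverser only if its location satisfies `pred`."""
    return lambda loc: [loc] if pred(loc) else []

def evaluate(start, traversal):
    """Run the traversers through the traversal; the result of the
    computation is the locations of all halted traversers."""
    traversers = [start]
    for step in traversal:
        traversers = [nxt for loc in traversers for nxt in step(loc)]
    return traversers

# "From a, take two out-hops": loosely analogous to g.V("a").out().out()
print(evaluate("a", [out(), out()]))  # ['c']
```

The flat-map shape of each step is what lets the same machine evaluate depth-first, breadth-first, or in bulk across machines; here the list comprehension fixes a breadth-first order for simplicity.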
School Finance Systems and Their Responsiveness to Performance Pressures: A Case Study of North Carolina
Details the mechanisms of and influences on the state's school finance system, changes caused by increased performance pressures, local officials' views on alternative allocation of resources, and obstacles to linking resources to student learning.