
    An Algorithmic Framework for the Generalized Birthday Problem

    Get PDF
    The generalized birthday problem (GBP) was introduced by Wagner in 2002 and has been shown to have many applications in cryptanalysis. In its typical variant, we are given access to a function $H:\{0,1\}^{\ell} \rightarrow \{0,1\}^n$ (whose specification depends on the underlying problem) and an integer $K>0$. The goal is to find $K$ distinct inputs to $H$ (denoted by $\{x_i\}_{i=1}^{K}$) such that $\sum_{i=1}^{K} H(x_i) = 0$. Wagner's K-tree algorithm solves the problem in time and memory complexities of about $N^{1/(\lfloor \log K \rfloor + 1)}$ (where $N = 2^n$). Two important open problems raised by Wagner were (1) devise efficient time-memory tradeoffs for GBP, and (2) reduce the complexity of the K-tree algorithm for $K$ which is not a power of 2. In this paper, we make progress in both directions. First, we improve the best known GBP time-memory tradeoff curve (published independently by Nikolić and Sasaki and also by Biryukov and Khovratovich) for all $K \geq 8$ from $T^2 M^{\lfloor \log K \rfloor - 1} = N$ to $T^{\lceil (\log K)/2 \rceil + 1} M^{\lfloor (\log K)/2 \rfloor} = N$, applicable for a large range of parameters. For example, for $K = 8$ we improve the best previous tradeoff from $T^2 M^2 = N$ to $T^3 M = N$, and for $K = 32$ the improvement is from $T^2 M^4 = N$ to $T^4 M^2 = N$. Next, we consider values of $K$ which are not powers of 2 and show that in many cases even more efficient time-memory tradeoff curves can be obtained. Most interestingly, for $K \in \{6,7,14,15\}$ we present algorithms with the same time complexities as the K-tree algorithm, but with significantly reduced memory complexities. In particular, for $K = 6$ the K-tree algorithm achieves $T = M = N^{1/3}$, whereas we obtain $T = N^{1/3}$ and $M = N^{1/6}$. For $K = 14$, Wagner's algorithm achieves $T = M = N^{1/4}$, while we obtain $T = N^{1/4}$ and $M = N^{1/8}$. This gives the first significant improvement over the K-tree algorithm for small $K$. Finally, we optimize our techniques for several concrete GBP instances and show how to solve some of them with improved time and memory complexities compared to the state-of-the-art. Our results are obtained using a framework that combines several algorithmic techniques, such as variants of the Schroeppel-Shamir algorithm for solving knapsack problems (devised in works by Howgrave-Graham and Joux and by Becker, Coron and Joux) and dissection algorithms (published by Dinur, Dunkelman, Keller and Shamir). It then builds on these techniques to develop new GBP algorithms.
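
    To make the K-tree idea concrete, below is a minimal sketch of Wagner's algorithm for $K = 4$ over $\mathrm{GF}(2)^n$ (XOR as the group operation). The toy function H, the parameter n, and the list sizes are illustrative assumptions, not values from the paper.

```python
import random

# Minimal sketch of Wagner's K-tree algorithm for K = 4, i.e. find
# distinct x1..x4 with H(x1) ^ H(x2) ^ H(x3) ^ H(x4) == 0.
# The toy H, n, and list sizes below are illustrative assumptions.

n = 24                          # output bits, N = 2^n
L = 1 << (n // 3 + 1)           # ~N^{1/3} list size (padded a little so
                                # the toy run succeeds with good probability)
random.seed(1)
_cache = {}
def H(x):
    # Stand-in for the problem-specific function H: {0,1}^l -> {0,1}^n.
    if x not in _cache:
        _cache[x] = random.getrandbits(n)
    return _cache[x]

def merge(A, B, mask):
    """Join two lists on the bits selected by mask, keeping the XOR of
    the hash values and the concatenated input lists."""
    index = {}
    for v, xs in A:
        index.setdefault(v & mask, []).append((v, xs))
    return [(u ^ v, ys + xs)
            for v, xs in B
            for u, ys in index.get(v & mask, [])]

# Four lists of (H(x), [x]) pairs over disjoint input ranges.
lists = [[(H(x), [x]) for x in range(i * L, (i + 1) * L)] for i in range(4)]

low = (1 << (n // 3)) - 1       # level 1: zero the low n/3 bits
L12 = merge(lists[0], lists[1], low)
L34 = merge(lists[2], lists[3], low)
full = (1 << n) - 1             # level 2: zero the remaining bits
solutions = [xs for v, xs in merge(L12, L34, full) if v == 0]
print(solutions[0] if solutions else "no solution in this toy run")
```

    The two-level merge cancels roughly $n/3$ bits per level, which is where the $N^{1/(\lfloor \log K \rfloor + 1)} = N^{1/3}$ complexity for $K = 4$ comes from.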

    Fifty years of Hoare's Logic

    Get PDF
    We present a history of Hoare's logic. (Comment: 79 pages. To appear in Formal Aspects of Computing.)

    A Stochastic Complexity Perspective of Induction in Economics and Inference in Dynamics

    Get PDF
    Rissanen's fertile and pioneering minimum description length principle (MDL) has been viewed from the standpoints of statistical estimation theory and information theory, as stochastic complexity theory (i.e., a computable approximation to Kolmogorov complexity), as Solomonoff's recursion-theoretic induction principle, or as analogous to Kolmogorov's sufficient statistics. All these interpretations, and many more, are valid, interesting, and fertile. In this paper I view it from two points of view: those of an algorithmic economist and of a dynamical systems theorist. From these points of view I suggest, first, a recasting of Jevons's sceptical vision of induction in the light of MDL, and, second, a complexity interpretation of an undecidable question in dynamics.

    Ternary Syndrome Decoding with Large Weight

    Get PDF
    The Syndrome Decoding problem is at the core of many code-based cryptosystems. In this paper, we study ternary Syndrome Decoding with large weight. This problem was introduced in the Wave signature scheme but has never been thoroughly studied. We perform an algorithmic study of this problem, which results in an update of the Wave parameters. On a more fundamental level, we show that ternary Syndrome Decoding with large weight is a significantly harder problem than binary Syndrome Decoding, which could have several applications for the design of code-based cryptosystems.
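
    For concreteness, the underlying problem can be stated as: given a parity-check matrix H over F_3 and a syndrome s, find an error vector e of Hamming weight at least w with He = s. Below is a toy brute-force sketch; the instance is a hand-made assumption, with s chosen so a solution exists.

```python
import itertools

# Toy sketch of ternary syndrome decoding with large weight: given a
# parity-check matrix H over F_3 and a syndrome s, find an error
# vector e of Hamming weight >= w with H e = s (mod 3).  The instance
# below is a hand-made assumption; s was picked so that a weight-5
# solution exists.  Real cryptographic sizes are of course far larger.

n, w = 7, 5                   # code length, weight threshold
H = [[1, 2, 0, 1, 1, 0, 2],   # (n-k) x n parity-check matrix over F_3
     [0, 1, 1, 2, 0, 1, 1],
     [2, 0, 1, 0, 1, 1, 0],
     [1, 1, 2, 1, 0, 2, 1]]
s = [2, 1, 1, 2]              # syndrome of e = (1,1,1,1,1,0,0)

def syndrome(e):
    return [sum(h * x for h, x in zip(row, e)) % 3 for row in H]

# Exhaustive search over F_3^n -- exponential, viable only at toy size.
for e in itertools.product((0, 1, 2), repeat=n):
    if sum(x != 0 for x in e) >= w and syndrome(e) == s:
        print("solution:", e)
        break
```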

    Linear Time Subgraph Counting, Graph Degeneracy, and the Chasm at Size Six

    Get PDF
    We consider the problem of counting all k-vertex subgraphs in an input graph, for any constant k. This problem (denoted SUB-CNT_k) has been studied extensively in both theory and practice. In a classic result, Chiba and Nishizeki (SICOMP 85) gave linear time algorithms for clique and 4-cycle counting in bounded degeneracy graphs. This is a rich class of sparse graphs that contains, for example, all minor-free families and preferential attachment graphs. The techniques from this result have inspired a number of recent practical algorithms for SUB-CNT_k. Towards a better understanding of the limits of these techniques, we ask: for what values of k can SUB-CNT_k be solved in linear time? We discover a chasm at k = 6. Specifically, we prove that for k < 6, SUB-CNT_k can be solved in linear time. Assuming a standard conjecture in fine-grained complexity, we prove that for all k ≥ 6, SUB-CNT_k cannot be solved even in near-linear time.
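
    As an illustration of the degeneracy-ordering technique the abstract refers to, here is a sketch of triangle (k = 3) counting: orient each edge toward the higher-ranked endpoint in a degeneracy ordering, so every vertex has few out-neighbors, then count adjacent out-neighbor pairs. The toy graph is an assumption, and the min-based ordering is kept simple instead of the linear-time bucket-queue version.

```python
from collections import defaultdict

# Sketch of Chiba--Nishizeki-style triangle counting: each triangle is
# counted exactly once, at its lowest-ranked vertex in the degeneracy
# ordering.  Runtime is O(m * degeneracy) with a proper bucket queue;
# the min-based ordering below is quadratic but keeps the sketch short.

def degeneracy_order(adj):
    """Repeatedly remove a minimum-degree vertex; return the removal order."""
    deg = {v: len(ns) for v, ns in adj.items()}
    order = []
    while deg:
        v = min(deg, key=deg.get)
        order.append(v)
        del deg[v]
        for u in adj[v]:
            if u in deg:
                deg[u] -= 1
    return order

def count_triangles(adj):
    rank = {v: i for i, v in enumerate(degeneracy_order(adj))}
    # Out-neighbors: edges oriented toward higher rank (few per vertex
    # in a bounded-degeneracy graph).
    out = {v: {u for u in adj[v] if rank[u] > rank[v]} for v in adj}
    return sum(1 for v in adj for u in out[v] for w in out[u] if w in out[v])

# Toy graph: two triangles sharing the edge (1, 2).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)
print(count_triangles(dict(adj)))   # -> 2
```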

    Guarantees and Limits of Preprocessing in Constraint Satisfaction and Reasoning

    Full text link
    We present a first theoretical analysis of the power of polynomial-time preprocessing for important combinatorial problems from various areas in AI. We consider problems from Constraint Satisfaction, Global Constraints, Satisfiability, and Nonmonotonic and Bayesian Reasoning under structural restrictions. All these problems involve two tasks: (i) identifying the structure in the input as required by the restriction, and (ii) using the identified structure to solve the reasoning task efficiently. We show that for most of the considered problems, task (i) admits a polynomial-time preprocessing to a problem kernel whose size is polynomial in a structural problem parameter of the input, in contrast to task (ii), which does not admit such a reduction to a problem kernel of polynomial size, subject to a complexity-theoretic assumption. As a notable exception, we show that the consistency problem for the AtMost-NValue constraint admits a polynomial kernel consisting of a quadratic number of variables and domain values. Our results provide firm worst-case guarantees and theoretical boundaries for the performance of polynomial-time preprocessing algorithms for the considered problems. (Comment: arXiv admin note: substantial text overlap with arXiv:1104.2541, arXiv:1104.556)
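
    As a concrete example of what a polynomial problem kernel looks like, here is a sketch of the classic Buss kernelization for k-Vertex-Cover (a standard textbook example, not a construction from this paper): any vertex of degree greater than k must be in every cover of size at most k, and once no such vertex remains, a yes-instance can have at most k^2 edges.

```python
# Sketch of a polynomial kernel: Buss's rule for k-Vertex-Cover.
# Not from the paper above; a standard illustration of kernelization.

def buss_kernel(edges, k):
    """Return (reduced_edges, k_remaining), or None if provably a no-instance."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:                      # v is forced into the cover
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:        # too many edges left: no-instance
        return None
    return edges, k

# A star with 5 leaves plus one extra edge, parameter k = 2:
print(buss_kernel([(0, 1), (0, 2), (0, 3), (0, 4), (0, 5), (1, 2)], 2))
```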

    Satisfiability, sequence niches, and molecular codes in cellular signaling

    Full text link
    Biological information processing, as implemented by regulatory and signaling networks in living cells, requires sufficient specificity of molecular interaction to distinguish signals from one another, but much of regulation and signaling involves somewhat fuzzy and promiscuous recognition of molecular sequences and structures, which can leave systems vulnerable to crosstalk. This paper examines a simple computational model of protein-protein interactions that reveals both a sharp onset of crosstalk and a fragmentation of the neutral network of viable solutions as more proteins compete for regions of sequence space, exposing intrinsic limits to reliable signaling in the face of promiscuity. These results suggest connections both to phase transitions in constraint satisfaction problems and to coding-theory bounds on the size of communication codes.