
    Phase Transition Behavior of Cardinality and XOR Constraints

    The runtime performance of modern SAT solvers is deeply connected to the phase transition behavior of CNF formulas. While CNF solving has witnessed significant runtime improvements over the past two decades, the same does not hold for several other classes, such as the conjunction of cardinality and XOR constraints, denoted as CARD-XOR formulas. Determining the satisfiability of CARD-XOR formulas is a fundamental problem with a wide variety of applications, ranging from discrete integration in artificial intelligence to maximum likelihood decoding in coding theory. The runtime behavior of random CARD-XOR formulas is unexplored in prior work. In this paper, we present the first rigorous empirical study to characterize the runtime behavior of 1-CARD-XOR formulas. We show empirical evidence of a surprising phase transition that follows a non-linear tradeoff between CARD and XOR constraints.
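As a rough illustration of the instance class studied here, one can generate a random formula consisting of a single cardinality constraint together with random XOR constraints and decide it by brute force. The constraint shapes below (an at-most-k cardinality bound and random-parity XORs) are illustrative assumptions, not the paper's exact generation model.

```python
import itertools
import random

def random_1card_xor(n, m, k, rng=random):
    """A hypothetical random 1-CARD-XOR instance: one cardinality
    constraint (at most k variables true) plus m random XOR
    constraints, each over a random subset of the n variables."""
    xors = []
    for _ in range(m):
        vs = rng.sample(range(n), rng.randint(1, n))
        parity = rng.randint(0, 1)
        xors.append((vs, parity))
    return k, xors

def satisfiable(n, k, xors):
    # Brute force over all 2^n assignments (illustration only;
    # real solvers of course do far better on CNF).
    for bits in itertools.product((0, 1), repeat=n):
        if sum(bits) <= k and all(
                sum(bits[v] for v in vs) % 2 == parity
                for vs, parity in xors):
            return True
    return False
```

Sweeping the number of XOR constraints against the cardinality bound on such instances is the kind of experiment a phase-transition study performs, with runtime measured instead of a brute-force check.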

    Efficiently Supporting Hierarchy and Data Updates in DNA Storage

    We propose a novel and flexible DNA-storage architecture that provides a notion of hierarchy among the objects tagged with the same primer pair and enables efficient data updates. In contrast to prior work, in our architecture a pair of PCR primers of length 20 does not define a single object but an independent storage partition, which is internally managed in an independent way with its own index structure. We make the observation that, while the number of mutually compatible primer pairs is limited, the internal address space available to any pair of primers (i.e., partition) is virtually unlimited. We expose and leverage the flexibility with which this address space can be managed to provide rich and functional storage semantics, such as hierarchical data organization and efficient and flexible implementations of data updates. Furthermore, to leverage the full power of the prefix-based nature of PCR addressing, we define a methodology for transforming an arbitrary indexing scheme into a PCR-compatible equivalent. This allows us to run PCR with primers that can be variably extended to include a desired part of the index, and thus narrow down the scope of the reaction to retrieve a specific object (e.g., a file or directory) within the partition with high precision. Our wetlab evaluation demonstrates the practicality of the proposed ideas and shows a 140x reduction in the sequencing cost of retrieving smaller objects within the partition.
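The prefix-addressing idea can be sketched in a few lines: if every component of an object's index is encoded as a fixed-width block over the DNA alphabet, then a directory's address is a strict prefix of the address of everything beneath it, and "extending the primer" corresponds to string-prefix selection. The encoding below is a hypothetical illustration, not the paper's actual scheme.

```python
def encode_address(path_components, width=4, alphabet="ACGT"):
    """Hypothetical prefix-code address: each integer path component
    maps to a fixed-width block (so components must fit in
    len(alphabet)**width values). A parent path's address is then a
    prefix of every child's address, mirroring extendable PCR primers."""
    def block(i):
        digits = []
        for _ in range(width):
            i, r = divmod(i, len(alphabet))
            digits.append(alphabet[r])
        return "".join(digits)
    return "".join(block(c) for c in path_components)

def select(addresses, prefix):
    # Narrow the "reaction" to strands whose address extends the prefix,
    # analogous to running PCR with a variably extended primer.
    return [a for a in addresses if a.startswith(prefix)]
```

Selecting with the address of a directory then returns exactly the objects stored beneath it, which is the retrieval semantics the abstract describes.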

    Tolerant Testing of High-Dimensional Samplers with Subcube Conditioning

    We study the tolerant testing problem for high-dimensional samplers. Given as input two samplers P and Q over the n-dimensional space {0,1}^n, and two parameters ε_2 > ε_1, the goal of tolerant testing is to test whether the distributions generated by P and Q are ε_1-close or ε_2-far. Since exponential lower bounds (in n) are known for the problem in the standard sampling model, research has focused on models where one can draw conditional samples. Among these models, subcube conditioning (SUBCOND), which allows conditioning on arbitrary subcubes of the domain, holds the promise of widespread adoption in practice owing to its ability to capture the natural behavior of samplers in constrained domains. To translate the promise into practice, we need to overcome two crucial roadblocks for tests based on SUBCOND: the prohibitively large number of queries (O~(n^5/ε_2^5)) and the limitation to non-tolerant testing (i.e., ε_1 = 0). The primary contribution of this work is to overcome the above challenges: we design a new tolerant testing methodology (i.e., ε_1 ≥ 0) that allows us to significantly improve the upper bound to O~(n^3/(ε_2 − ε_1)^5).
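A subcube-conditional query fixes some coordinates and draws samples constrained to agree with them. The sketch below emulates SUBCOND access on top of an ordinary sampler via rejection sampling; this is purely illustrative (and hopeless when the subcube has tiny probability mass), since real SUBCOND samplers condition natively.

```python
import random

def subcube_sample(sampler, fixed, max_tries=10000):
    """Emulate one SUBCOND query by rejection: draw from `sampler`
    until the coordinates in `fixed` (index -> forced bit) all match.
    Illustration only; the query cost here bears no relation to the
    query bounds discussed in the abstract."""
    for _ in range(max_tries):
        x = sampler()
        if all(x[i] == b for i, b in fixed.items()):
            return x
    raise RuntimeError("subcube appears to have negligible mass")

# Example: a uniform sampler over {0,1}^4, conditioned on x0 = 1, x3 = 0.
def uniform4():
    return [random.randint(0, 1) for _ in range(4)]
```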

    Managing reliability skew in DNA storage

    DOI 10.1145/3470496.3527441, ISCA '22: The 49th Annual International Symposium on Computer Architecture

    Proceedings of the 49th Annual International Symposium on Computer Architecture

    DOI 10.1145/3470496, ISCA '22: The 49th Annual International Symposium on Computer Architecture

    Testing Self-Reducible Samplers

    Samplers are the backbone of implementations of any randomised algorithm. Unfortunately, efficient algorithms for testing the correctness of samplers are very hard to find. Recently, in a series of works, testers such as Barbarik, Teq, and Flash were obtained for particular kinds of samplers, like CNF-samplers and Horn-samplers. But their techniques have a significant limitation: one cannot expect to use their methods to test other samplers, such as perfect matching samplers or samplers for sampling linear extensions in posets. In this paper, we present a new testing algorithm that works for such samplers and can estimate the distance of a new sampler from a known sampler (say, the uniform sampler). Testing the identity of distributions is at the heart of testing the correctness of samplers. This paper's main technical contribution is a new distance estimation algorithm for distributions over high-dimensional cubes using the recently proposed subcube conditioning sampling model. Given subcube conditioning access to an unknown distribution P, and a known distribution Q defined over {0,1}^n, our algorithm CubeProbeEst estimates the variation distance between P and Q within additive error ζ using O(n^2/ζ^4) subcube conditional samples from P. Following the testing-via-learning paradigm, we also get a tester which distinguishes between the cases when P and Q are ε-close or η-far in variation distance with probability at least 0.99 using O(n^2/(η−ε)^4) subcube conditional samples. The estimation algorithm in the subcube conditioning sampling model helps us design the first tester for self-reducible samplers. The correctness of the testers is formally proved. We also implement CubeProbeEst and use it to test the quality of three samplers for sampling linear extensions in posets.
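For intuition about the quantity CubeProbeEst estimates, here is the textbook plug-in estimator of variation distance from plain samples. This is not the paper's algorithm; the point of CubeProbeEst is precisely to use subcube conditioning to avoid the sample cost this naive approach incurs on high-dimensional domains.

```python
from collections import Counter

def tv_estimate(samples_p, q_prob):
    """Naive plug-in estimate of total variation distance between an
    unknown P (given only by samples) and a known Q (given by its pmf
    q_prob). TV(P,Q) = sum over x of max(P(x) - Q(x), 0); the empirical
    version sums only over observed points, so it underestimates when
    P is spread over far more points than there are samples."""
    counts = Counter(samples_p)
    m = len(samples_p)
    return sum(max(c / m - q_prob(x), 0.0) for x, c in counts.items())
```

For example, with samples giving empirical probabilities (0.75, 0.25) against a uniform Q over two points, the estimate is 0.25.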
