
    Uniqueness of Gibbs states of a quantum system on graphs

    Gibbs states of an infinite system of interacting quantum particles are considered. Each particle moves on a compact Riemannian manifold and is attached to a vertex of a graph (one particle per vertex). Two kinds of graphs are studied: (a) a general graph of locally finite degree; (b) a graph of globally bounded degree. In case (a), uniqueness of the Gibbs state is shown under the condition that the interaction potentials are uniformly bounded by a sufficiently small constant. In case (b), the interaction potentials are random, and under a certain condition on their probability distribution, almost sure uniqueness of the Gibbs state is shown.
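    Schematically, the small-coupling condition in case (a) can be written as follows; the symbols $W_{vw}$ (the pair potential between neighbouring vertices $v$ and $w$) and the threshold $J_*$ are illustrative assumptions, not the paper's actual notation.

        \[
        \sup_{\{v,w\} \in E} \| W_{vw} \|_\infty \;\le\; J_* ,
        \]

    where $E$ is the edge set of the graph and $J_*$ is a sufficiently small constant.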

    Proof of Space from Stacked Expanders

    Recently, proof of space (PoS) has been suggested as a more egalitarian alternative to the traditional hash-based proof of work. In PoS, a prover proves to a verifier that it has dedicated some specified amount of space. A closely related notion is that of memory-hard functions (MHF), functions that require a large amount of memory/space to compute. Despite promising progress, existing PoS and MHF constructions have several problems. First, there are large gaps between the desired space-hardness and what can be proven. Second, it has been pointed out that PoS and MHF should require a lot of space not just at some point, but throughout the entire computation/protocol; few proposals have considered this issue. Third, the two existing PoS constructions are both based on a class of graphs called superconcentrators, which are either hard to construct or add a logarithmic-factor overhead in efficiency. In this paper, we construct PoS from stacked expander graphs. Our constructions are simpler, more efficient, and have tighter provable space-hardness than prior works. Our results also apply to a recent MHF called Balloon hash. We show that Balloon hash has tighter space-hardness than previously believed, and consistent space-hardness throughout its computation.
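    A minimal sketch of the layered-graph labeling idea underlying such constructions, assuming SHA-256 as the hash and pseudorandom predecessor edges as a stand-in for actual expander edges (the function name and parameters are hypothetical, not the paper's construction):

        import hashlib
        import random

        def stacked_labeling(seed: bytes, n: int = 8, layers: int = 4, degree: int = 3):
            # Each node in layer l is labeled with a hash of `degree`
            # predecessor labels from layer l-1, so an honest evaluator
            # must hold an entire layer of labels in memory at once.
            rng = random.Random(seed)  # edge structure is fixed by the seed
            # Layer 0: labels derived directly from the seed.
            layer = [hashlib.sha256(seed + bytes([i])).digest() for i in range(n)]
            for l in range(1, layers):
                nxt = []
                for i in range(n):
                    preds = rng.sample(range(n), degree)  # stand-in for expander edges
                    h = hashlib.sha256(bytes([l, i]))
                    for p in preds:
                        h.update(layer[p])
                    nxt.append(h.digest())
                layer = nxt
            return layer  # top-layer labels are what the prover stores

        labels = stacked_labeling(b"example-seed")
        print(labels[0].hex())

    A real construction would derive the edges from an explicit expander and add a Merkle-tree commitment plus random spot-checks so the verifier can audit the claimed space; the sketch only shows why honest evaluation is layer-by-layer memory-bound.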

    New approaches to reduced-complexity decoding

    We examine new approaches to the problem of decoding general linear codes under the strategies of full or bounded hard decoding and bounded soft decoding. The objective is to derive enhanced new algorithms that take advantage of the major features of existing algorithms to reduce decoding complexity. We derive a wide range of results on the complexity of many existing algorithms. We suggest a new algorithm for cyclic codes, and show how it exploits all the main features of the existing algorithms. Finally, we propose a new approach to the problem of bounded soft decoding, and show that its asymptotic complexity is significantly lower than that of any other currently known general algorithm. In addition, we give a characterization of the weight distribution of the average linear code and thus show that the Gilbert-Varshamov bound is tight for virtually all linear codes over any symbol field.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/29034/1/0000066.pd
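    For context, the asymptotic Gilbert-Varshamov bound referenced above says that over a $q$-ary alphabet there exist codes of relative minimum distance $\delta \in (0, 1 - 1/q)$ and rate

        \[
        R \;\ge\; 1 - H_q(\delta),
        \qquad
        H_q(\delta) = \delta \log_q (q-1) - \delta \log_q \delta - (1-\delta) \log_q (1-\delta),
        \]

    where $H_q$ is the $q$-ary entropy function; the abstract's claim is that average (random) linear codes meet this bound, so it is tight for virtually all of them.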

    A note on completely regular codes


    Nonbinary codes correcting localized errors

    Ahlswede R, Bassalygo LA, Pinsker MS. Nonbinary codes correcting localized errors. IEEE Transactions on Information Theory. 1993;39(4):1413-1416.