
    On the Decoding Complexity of Cyclic Codes Up to the BCH Bound

    The standard algebraic decoding algorithm for cyclic codes $[n,k,d]$ up to the BCH bound $t$ is very efficient and practical for relatively small $n$, but it becomes impractical for large $n$, as its computational complexity is $O(nt)$. The aim of this paper is to make this algebraic decoding computationally more efficient: in the case of binary codes, for example, the complexity of the syndrome computation drops from $O(nt)$ to $O(t\sqrt{n})$, and that of the error location from $O(nt)$ to at most $\max\{O(t\sqrt{n}),\, O(t^2\log(t)\log(n))\}$. Comment: accepted for publication in Proceedings ISIT 2011, IEEE copyright.
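
    For context, here is a minimal sketch of the classical $O(nt)$ syndrome computation that the paper sets out to speed up; it is not the paper's improved method. The choice of the (15, 7) binary BCH code with $t = 2$ over GF(16), built from the primitive polynomial $x^4 + x + 1$, is an illustrative assumption.

```python
# Classical O(n*t) syndrome computation for a binary BCH/cyclic code,
# illustrated on the (15, 7) code with designed distance 5 (t = 2) over GF(16).

# GF(16) exp/log tables from the primitive polynomial x^4 + x + 1.
def gf16_tables():
    exp = [0] * 30
    log = [0] * 16
    x = 1
    for i in range(15):
        exp[i] = x
        log[x] = i
        x <<= 1
        if x & 0x10:          # reduce modulo x^4 + x + 1
            x ^= 0x13
    for i in range(15, 30):   # duplicate to absorb index wrap-around
        exp[i] = exp[i - 15]
    return exp, log

EXP, LOG = gf16_tables()

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def syndromes(received_bits, t):
    """Naive O(n*t) evaluation S_j = r(alpha^j) for j = 1..2t."""
    n = len(received_bits)           # n = 15 here
    synd = []
    for j in range(1, 2 * t + 1):
        s = 0
        alpha_j = EXP[j % 15]        # alpha^j
        power = 1                    # (alpha^j)^i, starting at i = 0
        for i in range(n):
            if received_bits[i]:
                s ^= power
            power = gf_mul(power, alpha_j)
        synd.append(s)
    return synd

# Example: all-zero codeword with two bit errors; nonzero syndromes flag them.
r = [0] * 15
r[3] ^= 1
r[10] ^= 1
print(syndromes(r, t=2))
```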

    Irreducible compositions of degree two polynomials over finite fields have regular structure

    Let $q$ be an odd prime power and $D$ be the set of monic irreducible polynomials in $\mathbb{F}_q[x]$ which can be written as a composition of monic degree two polynomials. In this paper we prove that $D$ has a natural regular structure by showing that there exists a finite automaton having $D$ as its accepted language. Our method is constructive. Comment: To appear in The Quarterly Journal of Mathematics.
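
    The brute-force sketch below only illustrates the objects involved, not the paper's automaton construction: it enumerates compositions $f(g(x))$ of monic quadratics over $\mathbb{F}_3$ and tests which are irreducible. The use of sympy's Poly.compose and is_irreducible, and the choice $q = 3$, are assumptions made for illustration.

```python
# Enumerate compositions f(g(x)) of monic degree-two polynomials over F_3
# and count the irreducible ones (the degree-4 members of the set D).
from itertools import product
from sympy import Poly, symbols

x = symbols('x')
q = 3  # an odd prime, so plain modulus arithmetic suffices

# All monic degree-two polynomials x^2 + b*x + c over F_q.
quadratics = [Poly(x**2 + b*x + c, x, modulus=q)
              for b, c in product(range(q), repeat=2)]

irreducible_compositions = []
for f, g in product(quadratics, repeat=2):
    h = f.compose(g)                 # the degree-four composition f(g(x))
    if h.is_irreducible:
        irreducible_compositions.append(h)

print(len(quadratics), "monic quadratics over F_%d" % q)
print(len(irreducible_compositions), "irreducible degree-4 compositions f(g(x))")
```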

    On Taking Square Roots without Quadratic Nonresidues over Finite Fields

    We present a novel idea to compute square roots over finite fields without being given any quadratic nonresidue and without assuming any unproven hypothesis. The algorithm is deterministic and the proof is elementary. In some cases, the square root algorithm runs in $\tilde{O}(\log^2 q)$ bit operations over finite fields with $q$ elements. As an application, we construct a deterministic primality proving algorithm which runs in $\tilde{O}(\log^3 N)$ for some integers $N$. Comment: 14 pages.
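
    As a point of reference (not the paper's general algorithm), the sketch below shows the classical case where a deterministic square root needs no quadratic nonresidue: a prime field with $p \equiv 3 \pmod 4$, where $a^{(p+1)/4} \bmod p$ is a square root of any residue $a$. The modulus and test values are illustrative choices.

```python
# Deterministic square root modulo a prime p with p % 4 == 3:
# no quadratic nonresidue is needed in this special case.
def sqrt_mod_p_3mod4(a, p):
    assert p % 4 == 3, "this shortcut requires p % 4 == 3"
    r = pow(a, (p + 1) // 4, p)
    if (r * r) % p != a % p:
        raise ValueError("a is not a quadratic residue modulo p")
    return r

p = 10_007                      # a prime with p % 4 == 3
a = (1234 * 1234) % p           # a known quadratic residue
r = sqrt_mod_p_3mod4(a, p)
assert r in (1234, p - 1234)    # the two square roots of a are +/- 1234
```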

    Almost-Uniform Sampling of Points on High-Dimensional Algebraic Varieties

    We consider the problem of uniform sampling of points on an algebraic variety. Specifically, we develop a randomized algorithm that, given a small set of multivariate polynomials over a sufficiently large finite field, produces a common zero of the polynomials almost uniformly at random. The statistical distance between the output distribution of the algorithm and the uniform distribution on the set of common zeros is polynomially small in the field size, and the running time of the algorithm is polynomial in the description of the polynomials and their degrees, provided that the number of polynomials is constant.
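
    For contrast with the paper's almost-uniform polynomial-time algorithm, the sketch below is the naive rejection-sampling baseline: it is exactly uniform but can be exponentially slow when the variety is small relative to the ambient space. The field size and the two example polynomials are arbitrary illustrative choices.

```python
# Rejection sampling of a common zero of a few polynomials over a small
# prime field F_p: draw random points until every polynomial vanishes.
import random

p = 101
polys = [
    lambda x, y, z: (x*x + y*y - z*z) % p,   # a cone-type relation
    lambda x, y, z: (x + y + z - 1) % p,     # a hyperplane
]

def sample_common_zero(polys, p, max_tries=1_000_000):
    for _ in range(max_tries):
        pt = tuple(random.randrange(p) for _ in range(3))
        if all(f(*pt) == 0 for f in polys):
            return pt                        # uniform over the set of common zeros
    raise RuntimeError("no common zero found within the trial budget")

print(sample_common_zero(polys, p))
```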

    On non-abelian homomorphic public-key cryptosystems

    An important problem of modern cryptography concerns secret public-key computations in algebraic structures. We construct homomorphic cryptosystems, which are (secret) epimorphisms $f: G \to H$, where $G$, $H$ are (publicly known) groups and $H$ is finite. A letter of a message to be encrypted is an element $h \in H$, while its encryption is an element $g \in G$ such that $f(g) = h$. A homomorphic cryptosystem allows one to perform computations (operating in the group $G$) on encrypted information without knowing the original message over $H$. In this paper, homomorphic cryptosystems are constructed for the first time for non-abelian groups $H$ (earlier, homomorphic cryptosystems were known only in the abelian case). In fact, we present such a system for any fixed solvable group $H$. Comment: 15 pages, LaTeX.
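
    The toy sketch below only illustrates the definition of a homomorphic cryptosystem as an epimorphism $f: G \to H$; it is abelian, insecure (here $f$ is public and easy to compute), and is not one of the paper's non-abelian constructions. The groups $G = (\mathbb{Z}, +)$ and $H = \mathbb{Z}_{97}$ are illustrative assumptions.

```python
# Toy illustration of the epimorphism view of homomorphic encryption:
# G = (Z, +), H = (Z_n, +), f = reduction mod n. Encrypting h means picking
# a preimage g with f(g) = h; adding ciphertexts in G adds plaintexts in H.
import random

n = 97                                   # |H|; an illustrative choice

def f(g):                                # the epimorphism G -> H (public here)
    return g % n

def encrypt(h):                          # choose a random preimage of h under f
    return h + n * random.randrange(1, 2**32)

h1, h2 = 40, 75
c1, c2 = encrypt(h1), encrypt(h2)
assert f(c1) == h1 and f(c2) == h2
assert f(c1 + c2) == (h1 + h2) % n       # the group operation carries over
```

    In a real homomorphic cryptosystem the epimorphism $f$ must be hard to evaluate without the secret key; the point here is only the algebraic relation between operations in $G$ and $H$.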

    Probabilistic Construction of Normal Basis

    Let GF(q) be the finite field with q elements. A normal basis polynomial f in GF(q)[x] of degree n is an irreducible polynomial whose roots form a (normal) basis for the field extension GF(q^n) : GF(q). We show that a normal basis polynomial of degree n can be found in expected time $O(n^{2+\varepsilon}\log q + n^{3+\varepsilon})$, when an arithmetic operation and the generation of a random constant in the field GF(q) cost unit time. Given some basis $B = \{\alpha_1, \alpha_2, \ldots, \alpha_n\}$ for the field extension GF(q^n) : GF(q), together with an algorithm for multiplying two elements in the B-representation in time $O(n^{\beta})$, we can find a normal basis for this extension and express it in terms of B in expected time $O(n^{1+\beta+\varepsilon}\log q + n^{3+\varepsilon})$.
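
    A sketch of the generic randomized approach, ignoring the paper's complexity optimizations: draw random elements of GF(q^n) and test whether the Frobenius orbit of a candidate is linearly independent over GF(q), i.e. whether the candidate is a normal element. The parameters q = 2, n = 4 and the modulus x^4 + x + 1 are illustrative assumptions.

```python
# Randomized search for a normal element of GF(2^4) = GF(2)[x]/(x^4 + x + 1):
# pick random a and test whether a, a^2, a^4, a^8 form a basis over GF(2).
import random

MOD = 0b10011      # x^4 + x + 1
N = 4

def gf_mul(a, b):
    """Carry-less multiplication modulo x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & 0b10000:
            a ^= MOD
    return r

def is_normal(a):
    """True iff the Frobenius orbit a, a^2, a^4, a^8 is linearly independent."""
    rows = []
    y = a
    for _ in range(N):
        rows.append(y)
        y = gf_mul(y, y)            # Frobenius map: squaring in characteristic 2
    # Gaussian elimination over GF(2) on the four 4-bit row vectors.
    rank = 0
    for bit in range(N - 1, -1, -1):
        pivot = next((i for i in range(rank, N) if (rows[i] >> bit) & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(N):
            if i != rank and (rows[i] >> bit) & 1:
                rows[i] ^= rows[rank]
        rank += 1
    return rank == N

while True:
    a = random.randrange(1, 16)
    if is_normal(a):
        print("normal element found:", bin(a))
        break
```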

    Brief Announcement: Probabilistic Indistinguishability and The Quality of Validity in Byzantine Agreement

    Lower bounds and impossibility results in distributed computing are both intellectually challenging and practically important. Hundreds, if not thousands, of proofs appear in the literature, but surprisingly, the vast majority of them apply to deterministic algorithms only. Probabilistic protocols have been around for at least four decades and are receiving a lot of attention with the emergence of blockchain systems. Nonetheless, we are aware of only a handful of randomized lower bounds. In this work we provide a formal framework for reasoning about randomized distributed algorithms. We generalize the notion of indistinguishability, the most useful tool in deterministic lower bounds, to apply to a probabilistic setting. We apply this framework to prove a result of independent interest. Namely, we completely characterize the quality of decisions that protocols for a randomized multi-valued Consensus problem can guarantee in an asynchronous environment with Byzantine faults. We use the new notion to prove a lower bound on the guaranteed probability that honest parties will not decide on a possibly bogus value proposed by a malicious party. Finally, we show that the bound is tight by providing a protocol that matches it. This brief announcement consists of an introduction to the full paper [Guy Goren et al., 2020] by the same title. The interested reader is advised to consult the full paper for a detailed exposition.