70 research outputs found
Improved asymptotic bounds for codes using distinguished divisors of global function fields
For a prime power $q$, let $\alpha_q$ be the standard function in the
asymptotic theory of codes, that is, $\alpha_q(\delta)$ is the largest
asymptotic information rate that can be achieved for a given asymptotic
relative minimum distance $\delta$ of $q$-ary codes. In recent years the
Tsfasman-Vlăduţ-Zink lower bound on $\alpha_q(\delta)$ was improved by
Elkies, Xing, and Niederreiter and Özbudak. In this paper we show further
improvements on these bounds by using distinguished divisors of global function
fields. We also show improved lower bounds on the corresponding function
$\alpha_q^{\mathrm{lin}}$ for linear codes.
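For reference, the Tsfasman-Vlăduţ-Zink lower bound discussed in this abstract is the statement that, for a square prime power $q$,

```latex
\alpha_q(\delta) \;\ge\; 1 - \delta - \frac{1}{\sqrt{q}-1},
\qquad 0 \le \delta \le 1 - \frac{1}{\sqrt{q}-1},
```

which exceeds the Gilbert-Varshamov bound $1 - H_q(\delta)$ on a subinterval of $\delta$ whenever $q \ge 49$. This is the standard form of the bound; the improvements by Elkies, Xing, and Niederreiter-Özbudak mentioned above sharpen the right-hand side.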
Transitive and self-dual codes attaining the Tsfasman-Vladut-Zink bound
A major problem in coding theory is the question of whether the class of cyclic codes is asymptotically good. In this correspondence--as a generalization of cyclic codes--the notion of transitive codes is introduced (see Definition 1.4 in Section I), and it is shown that the class of transitive codes is asymptotically good. Even more, transitive codes attain the Tsfasman-Vladut-Zink bound over $\mathbb{F}_q$, for all squares $q = \ell^2$. It is also shown that self-orthogonal and self-dual codes attain the Tsfasman-Vladut-Zink bound, thus improving previous results about self-dual codes attaining the Gilbert-Varshamov bound. The main tool is a new asymptotically optimal tower $E_0 \subseteq E_1 \subseteq E_2 \subseteq \cdots$ of function fields over $\mathbb{F}_q$ (with $q = \ell^2$), where all extensions $E_n/E_0$ are Galois.
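As a quick numerical illustration of why attaining the Tsfasman-Vladut-Zink bound matters, the sketch below (function names are my own, not from the paper) compares it with the asymptotic Gilbert-Varshamov bound for q = 64, a square prime power where TVZ is known to win on part of the range:

```python
import math

def entropy_q(q: int, x: float) -> float:
    """q-ary entropy function H_q(x), defined for 0 < x < 1 - 1/q."""
    return (x * math.log(q - 1, q)
            - x * math.log(x, q)
            - (1 - x) * math.log(1 - x, q))

def gv_rate(q: int, delta: float) -> float:
    """Asymptotic Gilbert-Varshamov lower bound: R >= 1 - H_q(delta)."""
    return 1 - entropy_q(q, delta)

def tvz_rate(q: int, delta: float) -> float:
    """Tsfasman-Vladut-Zink lower bound for square q:
    R >= 1 - delta - 1/(sqrt(q) - 1)."""
    return 1 - delta - 1 / (math.sqrt(q) - 1)

# For q = 64 = 8^2, TVZ exceeds GV on an interval of delta:
for delta in (0.3, 0.5, 0.7):
    print(delta, round(gv_rate(64, delta), 4), round(tvz_rate(64, delta), 4))
```

Running this shows tvz_rate above gv_rate at each sampled delta, which is exactly the phenomenon the codes in this abstract achieve constructively.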
Why Philosophers Should Care About Computational Complexity
One might think that, once we know something is computable, how efficiently
it can be computed is a practical question with little further philosophical
importance. In this essay, I offer a detailed case that one would be wrong. In
particular, I argue that computational complexity theory---the field that
studies the resources (such as time, space, and randomness) needed to solve
computational problems---leads to new perspectives on the nature of
mathematical knowledge, the strong AI debate, computationalism, the problem of
logical omniscience, Hume's problem of induction, Goodman's grue riddle, the
foundations of quantum mechanics, economic rationality, closed timelike curves,
and several other topics of philosophical interest. I end by discussing aspects
of complexity theory itself that could benefit from philosophical analysis.
Comment: 58 pages, to appear in "Computability: Gödel, Turing, Church, and
beyond," MIT Press, 2012. Some minor clarifications and corrections; new
references added.
Error-Correction Coding and Decoding: Bounds, Codes, Decoders, Analysis and Applications
Coding; Communications; Engineering; Networks; Information Theory; Algorithm
Hardness of SIS and LWE with Small Parameters
The Short Integer Solution (SIS) and Learning With Errors (LWE) problems are the foundations for countless applications in lattice-based cryptography, and are provably as hard as approximate lattice problems in the worst case. An important question from both a practical and theoretical perspective is how small their parameters can be made while preserving their hardness.
We prove two main results on SIS and LWE with small parameters. For SIS, we show that the problem retains its hardness for moduli $q \ge \beta \cdot n^{\delta}$ for any constant $\delta > 0$, where $\beta$ is the bound on the Euclidean norm of the solution. This improves upon prior results which required $q \ge \beta \cdot \sqrt{n \log n}$, and is essentially optimal since the problem is trivially easy for $q \le \beta$. For LWE, we show that it remains hard even when the errors are small (e.g., uniformly random from $\{0,1\}$), provided that the number of samples is small enough (e.g., linear in the dimension of the LWE secret). Prior results required the errors to have magnitude at least $\sqrt{n}$ and to come from a Gaussian-like distribution.
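To make the LWE setup concrete, here is a minimal toy sketch of an LWE instance in the small-error regime the abstract describes: binary errors and a number of samples linear in the secret's dimension. All names and the tiny parameter choices are illustrative only, not a secure instantiation:

```python
import random

def lwe_sample(n: int, m: int, q: int):
    """Generate a toy LWE instance (A, b = A*s + e mod q) with binary errors.

    Illustrates the 'small parameters' regime: the error vector e is
    uniform over {0, 1} and the sample count m is kept linear in n.
    """
    s = [random.randrange(q) for _ in range(n)]                    # secret
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.randrange(2) for _ in range(m)]                    # errors in {0, 1}
    b = [(sum(a * x for a, x in zip(row, s)) + ei) % q
         for row, ei in zip(A, e)]
    return A, b, s, e

# Toy parameters; real instantiations use much larger n and q.
A, b, s, e = lwe_sample(n=8, m=16, q=97)
# Each b_i differs from <a_i, s> mod q by at most 1.
```

The result in the abstract says that even with such {0,1} errors the problem stays hard, provided the adversary sees only linearly many pairs (a_i, b_i).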
- …