
    New Constant-Weight Codes from Propagation Rules

    This paper proposes some simple propagation rules which give rise to new binary constant-weight codes. Comment: 4 pages
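
    The rules themselves are not spelled out in this abstract. As a hedged illustration of what a propagation rule for constant-weight codes looks like, the sketch below applies the classical shortening rule (a textbook rule, not necessarily one of the new rules proposed in the paper): fixing a coordinate of an (n, d, w) binary constant-weight code to 1 and deleting that coordinate yields an (n-1, d, w-1) constant-weight code.

```python
# Hedged sketch: the classical "shortening" propagation rule for binary
# constant-weight codes. A textbook rule, not necessarily one of the new
# rules proposed in the paper above.

def shorten_constant_weight(code, pos):
    """Keep the words of an (n, d, w) constant-weight code with a 1 in
    position `pos` and delete that coordinate, giving an (n-1, d, w-1) code."""
    return [w[:pos] + w[pos + 1:] for w in code if w[pos] == 1]

# Example: a tiny (4, 2, 2) constant-weight code.
code = [(1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 0, 1), (0, 0, 1, 1)]
print(shorten_constant_weight(code, 0))  # [(1, 0, 0), (0, 1, 0)] -- a (3, 2, 1) code
```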

    Transitive and self-dual codes attaining the Tsfasman-Vladut-Zink bound

    A major problem in coding theory is the question of whether the class of cyclic codes is asymptotically good. In this correspondence, as a generalization of cyclic codes, the notion of transitive codes is introduced (see Definition 1.4 in Section I), and it is shown that the class of transitive codes is asymptotically good. Even more, transitive codes attain the Tsfasman-Vladut-Zink bound over $\mathbb{F}_q$ for all squares $q = \ell^2$. It is also shown that self-orthogonal and self-dual codes attain the Tsfasman-Vladut-Zink bound, thus improving previous results about self-dual codes attaining the Gilbert-Varshamov bound. The main tool is a new asymptotically optimal tower $E_0 \subseteq E_1 \subseteq E_2 \subseteq \cdots$ of function fields over $\mathbb{F}_q$ (with $q = \ell^2$), where all extensions $E_n/E_0$ are Galois.
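
    For context, attaining the Tsfasman-Vladut-Zink bound means reaching rate/relative-distance pairs satisfying the following standard inequality (stated here for orientation, not quoted from the paper):

```latex
% Tsfasman-Vladut-Zink bound (standard form): for a square prime power
% q = \ell^2 there exist arbitrarily long codes over \mathbb{F}_q whose
% rate R and relative minimum distance \delta satisfy
\[
  R \;\ge\; 1 - \delta - \frac{1}{\sqrt{q} - 1},
\]
% which exceeds the Gilbert-Varshamov bound for all square q \ge 49.
```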

    Codes for Key Generation in Quantum Cryptography

    As an alternative to the usual key generation by two-way communication in schemes for quantum cryptography, we consider codes for key generation by one-way communication. We study codes that could be applied to the raw key sequences that are ideally obtained in recently proposed scenarios for quantum key distribution, which can be regarded as communication through symmetric four-letter channels. Comment: IJQI format, 13 pages, 1 table
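
    As a generic illustration only (not the construction studied in the paper), one-way key generation with a linear code is usually done by syndrome reconciliation: Alice announces the syndrome of her raw key block, and Bob moves his block to the nearest word with that syndrome. A minimal sketch with a toy length-3 repetition code:

```python
import numpy as np

# Generic one-way (syndrome) reconciliation sketch; not the specific codes
# or channels analyzed in the paper above.

# Parity-check matrix of the binary length-3 repetition code {000, 111}.
H = np.array([[1, 1, 0],
              [0, 1, 1]])

def syndrome(word):
    return H @ word % 2

def reconcile(bob_key, alice_syndrome):
    """Move Bob's raw key block to the nearest length-3 word whose syndrome
    matches the one Alice announced (brute force is fine at this size)."""
    best = None
    for bits in range(2 ** 3):
        cand = np.array([(bits >> i) & 1 for i in range(3)])
        if np.array_equal(syndrome(cand), alice_syndrome):
            if best is None or np.sum(cand ^ bob_key) < np.sum(best ^ bob_key):
                best = cand
    return best

alice = np.array([1, 1, 1])             # Alice's raw key block
bob = np.array([1, 0, 1])               # Bob's noisy copy (one flipped bit)
print(reconcile(bob, syndrome(alice)))  # [1 1 1], matching Alice's block
```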

    Improved asymptotic bounds for codes using distinguished divisors of global function fields

    For a prime power $q$, let $\alpha_q$ be the standard function in the asymptotic theory of codes, that is, $\alpha_q(\delta)$ is the largest asymptotic information rate that can be achieved for a given asymptotic relative minimum distance $\delta$ of $q$-ary codes. In recent years the Tsfasman-Vlăduţ-Zink lower bound on $\alpha_q(\delta)$ was improved by Elkies, Xing, and Niederreiter and Özbudak. In this paper we show further improvements on these bounds by using distinguished divisors of global function fields. We also show improved lower bounds on the corresponding function $\alpha_q^{\rm lin}$ for linear codes.
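
    To fix notation (a standard definition consistent with the abstract, not text taken from the paper), with $A_q(n, d)$ the maximal size of a $q$-ary code of length $n$ and minimum distance $d$:

```latex
% Standard definition of the asymptotic rate function used above:
\[
  \alpha_q(\delta) \;=\; \limsup_{n \to \infty}
    \frac{1}{n} \log_q A_q\bigl(n, \lceil \delta n \rceil\bigr),
  \qquad 0 \le \delta \le 1,
\]
% and \alpha_q^{\mathrm{lin}}(\delta) is defined analogously with the
% maximum restricted to linear codes.
```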

    A Survey on Metric Learning for Feature Vectors and Structured Data

    The need for appropriate ways to measure the distance or similarity between data is ubiquitous in machine learning, pattern recognition and data mining, but handcrafting such good metrics for specific problems is generally difficult. This has led to the emergence of metric learning, which aims at automatically learning a metric from data and has attracted a lot of interest in machine learning and related fields over the past ten years. This survey paper proposes a systematic review of the metric learning literature, highlighting the pros and cons of each approach. We pay particular attention to Mahalanobis distance metric learning, a well-studied and successful framework, but additionally present a wide range of methods that have recently emerged as powerful alternatives, including nonlinear metric learning, similarity learning and local metric learning. Recent trends and extensions, such as semi-supervised metric learning, metric learning for histogram data and the derivation of generalization guarantees, are also covered. Finally, this survey addresses metric learning for structured data, in particular edit distance learning, and attempts to give an overview of the remaining challenges in metric learning for the years to come. Comment: Technical report, 59 pages. Changes in v2: fixed typos and improved presentation. Changes in v3: fixed typos. Changes in v4: fixed typos and new methods
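
    As a minimal sketch of the Mahalanobis framework highlighted in the survey (the parameterization M = L^T L is the standard way to keep M positive semidefinite; the random L below is only a stand-in for a learned transform, not any particular algorithm from the survey):

```python
import numpy as np

# Squared Mahalanobis distance d_M(x, y)^2 = (x - y)^T M (x - y) with
# M = L^T L, which is positive semidefinite by construction. A learned L
# would come from a metric learning algorithm; a random L stands in here.

def mahalanobis_sq(x, y, L):
    diff = L @ (x - y)
    return float(diff @ diff)

rng = np.random.default_rng(0)
L = rng.standard_normal((2, 3))          # stand-in for a learned linear map
x, y = rng.standard_normal(3), rng.standard_normal(3)

print(mahalanobis_sq(x, y, L))           # distance under the (stand-in) metric
print(mahalanobis_sq(x, y, np.eye(3)))   # reduces to squared Euclidean distance
```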