Resolving sets for Johnson and Kneser graphs
A set of vertices $S$ in a graph $G$ is a {\em resolving set} for $G$ if, for
any two vertices $u,v$, there exists $x \in S$ such that the distances
$d(u,x) \neq d(v,x)$. In this paper, we consider the Johnson graphs $J(n,k)$
and Kneser graphs $K(n,k)$, and obtain various constructions of resolving sets
for these graphs. As well as general constructions, we show that various
interesting combinatorial objects can be used to obtain resolving sets in these
graphs, including (for Johnson graphs) projective planes and symmetric designs,
as well as (for Kneser graphs) partial geometries, Hadamard matrices, Steiner
systems and toroidal grids. Comment: 23 pages, 2 figures, 1 table
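The definition above is easy to check computationally. A minimal sketch (not from the paper) that builds the Kneser graph K(5,2) — which is the Petersen graph — and tests whether a set of landmark vertices is resolving:

```python
from itertools import combinations
from collections import deque

# Kneser graph K(5,2): vertices are the 2-subsets of {0,...,4};
# two vertices are adjacent when the subsets are disjoint.
vertices = [frozenset(c) for c in combinations(range(5), 2)]
adj = {v: [u for u in vertices if not (u & v)] for v in vertices}

def distances(source):
    """BFS distances from one vertex to every vertex of the graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                queue.append(u)
    return dist

def is_resolving(landmarks):
    """True if every vertex has a distinct vector of distances to the landmarks."""
    tables = [distances(l) for l in landmarks]
    signatures = {tuple(t[v] for t in tables) for v in vertices}
    return len(signatures) == len(vertices)
```

Searching over all 3-subsets with `is_resolving` confirms the well-known fact that the Petersen graph has metric dimension 3: some triple of vertices resolves it, but no pair does.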
Linear spaces with many small lines
In this paper some of the work on linear spaces in which most of the lines have few points is surveyed. This includes existence results, blocking sets and embeddings. Also, it is shown that any linear space of order v can be embedded in a linear space of order about 13v in which there are no lines of size 2.
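As background, the defining axiom of a linear space — every pair of distinct points lies on exactly one line — can be verified mechanically. A small sketch (not from the survey) on a 5-point example in which most lines have only two points:

```python
from itertools import combinations

def is_linear_space(points, lines):
    """Check the linear-space axioms: every line has at least two points,
    and every pair of distinct points lies on exactly one line."""
    if any(len(line) < 2 for line in lines):
        return False
    return all(
        sum(1 for line in lines if p in line and q in line) == 1
        for p, q in combinations(points, 2)
    )

# A linear space on 5 points: one 3-point line and seven lines of size 2.
points = [1, 2, 3, 4, 5]
lines = [{1, 2, 3}, {1, 4}, {1, 5}, {2, 4}, {2, 5}, {3, 4}, {3, 5}, {4, 5}]
```

Dropping the line {4, 5} leaves the pair (4, 5) on no line, so the axiom fails — the checker catches exactly this.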
CROSSTALK-RESILIENT CODING FOR HIGH-DENSITY DIGITAL RECORDING
Increasing the track density in magnetic recording systems is very difficult due to
inter-track interference (ITI) caused by the magnetic fields of adjacent tracks. This
work presents a two-track partial-response class-4 magnetic channel with linear and
symmetrical ITI, and explores modulation codes, signal processing methods and error
correction codes in order to mitigate the effects of ITI.
Recording codes were investigated, and a new class of two-dimensional run-length-limited
recording codes is described. The new class of codes controls the type of ITI
and has been found to be about 10% more resilient to ITI than conventional
run-length-limited codes. A new adaptive trellis is also described that adaptively
compensates for the effect of ITI; this has been found to give gains of up to 5 dB
in signal-to-noise ratio (SNR) at 40% ITI. It was also found that the new class of
codes was about 10% more resilient to ITI than conventional recording codes when
decoded with the new trellis.
Error correction coding methods were applied, and the use of low-density parity-check
(LDPC) codes was investigated. It was found that at high SNR, conventional
codes could perform as well as the new modulation codes in a combined modulation and
error correction coding scheme. Results suggest that high-rate LDPC codes can mitigate
the effect of ITI; however, the decoders have convergence problems beyond 30% ITI.
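The thesis's two-dimensional code class is not specified in this abstract; as background, the standard one-dimensional (d, k) run-length-limited constraint that such recording codes build on can be checked as follows (a sketch, not the thesis's construction):

```python
def satisfies_rll(bits, d, k):
    """Check the (d, k) run-length-limited constraint (NRZI convention):
    every run of 0s between consecutive 1s has length at least d,
    and no run of 0s anywhere exceeds k."""
    run = 0          # length of the current run of 0s
    seen_one = False
    for b in bits:
        if b == 0:
            run += 1
            if run > k:          # maximum run-length violated
                return False
        else:
            if seen_one and run < d:   # minimum spacing between 1s violated
                return False
            seen_one = True
            run = 0
    return True
```

For example, a (2, 7) sequence must separate consecutive 1s by at least two 0s (limiting intersymbol interference) while never leaving more than seven 0s between transitions (preserving timing recovery).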
On a problem of Erdős on integers, none of which divides the product of k others
Erdős estimated the maximal number of integers that can be selected from {1,2,…,N} so that none of them divides the product of two others. In this paper, Erdős's problem is extended to sets of integers such that none of them divides the product of k others. The proofs use combinatorial results.
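For small N the extremal quantity in Erdős's problem can be computed by brute force. A sketch (illustrative only; exponential in N). Note that any set of distinct primes is valid for k = 2, since a prime can divide a product of two other set elements only by equalling one of them:

```python
from itertools import combinations
from math import prod

def is_valid(items, k=2):
    """True if no element divides the product of any k of the other elements."""
    items = list(items)
    for i, a in enumerate(items):
        others = items[:i] + items[i + 1:]
        if any(prod(c) % a == 0 for c in combinations(others, k)):
            return False
    return True

def largest_valid_subset(n, k=2):
    """Largest subset of {2,...,n} with the property, by exhaustive search."""
    pool = list(range(2, n + 1))
    for r in range(len(pool), 0, -1):
        for c in combinations(pool, r):
            if is_valid(c, k):
                return list(c)
    return []
```

The element 1 is excluded from the pool since it divides every product. For n = 12 the five primes {2, 3, 5, 7, 11} already form a valid set, so the maximum is at least 5.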
New Combinatorial Construction Techniques for Low-Density Parity-Check Codes and Systematic Repeat-Accumulate Codes
This paper presents several new construction techniques for low-density
parity-check (LDPC) and systematic repeat-accumulate (RA) codes. Based on
specific classes of combinatorial designs, the improved code design focuses on
high-rate structured codes with constant column weights 3 and higher. The
proposed codes are efficiently encodable and exhibit good structural
properties. Experimental results on decoding performance with the sum-product
algorithm show that the novel codes offer substantial practical application
potential, for instance, in high-speed applications in magnetic recording and
optical communications channels. Comment: 10 pages; to appear in "IEEE Transactions on Communications".
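The paper's specific design classes are not reproduced in the abstract; as an illustration of the design-based approach it describes, the point-line incidence matrix of the Fano plane (the projective plane of order 2) yields a small parity-check matrix with constant column weight 3 and no 4-cycles in its Tanner graph:

```python
import numpy as np

# Lines of the Fano plane: 7 points, 7 lines of 3 points each;
# any two distinct lines meet in exactly one point.
lines = [(0, 1, 2), (0, 3, 4), (0, 5, 6),
         (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]

# Use the point-line incidence matrix directly as an LDPC
# parity-check matrix: constant row and column weight 3.
H = np.zeros((7, 7), dtype=int)
for i, line in enumerate(lines):
    H[i, list(line)] = 1

# Any two columns share exactly one row (two points lie on one common
# line), so the Tanner graph of H has no 4-cycles, i.e. girth >= 6 --
# a standard design goal for LDPC codes.
```

Larger designs of the same flavour (Steiner systems, resolvable designs) give longer structured codes with the same regularity and girth properties, which is the general mechanism the paper exploits.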
Entanglement-assisted quantum low-density parity-check codes
This paper develops a general method for constructing entanglement-assisted
quantum low-density parity-check (LDPC) codes, which is based on combinatorial
design theory. Explicit constructions are given for entanglement-assisted
quantum error-correcting codes (EAQECCs) with many desirable properties. These
properties include the requirement of only one initial entanglement bit, high
error correction performance, high rates, and low decoding complexity. The
proposed method produces infinitely many new codes with a wide variety of
parameters and entanglement requirements. Our framework encompasses various
codes including the previously known entanglement-assisted quantum LDPC codes
having the best error correction performance and many new codes with better
block error rates in simulations over the depolarizing channel. We also
determine important parameters of several well-known classes of quantum and
classical LDPC codes for previously unsettled cases. Comment: 20 pages, 5 figures. Final version appearing in Physical Review
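As an illustration of how design theory enters (under the standard entanglement-assisted formalism of Hsieh, Devetak and Brun, where a classical parity-check matrix H consumes c = rank over GF(2) of H·Hᵀ ebits): if the blocks of a design have odd size and pairwise odd intersections, H·Hᵀ mod 2 is the all-ones matrix, which has rank 1, so only one ebit is needed. A sketch with the Fano plane, which is an assumption for illustration and not necessarily one of the paper's constructions:

```python
import numpy as np

def gf2_rank(m):
    """Rank of a binary matrix over GF(2) by Gaussian elimination."""
    m = np.array(m, dtype=int) % 2
    rank = 0
    rows, cols = m.shape
    for col in range(cols):
        pivot = next((r for r in range(rank, rows) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]   # move pivot row into place
        for r in range(rows):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]               # eliminate the column elsewhere
        rank += 1
    return rank

# Fano plane incidence matrix: lines of size 3 (odd) with pairwise
# intersections of size 1 (odd), so H @ H.T mod 2 is the all-ones matrix.
lines = [(0, 1, 2), (0, 3, 4), (0, 5, 6),
         (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]
H = np.zeros((7, 7), dtype=int)
for i, line in enumerate(lines):
    H[i, list(line)] = 1

ebits = gf2_rank(H @ H.T)   # rank 1: a single ebit suffices
```

This is the mechanism behind the "only one initial entanglement bit" property: the design is chosen so that the GF(2) rank of H·Hᵀ collapses to 1.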
High-Rate Quantum Low-Density Parity-Check Codes Assisted by Reliable Qubits
Quantum error correction is an important building block for reliable quantum information processing. A challenging hurdle in the theory of quantum error correction is that it is significantly more difficult to design error-correcting codes with desirable properties for quantum information processing than for traditional digital communications and computation. A typical obstacle to constructing a variety of strong quantum error-correcting codes is the complicated restrictions imposed on the structure of a code. Recently, promising solutions to this problem have been proposed in quantum information science, where in principle any binary linear code can be turned into a quantum error-correcting code by assuming a small number of reliable quantum bits. This paper studies how best to take advantage of these latest ideas to construct desirable quantum error-correcting codes of very high information rate. Our methods exploit structured high-rate low-density parity-check codes available in the classical domain and provide quantum analogues that inherit their characteristic low decoding complexity and high error correction performance even at moderate code lengths. Our approach to designing high-rate quantum error-correcting codes also allows for making direct use of other major syndrome decoding methods for linear codes, making it possible to deal with a situation where promising quantum analogues of low-density parity-check codes are difficult to find.
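As a small illustration of the classical syndrome decoding mentioned above (using the [7,4] Hamming code as a textbook stand-in, not one of the high-rate LDPC codes the paper constructs):

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code: column j (1-based)
# is the binary representation of j, most significant bit in row 0.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def decode(received):
    """Single-error syndrome decoding: the syndrome, read as a binary
    number, is the 1-based position of the flipped bit (0 = no error)."""
    y = np.array(received) % 2
    syndrome = H @ y % 2
    pos = int(syndrome[0]) * 4 + int(syndrome[1]) * 2 + int(syndrome[2])
    if pos:
        y[pos - 1] ^= 1   # correct the erroneous bit
    return y
```

The decoder computes s = H·y mod 2 and maps the syndrome back to an error location; in the quantum setting it is this classical syndrome-processing step that the paper's framework lets one reuse directly.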