Using graphs for the analysis and construction of permutation distance-preserving mappings
Abstract: A new way of looking at permutation distance-preserving mappings (DPMs) is presented by making use of a graph representation. The properties necessary to make such a graph distance-preserving are also investigated. Further, this new knowledge is used to analyze previous constructions, as well as to construct a new general mapping algorithm for a previous multilevel construction.
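As a rough illustration of the distance-preserving property itself, the sketch below brute-force checks that a toy mapping from binary 3-tuples to permutations of length 4 never decreases Hamming distance. The mapping (bit-controlled swaps of adjacent symbols of the identity permutation) is a hypothetical example for illustration, not one of the constructions analyzed in the paper.

```python
from itertools import product

def hamming(a, b):
    """Number of positions in which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def toy_map(bits):
    """Hypothetical toy mapping: swap adjacent symbols of the identity
    permutation wherever the corresponding input bit is 1."""
    perm = list(range(len(bits) + 1))
    for i, b in enumerate(bits):
        if b:
            perm[i], perm[i + 1] = perm[i + 1], perm[i]
    return tuple(perm)

def is_distance_preserving(mapping, domain):
    """True if the Hamming distance between images is never smaller than
    the Hamming distance between the corresponding inputs."""
    return all(hamming(mapping(x), mapping(y)) >= hamming(x, y)
               for x in domain for y in domain)

binary_triples = list(product([0, 1], repeat=3))
print(is_distance_preserving(toy_map, binary_triples))  # True for this toy map
```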
Analysis of permutation distance-preserving mappings using graphs
Abstract: A new way of analyzing permutation distance-preserving mappings is presented by making use of a graph representation. The properties necessary to make such graphs distance-preserving, and how this relates to the total sum of distances that exists for such mappings, are investigated. This new knowledge is used to analyze previous constructions, as well as to show the existence or non-existence of simple algorithms for mappings attaining the upper bound on the sum of distances. Finally, two applications for such graphs are considered.
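The "total sum of distances" referred to above can be made concrete with the same hypothetical toy mapping as in the previous sketch (repeated here so this snippet runs on its own): summing Hamming distances over all unordered pairs of images gives the quantity whose upper bound the paper studies, and for a distance-preserving mapping it is never smaller than the corresponding sum over the binary inputs.

```python
from itertools import combinations, product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def sum_of_distances(words):
    """Total Hamming distance over all unordered pairs of words."""
    return sum(hamming(a, b) for a, b in combinations(words, 2))

def toy_map(bits):
    # Same hypothetical bit-controlled transposition mapping as above.
    perm = list(range(len(bits) + 1))
    for i, b in enumerate(bits):
        if b:
            perm[i], perm[i + 1] = perm[i + 1], perm[i]
    return tuple(perm)

inputs = list(product([0, 1], repeat=3))
images = [toy_map(x) for x in inputs]
# For a distance-preserving mapping, the second sum is never smaller.
print(sum_of_distances(inputs), sum_of_distances(images))
```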
New DC-free multilevel line codes with spectral nulls at rational submultiples of the symbol frequency
Abstract: A new technique for designing multilevel line codes is presented. Distance-preserving mappings from binary sequences to permutation sequences are used, where the permutation sequences also have spectral-null properties. The resulting line codes use the structure of convolutional codes and thus have certain advantages over other published line codes.
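For a codeword that is repeated periodically, a spectral null at the rational submultiple k/N of the symbol frequency amounts to DFT bin k of the length-N codeword being zero (bin 0 corresponds to the DC-free property). The sketch below is a minimal check of that condition on a hypothetical bipolar word; it is not the mapping or line-code construction from the paper.

```python
import numpy as np

def has_spectral_null(codeword, k, tol=1e-9):
    """True if the periodically repeated codeword has a spectral null at
    frequency k/N of the symbol frequency, i.e. DFT bin k is (near) zero."""
    spectrum = np.fft.fft(np.asarray(codeword, dtype=float))
    return abs(spectrum[k]) < tol

# Hypothetical balanced bipolar word of length 4.
word = [+1, +1, -1, -1]
print(has_spectral_null(word, 0))  # True: DC-free (null at f = 0)
print(has_spectral_null(word, 1))  # False: no null at f = 1/4
```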
New distance concept and graph theory approach for certain coding techniques design and analysis
Abstract: A new graph distance concept introduced for certain coding techniques aids in their design and analysis, as in the case of distance-preserving mappings and spectral-shaping codes. A graph-theoretic construction mapping binary sequences to permutation sequences, inspired by the k-cube graph, attains the upper bound on the sum of distances for certain lengths of the permutation sequence. The newly introduced distance concept in the k-cube graph also enables, for the first time, a clearer understanding and analysis of distance-reducing mappings. Finally, combining the distance and index-permutation graph concepts helps uncover and verify certain properties of spectral null codes that were previously difficult to analyze.
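The k-cube graph mentioned above has a simple description: its vertices are the binary k-tuples and two vertices are adjacent exactly when they differ in one position, so shortest-path distance in the graph coincides with Hamming distance. The generic sketch below (not the paper's construction) builds the 3-cube and verifies this.

```python
from itertools import product
from collections import deque

def k_cube(k):
    """Adjacency lists of the k-cube graph: vertices are binary k-tuples,
    edges join vertices differing in exactly one coordinate."""
    vertices = list(product([0, 1], repeat=k))
    return {v: [w for w in vertices
                if sum(a != b for a, b in zip(v, w)) == 1]
            for v in vertices}

def graph_distance(adj, src, dst):
    """Shortest-path distance via breadth-first search."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        v = queue.popleft()
        if v == dst:
            return dist[v]
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return None

adj = k_cube(3)
assert all(graph_distance(adj, u, v) == sum(a != b for a, b in zip(u, v))
           for u in adj for v in adj)
print("graph distance on the 3-cube equals Hamming distance")
```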
Constructions of Rank Modulation Codes
Rank modulation is a way of encoding information to correct errors in flash
memory devices as well as impulse noise in transmission lines. Modeling rank
modulation involves construction of packings of the space of permutations
equipped with the Kendall tau distance.
We present several general constructions of codes in permutations that cover
a broad range of code parameters. In particular, we show a number of ways in
which conventional error-correcting codes can be modified to correct errors in
the Kendall space. Codes that we construct afford simple encoding and decoding
algorithms of essentially the same complexity as required to correct errors in
the Hamming metric. For instance, from binary BCH codes we obtain codes
correcting t Kendall errors in n memory cells that support the order of
n!/(log2 n!)^t messages, for any constant t = 1, 2, ... We also construct
families of codes that correct a number of errors that grows with n at
varying rates, from Θ(n) to Θ(n^2). One of our constructions
gives rise to a family of rank modulation codes for which the trade-off between
the number of messages and the number of correctable Kendall errors approaches
the optimal scaling rate. Finally, we list a number of possibilities for
constructing codes of finite length, and give examples of rank modulation codes
with specific parameters. Comment: Submitted to IEEE Transactions on Information Theory.
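The Kendall tau distance used to model rank modulation errors counts the pairs of elements whose relative order differs between two permutations, which equals the minimum number of adjacent transpositions needed to turn one permutation into the other. Below is a minimal reference implementation of the metric (not code from the paper).

```python
from itertools import combinations

def kendall_tau_distance(p, q):
    """Number of element pairs ranked in opposite order by the two
    permutations; equals the minimum number of adjacent transpositions
    needed to transform one permutation into the other."""
    pos_p = {v: i for i, v in enumerate(p)}
    pos_q = {v: i for i, v in enumerate(q)}
    return sum((pos_p[a] - pos_p[b]) * (pos_q[a] - pos_q[b]) < 0
               for a, b in combinations(p, 2))

print(kendall_tau_distance([0, 1, 2, 3], [0, 2, 1, 3]))  # 1: one adjacent swap
print(kendall_tau_distance([0, 1, 2, 3], [3, 2, 1, 0]))  # 6: full reversal
```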
Multilevel Artificial Neural Network Training for Spatially Correlated Learning
Multigrid modeling algorithms are a technique used to accelerate relaxation
models running on a hierarchy of similar graphlike structures. We introduce and
demonstrate a new method for training neural networks which uses multilevel
methods. Using an objective function derived from a graph-distance metric, we
perform orthogonally-constrained optimization to find optimal prolongation and
restriction maps between graphs. We compare and contrast several methods for
performing this numerical optimization, and additionally present some new
theoretical results on upper bounds of this type of objective function. Once
calculated, these optimal maps between graphs form the core of Multiscale
Artificial Neural Network (MsANN) training, a new procedure we present which
simultaneously trains a hierarchy of neural network models of varying spatial
resolution. Parameter information is passed between members of this hierarchy
according to standard coarsening and refinement schedules from the multiscale
modelling literature. In our machine learning experiments, these models are
able to learn faster than default training, achieving a comparable level of
error in an order of magnitude fewer training examples. Comment: Manuscript (24 pages) and Supplementary Material (4 pages). Updated January 2019 to reflect new formulation of MsANN structure and new training procedure.
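As a rough sketch of the coarse/fine parameter transfer pattern the abstract describes, the snippet below restricts the weights of a single fully connected layer to a smaller model and prolongs them back, using a fixed, hand-built prolongation map with orthonormal columns. All sizes and maps here are hypothetical placeholders; the MsANN procedure described in the paper instead optimizes these maps under an orthogonality constraint using a graph-distance objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer widths for one fine-level and one coarse-level model.
n_fine, n_coarse = 8, 4

# Fixed prolongation map P with orthonormal columns (simple 2-to-1 grouping);
# its transpose serves as the restriction map R. The paper optimizes such maps
# rather than fixing them by hand.
P = np.zeros((n_fine, n_coarse))
for j in range(n_coarse):
    P[2 * j:2 * j + 2, j] = 1.0 / np.sqrt(2.0)
R = P.T

# Weight matrix of a single fully connected fine-level layer.
W_fine = rng.standard_normal((n_fine, n_fine))

# Restriction: project fine-level parameters down to the coarse model,
# then prolongation: lift the coarse parameters back to the fine model.
W_coarse = R @ W_fine @ P
W_lifted = P @ W_coarse @ R

print(W_coarse.shape, W_lifted.shape)  # (4, 4) (8, 8)
```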