    Parameters of Combinatorial Neural Codes

    Motivated by recent developments in the mathematical theory of neural codes, we study the structure of error-correcting codes for the binary asymmetric channel. These are also known as combinatorial neural codes and can be seen as the discrete version of neural receptive field codes. We introduce two notions of discrepancy between binary vectors, which are not metric functions in general but nonetheless capture the mathematics of the binary asymmetric channel. In turn, these lead to two new fundamental parameters of combinatorial neural codes, both of which measure the probability that the maximum-likelihood decoder fails. We then derive various bounds for the cardinality and weight distribution of a combinatorial neural code in terms of these new parameters, giving examples of codes meeting the bounds with equality.
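    The abstract does not reproduce the paper's two discrepancy functions, so the sketch below is only an illustration of the underlying idea: over the binary asymmetric channel, 1-to-0 and 0-to-1 flips occur with different probabilities, and a maximum-likelihood decoder must weight them differently. The crossover probabilities p10 and p01, the helper names, and the toy code are assumptions made for this example, not definitions from the paper.

    # Illustrative sketch only: counts 1->0 and 0->1 flips separately and
    # runs maximum-likelihood decoding over a binary asymmetric channel (BAC).
    from math import log

    def flip_counts(sent, received):
        """Number of positions flipped 1->0 and 0->1 between two binary tuples."""
        one_to_zero = sum(1 for s, r in zip(sent, received) if s == 1 and r == 0)
        zero_to_one = sum(1 for s, r in zip(sent, received) if s == 0 and r == 1)
        return one_to_zero, zero_to_one

    def log_likelihood(sent, received, p10, p01):
        """Log-probability of `received` given `sent` over a BAC with
        crossover probabilities p10 (1->0) and p01 (0->1)."""
        ones = sum(sent)
        zeros = len(sent) - ones
        a, b = flip_counts(sent, received)
        return (a * log(p10) + (ones - a) * log(1 - p10)
                + b * log(p01) + (zeros - b) * log(1 - p01))

    def ml_decode(code, received, p10=0.2, p01=0.05):
        """Return the codeword that maximizes the BAC likelihood."""
        return max(code, key=lambda c: log_likelihood(c, received, p10, p01))

    # The ordered flip counts depend on direction, unlike the Hamming distance.
    print(flip_counts((1, 1, 0, 0), (1, 0, 0, 0)))   # (1, 0)
    print(flip_counts((1, 0, 0, 0), (1, 1, 0, 0)))   # (0, 1)

    # Toy code of length 4; with p10 > p01 the decoder prefers an explanation
    # by a single 1->0 flip.
    toy_code = [(1, 1, 0, 0), (0, 0, 1, 1), (1, 1, 1, 1)]
    print(ml_decode(toy_code, (1, 0, 0, 0)))          # (1, 1, 0, 0)

    With p10 larger than p01, the decoder favors codewords that explain the received word through 1-to-0 flips, which is exactly the asymmetry a symmetric Hamming-distance decoder would miss.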

    Colorings of Hamming-Distance Graphs

    Hamming-distance graphs arise naturally in the study of error-correcting codes and have been utilized by several authors to provide new proofs for (and in some cases improve) known bounds on the size of block codes. We study various standard graph properties of the Hamming-distance graphs, with special emphasis placed on the chromatic number. A notion of robustness is defined for colorings of these graphs based on the tolerance of swapping colors along an edge without destroying the properness of the coloring, and a complete characterization of the maximally robust colorings is given for certain parameters. Additionally, we explore subgraph structures whose identification may be useful in determining the chromatic number.
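    As a rough companion sketch (again an assumption-laden illustration, not the paper's construction), the code below builds a small Hamming-distance graph, produces a greedy proper coloring, and tests the swap-based robustness notion described in the abstract: swap the two colors across an edge and check whether the coloring stays proper. The adjacency convention assumed here joins two binary words when their Hamming distance is at least d, so that a block code of minimum distance d appears as a clique; if the paper uses the complementary convention, only the adjacency test changes.

    # Illustrative sketch: Hamming-distance graph, greedy coloring, and a
    # swap-tolerance test for edges (one possible reading of "robustness").
    from itertools import product, combinations

    def hamming(x, y):
        """Hamming distance between two equal-length tuples."""
        return sum(a != b for a, b in zip(x, y))

    def hamming_distance_graph(n, d, q=2):
        """Vertices: all q-ary words of length n.  Assumed adjacency:
        two words are joined when their Hamming distance is at least d."""
        vertices = list(product(range(q), repeat=n))
        edges = [(u, v) for u, v in combinations(vertices, 2) if hamming(u, v) >= d]
        return vertices, edges

    def greedy_coloring(vertices, edges):
        """Greedy proper coloring; its color count upper-bounds the chromatic number."""
        neighbors = {v: set() for v in vertices}
        for u, v in edges:
            neighbors[u].add(v)
            neighbors[v].add(u)
        coloring = {}
        for v in vertices:
            used = {coloring[w] for w in neighbors[v] if w in coloring}
            coloring[v] = next(c for c in range(len(vertices)) if c not in used)
        return coloring

    def is_proper(coloring, edges):
        return all(coloring[u] != coloring[v] for u, v in edges)

    def swap_is_tolerated(coloring, edges, edge):
        """Swap the two endpoint colors of `edge` and test whether the
        coloring remains proper."""
        u, v = edge
        swapped = dict(coloring)
        swapped[u], swapped[v] = coloring[v], coloring[u]
        return is_proper(swapped, edges)

    vertices, edges = hamming_distance_graph(n=3, d=2)
    coloring = greedy_coloring(vertices, edges)
    print("colors used:", len(set(coloring.values())))
    print("swap-tolerant edges:", sum(swap_is_tolerated(coloring, edges, e) for e in edges))

    The greedy coloring only upper-bounds the chromatic number; it stands in for the exact colorings analyzed in the paper.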