1,050 research outputs found

    Eigenvalues, eigenspaces and distances to subsets

    In this note we show how to improve and generalize some calculations of diameters and distances in sufficiently symmetrical graphs, by taking all the eigenvalues of the adjacency matrix of the graph into account. We present some applications of these results to the problem of finding tight upper bounds on the covering radius of error-correcting codes, when the weight distribution of the code (or the dual code) is known.
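    As a hedged illustration of spectral distance bounds of this kind (not the note's own results), the following numpy sketch evaluates two classical facts on the Petersen graph: the diameter of a connected graph is at most the number of distinct adjacency eigenvalues minus one, and a Chung-type bound for k-regular graphs based on the second-largest eigenvalue modulus.

```python
# Hedged illustration only: classical spectral bounds on the diameter,
# evaluated on the Petersen graph with plain numpy.
import itertools
import numpy as np

def petersen_adjacency() -> np.ndarray:
    """Adjacency matrix of the Petersen graph, built as the Kneser graph
    K(5,2): vertices are 2-subsets of {0..4}, adjacent iff disjoint."""
    verts = list(itertools.combinations(range(5), 2))
    n = len(verts)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if not set(verts[i]) & set(verts[j]):
                A[i, j] = 1
    return A

A = petersen_adjacency()
n = A.shape[0]
k = int(A.sum(axis=1)[0])                    # 3-regular
eig = np.linalg.eigvalsh(A)
distinct = np.unique(np.round(eig, 8))

# Classical fact: for a connected graph, diam(G) <= #distinct eigenvalues - 1.
print("distinct eigenvalues:", distinct)      # [-2, 1, 3]
print("diameter bound:", len(distinct) - 1)   # 2, the true diameter

# Chung-type bound for k-regular, non-complete graphs:
# diam(G) <= ceil(log(n-1) / log(k / lam)), lam = max |eigenvalue| below k.
lam = max(abs(e) for e in eig if not np.isclose(e, k))
print("Chung-type bound:", int(np.ceil(np.log(n - 1) / np.log(k / lam))))
```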

    Convex Graph Invariant Relaxations For Graph Edit Distance

    The edit distance between two graphs is a widely used measure of similarity that evaluates the smallest number of vertex and edge deletions/insertions required to transform one graph into another. It is NP-hard to compute in general, and a large number of heuristics have been proposed for approximating this quantity. With few exceptions, these methods generally provide upper bounds on the edit distance between two graphs. In this paper, we propose a new family of computationally tractable convex relaxations for obtaining lower bounds on graph edit distance. These relaxations can be tailored to the structural properties of the particular graphs via convex graph invariants. Specific examples that we highlight in this paper include constraints on the graph spectrum as well as (tractable approximations of) the stability number and the maximum-cut values of graphs. We prove under suitable conditions that our relaxations are tight (i.e., they exactly compute the graph edit distance) when one of the graphs has few distinct eigenvalues. We also validate the utility of our framework on synthetic problems as well as real applications involving molecular structure comparison problems in chemistry. Comment: 27 pages, 7 figures.
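    A hedged sketch of one simple convex lower bound in this spirit (not the authors' invariant-based relaxations): relax the unknown permutation matrix to the Birkhoff polytope of doubly stochastic matrices. It assumes cvxpy, graphs on the same vertex set, and edge insertions/deletions only.

```python
# Hedged sketch: a doubly stochastic relaxation that lower-bounds the
# edge-edit distance between two equal-size graphs. Not the paper's method.
import numpy as np
import cvxpy as cp

def edit_distance_lower_bound(A1: np.ndarray, A2: np.ndarray) -> float:
    """Lower bound on the number of edge edits needed to turn the graph with
    adjacency matrix A1 into the graph with adjacency matrix A2."""
    n = A1.shape[0]
    P = cp.Variable((n, n), nonneg=True)         # relaxed permutation matrix
    constraints = [cp.sum(P, axis=0) == 1,       # doubly stochastic
                   cp.sum(P, axis=1) == 1]
    # For a true permutation P, sum|A1 P - P A2| = sum|A1 - P A2 P^T|, which
    # counts each edge edit twice; relaxing P can only decrease the minimum,
    # so half the relaxed optimum is a valid lower bound.
    objective = cp.Minimize(cp.sum(cp.abs(A1 @ P - P @ A2)))
    cp.Problem(objective, constraints).solve()
    return 0.5 * objective.value

# Example: a 4-cycle versus a path on 4 vertices (true edit distance 1).
C4 = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]])
P4 = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
print(edit_distance_lower_bound(C4, P4))         # about 1.0
```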

    Max-plus definite matrix closures and their eigenspaces

    In this paper we introduce the definite closure operation for max-plus matrices with finite permanent, reveal inner structures of definite eigenspaces, and establish some facts about Hilbert distances between these inner structures and the boundary of the definite eigenspace. Comment: 20 pages, 6 figures; v2: minor changes in figures and in the main text.
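    For context, a minimal sketch of the max-plus Kleene star A* = I ⊕ A ⊕ A⊗A ⊕ …, the closure construction underlying results of this kind; it is not the paper's definite closure itself. The star converges when no cycle has positive weight, and its columns indexed by critical nodes generate the max-plus eigenspace for eigenvalue 0.

```python
# Hedged sketch: Kleene star over the max-plus semiring (max, +), computed
# Floyd-Warshall style. Assumes all cycle weights are <= 0.
import numpy as np

NEG_INF = -np.inf   # additive identity ("epsilon") of the max-plus semiring

def maxplus_star(A: np.ndarray) -> np.ndarray:
    """Return A* = I ⊕ A ⊕ A⊗A ⊕ ... for a square max-plus matrix A."""
    n = A.shape[0]
    S = A.astype(float).copy()
    for k in range(n):                         # Floyd-Warshall over (max, +)
        S = np.maximum(S, S[:, [k]] + S[[k], :])
    np.fill_diagonal(S, np.maximum(np.diag(S), 0.0))   # add the identity I
    return S

# Example: zero diagonal and non-positive cycle weights (a "definite-like"
# matrix in the hedged sense of this sketch).
A = np.array([[ 0.0, -1.0, -3.0],
              [-2.0,  0.0, -1.0],
              [-1.0, -4.0,  0.0]])
print(maxplus_star(A))
```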

    How spiking neurons give rise to a temporal-feature map

    A temporal-feature map is a topographic neuronal representation of temporal attributes of phenomena or objects that occur in the outside world. We explain the evolution of such maps by means of a spike-based Hebbian learning rule combined with a presynaptically unspecific contribution: if a synapse changes, then all other synapses connected to the same axon change by a small fraction as well. The learning equation is solved for the case of an array of Poisson neurons. We discuss the evolution of a temporal-feature map and the synchronization of the single cells’ synaptic structures, as a function of the strength of presynaptic unspecific learning. We also give an upper bound for the magnitude of the presynaptic interaction by estimating its impact on the noise level of synaptic growth. Finally, we compare the results with those obtained from a learning equation for nonlinear neurons and show that synaptic structure formation may profit from the nonlinearity.
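    A toy sketch (an assumption-laden stand-in, not the authors' learning equation): a spike-based Hebbian rule in which a fraction mu of each weight change also spreads to the other synapses made by the same presynaptic axon. All parameter values are illustrative.

```python
# Toy sketch only: Hebbian updates from Poisson input spikes, plus a
# presynaptically unspecific spread of each axon's weight change.
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 20, 5            # Poisson input axons, postsynaptic cells
rate, dt, T = 20.0, 1e-3, 50.0   # input rate (Hz), time step (s), duration (s)
eta, mu = 1e-3, 0.05             # learning rate, unspecific fraction
W = rng.uniform(0.0, 0.1, size=(n_post, n_pre))   # synaptic weights

for step in range(int(T / dt)):
    pre = (rng.random(n_pre) < rate * dt).astype(float)   # Poisson spikes
    post = (W @ pre > 0.5).astype(float)                   # crude threshold cells
    dW = eta * np.outer(post, pre)                          # Hebbian coincidence term
    # Unspecific term: each axon's summed change is shared (scaled by mu)
    # among all synapses that axon makes onto the postsynaptic array.
    dW += mu * dW.sum(axis=0, keepdims=True) / n_post
    W = np.clip(W + dW, 0.0, 1.0)                           # keep weights bounded

print(W.round(3))
```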

    Embeddability and rate identifiability of Kimura 2-parameter matrices

    Deciding whether a Markov matrix is embeddable (i.e. can be written as the exponential of a rate matrix) is an open problem even for 4×4 matrices. We study the embedding problem and rate identifiability for the K80 model of nucleotide substitution. For these 4×4 matrices, we fully characterize the set of embeddable K80 Markov matrices and the set of embeddable matrices for which rates are identifiable. In particular, we describe an open subset of embeddable matrices with non-identifiable rates. This set contains matrices with positive eigenvalues and also diagonal largest in column matrices, which might lead to consequences in parameter estimation in phylogenetics. Finally, we compute the relative volumes of embeddable K80 matrices and of embeddable matrices with identifiable rates. This study completes the work on the embedding problem for the more general model K81 and its submodels, which was initiated by the last two authors in a separate work. Comment: 20 pages, 10 figures.
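    A hedged sketch of the basic test behind such results, not the paper's full characterization: build a K80 Markov matrix as exp(Qt) and check whether its principal matrix logarithm is a rate matrix (nonnegative off-diagonal entries, zero row sums). Full embeddability also requires examining non-principal branches of the logarithm; the nucleotide ordering A, G, C, T and the (alpha, beta) parameterization are assumptions of this sketch.

```python
# Hedged sketch: principal-logarithm check for K80 Markov matrices.
import numpy as np
from scipy.linalg import expm, logm

def k80_markov(alpha: float, beta: float, t: float = 1.0) -> np.ndarray:
    """K80 transition matrix exp(Qt), transition rate alpha, transversion
    rate beta, nucleotide order A, G, C, T (assumed here)."""
    d = -(alpha + 2 * beta)
    Q = np.array([[d,     alpha, beta,  beta],
                  [alpha, d,     beta,  beta],
                  [beta,  beta,  d,     alpha],
                  [beta,  beta,  alpha, d]])
    return expm(Q * t)

def principal_log_is_rate_matrix(M: np.ndarray, tol: float = 1e-10) -> bool:
    """True if the principal logarithm of M is a real rate matrix."""
    L = logm(M)
    if np.abs(np.imag(L)).max() > tol:     # principal log must be real
        return False
    L = np.real(L)
    off = L - np.diag(np.diag(L))          # off-diagonal part
    return off.min() >= -tol and np.abs(L.sum(axis=1)).max() <= tol

M = k80_markov(alpha=0.3, beta=0.1)
print(principal_log_is_rate_matrix(M))     # True: M was built as exp(Q)
```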