3,776 research outputs found

    Relevance of Dynamic Clustering to Biological Networks

    Networks of nonlinear dynamical elements often show clustering of synchronization through chaotic instability. The relevance of this clustering to ecological, immune, neural, and cellular networks is discussed, with emphasis on partially ordered states with chaotic itinerancy. First, clustering with bit structures in a hypercubic lattice is studied. Spontaneous formation and destruction of relevant bits are found, which give self-organizing, chaotic genetic algorithms. When spontaneous changes of effective couplings are introduced, chaotic itinerancy of clusterings is widely seen through a feedback mechanism, which supports a dynamic stability allowing for complexity and diversity, known as homeochaos. Second, the synaptic dynamics of couplings is studied in relation to neural dynamics. The clustering structure is formed by a balance between external inputs and internal dynamics. Last, an extension allowing for growth of the number of elements is given, in connection with cell differentiation. An effective time-sharing system of resources is formed in partially ordered states.
    Comment: submitted to Physica D, no figures included
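
    The clustering phenomenon summarized above can be reproduced with a globally coupled map of Kaneko type, where each element iterates a chaotic local map and feels the mean field of all elements. The sketch below is an illustrative simulation only, not the paper's model or parameters: the nonlinearity a, coupling eps, system size, iteration count, and cluster tolerance are assumed values.

        import numpy as np

        def gcm_step(x, a=1.8, eps=0.2):
            """One step of a globally coupled logistic-map lattice (Kaneko-type model).

            x   : state vector of the N elements
            a   : logistic nonlinearity parameter (illustrative value)
            eps : global coupling strength (illustrative value)
            """
            fx = 1.0 - a * x**2                        # local chaotic map f(x) = 1 - a x^2
            return (1.0 - eps) * fx + eps * fx.mean()  # mean-field (global) coupling

        def count_clusters(x, tol=1e-6):
            """Count synchronization clusters: elements whose states coincide within tol."""
            xs = np.sort(x)
            return 1 + int(np.sum(np.diff(xs) > tol))

        rng = np.random.default_rng(0)
        x = rng.uniform(-1, 1, size=100)   # N = 100 elements, random initial states
        for _ in range(5000):              # iterate past the transient, then inspect clustering
            x = gcm_step(x)
        print("number of clusters:", count_clusters(x))

    Depending on (a, eps), such a lattice settles into a coherent phase, a fixed partition into clusters, or partially ordered states in which the cluster composition keeps changing, which is the regime the abstract refers to.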

    Learned Belief-Propagation Decoding with Simple Scaling and SNR Adaptation

    We consider the weighted belief-propagation (WBP) decoder recently proposed by Nachmani et al., where different weights are introduced for each Tanner graph edge and optimized using machine learning techniques. Our focus is on simple-scaling models that use the same weights across certain edges to reduce the storage and computational burden. The main contribution is to show that simple scaling with few parameters often achieves the same gain as the full parameterization. Moreover, several training improvements for WBP are proposed. For example, it is shown that minimizing average binary cross-entropy is suboptimal in general in terms of bit error rate (BER) and a new "soft-BER" loss is proposed which can lead to better performance. We also investigate parameter adapter networks (PANs) that learn the relation between the signal-to-noise ratio and the WBP parameters. As an example, for the (32,16) Reed-Muller code with a highly redundant parity-check matrix, training a PAN with soft-BER loss gives near-maximum-likelihood performance assuming simple scaling with only three parameters.
    Comment: 5 pages, 5 figures, submitted to ISIT 201
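
    The simple-scaling idea can be illustrated with a normalized min-sum style check-node update in which one shared weight alpha stands in for the per-edge weights of full WBP. The sketch below is a generic illustration under that assumption, not the authors' implementation; the function name, the value alpha = 0.8, and the example messages are hypothetical.

        import numpy as np

        def check_node_update(msgs, alpha=0.8):
            """Scaled ("simple-scaling") min-sum check-node update.

            msgs  : incoming variable-to-check LLR messages for one check node
            alpha : single scaling weight shared across all edges (illustrative value;
                    in a learned decoder such weights would be trained, e.g. predicted
                    from the SNR by a parameter adapter network)
            """
            msgs = np.asarray(msgs, dtype=float)
            out = np.empty(len(msgs))
            for i in range(len(msgs)):
                others = np.delete(msgs, i)                      # extrinsic rule: exclude edge i
                sign = np.prod(np.sign(others))
                out[i] = alpha * sign * np.min(np.abs(others))   # scaled min-sum approximation
            return out

        # Example: one check node connected to four variable nodes
        print(check_node_update([1.2, -0.4, 3.0, -2.1], alpha=0.8))

    Replacing thousands of per-edge weights by a handful of shared scalars like alpha is what keeps the storage and computational burden low while, per the abstract, retaining most of the gain of the fully parameterized decoder.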