
    Sign problem in the Bethe approximation

    We propose a message-passing algorithm to compute the Hamiltonian expectation with respect to an appropriate class of trial wave functions for an interacting system of fermions. To this end, we connect the quantum expectations to average quantities in a classical system with both local and global interactions, which are related to the variational parameters, and we use the Bethe approximation to estimate the average energy within the replica-symmetric approximation. The global interactions, which are needed to obtain a good estimate of the average fermion sign, make the average energy a nonlocal function of the variational parameters. We use heuristic minimization algorithms to find approximate ground states of the Hubbard model on random regular graphs and observe significant qualitative improvements over the mean-field approximation. Comment: 19 pages, 9 figures, one figure added.

    A rigorous analysis of the cavity equations for the minimum spanning tree

    We analyze a new general representation of the Minimum Weight Steiner Tree (MST) problem which translates the topological connectivity constraint into a set of local conditions that can be analyzed by the so-called cavity equations technique. For the limiting case of the spanning tree we prove that the fixed point of the algorithm arising from the cavity equations leads to the global optimum. Comment: 5 pages, 1 figure.
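For the spanning-tree limit, the global optimum that the cavity fixed point is proved to reach can also be computed by the classical greedy method; a minimal Kruskal sketch (the edge list below is illustrative, not from the paper):

```python
def kruskal_mst(n, edges):
    """Minimum spanning tree by Kruskal's greedy rule.

    n: number of nodes; edges: list of (weight, u, v).
    Returns (total_weight, chosen_edges).
    """
    parent = list(range(n))

    def find(x):                      # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0.0, []
    for w, u, v in sorted(edges):     # lightest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                  # keep edge only if it joins two components
            parent[ru] = rv
            total += w
            chosen.append((u, v))
    return total, chosen
```

On a connected graph with n nodes the greedy rule always returns n−1 edges, and their total weight is the global optimum the paper's cavity analysis certifies.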

    Inference and learning in sparse systems with multiple states

    We discuss how inference can be performed when data are sampled from the non-ergodic phase of systems with multiple attractors. We take as a model system the finite-connectivity Hopfield model in the memory phase and suggest a cavity method approach to reconstruct the couplings when the data are sampled separately from a few attractor states. We also show how the inference results can be converted into a learning protocol for neural networks in which patterns are presented through weak external fields. The protocol is simple and fully local, and is able to store patterns with a finite overlap with the input patterns without ever reaching a spin glass phase where all memories are lost. Comment: 15 pages, 10 figures, to be published in Phys. Rev.

    Ferromagnetic ordering in graphs with arbitrary degree distribution

    We present a detailed study of the phase diagram of the Ising model on random graphs with arbitrary degree distribution. By using the replica method we compute exactly the value of the critical temperature and the associated critical exponents as a function of the minimum and maximum degree and of the degree distribution characterizing the graph. As expected, there is a ferromagnetic transition provided the second moment of the degree distribution is finite, ⟨k²⟩ < ∞. However, if the fourth moment of the degree distribution is not finite, non-trivial scaling exponents are obtained. These results are analyzed for the particular case of power-law distributed random graphs. Comment: 9 pages, 1 figure.
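On a locally tree-like random graph the paramagnetic solution becomes unstable when B·tanh(J/T) = 1, where B = ⟨k²⟩/⟨k⟩ − 1 is the mean branching ratio; this standard cavity/tree result is consistent with the replica analysis above. A small numeric sketch (the degree distributions are illustrative):

```python
import math

def critical_temperature(degree_probs, J=1.0):
    """T_c from B * tanh(J / T_c) = 1, with branching ratio B = <k^2>/<k> - 1.

    degree_probs: dict {degree: probability}. Returns None when B <= 1,
    i.e. the graph is too sparse to sustain ferromagnetic order.
    """
    k1 = sum(k * p for k, p in degree_probs.items())
    k2 = sum(k * k * p for k, p in degree_probs.items())
    B = k2 / k1 - 1.0                  # mean excess degree (branching ratio)
    if B <= 1.0:
        return None
    return J / math.atanh(1.0 / B)
```

For a random 3-regular graph B = 2, giving T_c = J/atanh(1/2) ≈ 1.82 J, the familiar Bethe-lattice value; a 2-regular graph (a ring) has B = 1 and no finite-temperature transition.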

    Encoding for the Blackwell Channel with Reinforced Belief Propagation

    A key idea in coding for the broadcast channel (BC) is binning, in which the transmitter encodes information by selecting a codeword from an appropriate bin (the messages are thus the bin indices). This selection is normally done by solving an appropriate (possibly difficult) combinatorial problem. Recently it has been shown that binning for the Blackwell channel -- a particular BC -- can be done by iterative schemes based on Survey Propagation (SP). This method relies on decimation for SP and has complexity O(n^2). In this paper we propose a new variant of the Belief Propagation (BP) algorithm, named the Reinforced BP algorithm, that turns BP into a solver. Our simulations show that this new algorithm has complexity O(n log n). Using this new algorithm together with a non-linear coding scheme, we can efficiently achieve rates close to the boundary of the capacity region of the Blackwell channel. Comment: 5 pages, 8 figures, submitted to ISIT 200
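The reinforcement idea — feeding each variable's current local field back as a growing external field, so that BP's marginals polarize onto a single configuration instead of being decimated — can be illustrated on a toy ferromagnetic Ising triangle. The graph, the annealing schedule, and all parameters below are illustrative; the paper applies the scheme to the Blackwell channel, not to this model:

```python
import math

def reinforced_bp_triangle(beta=1.0, J=1.0, iters=100, anneal=0.9):
    """Reinforced BP on a 3-spin ferromagnetic triangle.

    Cavity messages are fields h[(i, j)]; the reinforcement field rho[i]
    is a fraction gamma_t of the full local field H[i], with gamma_t -> 1
    so the self-feedback eventually dominates and freezes a configuration.
    Returns the sign of each spin's final local field.
    """
    nbrs = {0: (1, 2), 1: (0, 2), 2: (0, 1)}

    def u(h):  # cavity field transmitted through one ferromagnetic edge
        return math.atanh(math.tanh(beta * J) * math.tanh(beta * h)) / beta

    init = {0: 0.10, 1: -0.05, 2: 0.02}        # small symmetry-breaking start
    msg = {(i, j): init[i] for i in nbrs for j in nbrs[i]}
    rho = {i: 0.0 for i in nbrs}
    H = dict(rho)
    for t in range(1, iters + 1):
        gamma = 1.0 - anneal ** t              # reinforcement ramps up to 1
        H = {i: rho[i] + sum(u(msg[(k, i)]) for k in nbrs[i]) for i in nbrs}
        rho = {i: gamma * H[i] for i in nbrs}
        msg = {(i, j): rho[i] + sum(u(msg[(k, i)]) for k in nbrs[i] if k != j)
               for i in nbrs for j in nbrs[i]}
    return [1 if H[i] > 0 else -1 for i in nbrs]
```

With ferromagnetic couplings the reinforced fields align all three spins onto one of the two ground states, without any decimation step — the mechanism behind the O(n log n) behavior reported above.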

    Clustering with shallow trees

    We propose a new method for hierarchical clustering based on the optimisation of a cost function over trees of limited depth, and we derive a message-passing method that solves it efficiently. The method and algorithm can be interpreted as a natural interpolation between two well-known approaches, namely single linkage and the recently presented Affinity Propagation. We analyze with this general scheme three biological/medical structured datasets (human populations based on genetic information, proteins based on sequences, and verbal autopsies) and show that the interpolation technique provides new insight. Comment: 11 pages, 7 figures.
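Single linkage, one endpoint of the interpolation above, repeatedly merges the two clusters containing the closest pair of points — equivalently, Kruskal's MST algorithm stopped when k components remain. A minimal sketch (data, metric, and k are illustrative):

```python
def single_linkage(points, k, dist):
    """Single-linkage clustering: merge the closest clusters until k remain.

    points: list of data items; k: desired number of clusters;
    dist: pairwise distance function. Returns clusters as sets of indices.
    """
    n = len(points)
    parent = list(range(n))

    def find(x):                       # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    pairs = sorted((dist(points[i], points[j]), i, j)
                   for i in range(n) for j in range(i + 1, n))
    clusters = n
    for _, i, j in pairs:              # closest pairs first
        if clusters == k:
            break
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            clusters -= 1
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), set()).add(i)
    return list(groups.values())
```

On two well-separated 1-D blobs the nearest-pair merges recover the blobs exactly; the depth-limited trees of the paper interpolate between this behavior and Affinity Propagation's star-shaped (depth-one) clusters.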

    Quantum Dynamics of Coupled Bosonic Wells within the Bose-Hubbard Picture

    We relate the quantum dynamics of the Bose-Hubbard model (BHM) to the semiclassical nonlinear equations that describe an array of interacting Bose condensates by implementing a standard variational procedure based on the coherent-state method. We investigate the dynamics of the two-site BHM from a purely quantum viewpoint by first recasting the model in a spin picture and then using the related dynamical algebra. The latter allows us to study thoroughly the structure of the energy spectrum and to give a quantum interpretation of the classical symmetries of the two-site dynamics. The energy spectrum is also evaluated through various approximations relying on the coherent-state approach. Comment: 22 pages, 7 figures.
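A direct way to inspect the spectrum discussed here is exact diagonalization of the two-site BHM in the fixed-number basis |n, N−n⟩, where H = −J(a₁†a₂ + a₂†a₁) + (U/2)Σᵢ nᵢ(nᵢ−1). The sketch below builds the (N+1)×(N+1) matrix and extracts the ground-state energy by power iteration on a shifted matrix (pure stdlib; parameter values are illustrative):

```python
import math

def two_site_bhm(N, J, U):
    """Two-site Bose-Hubbard Hamiltonian with N bosons, basis |n, N-n>."""
    dim = N + 1
    H = [[0.0] * dim for _ in range(dim)]
    for n in range(dim):
        m = N - n
        H[n][n] = 0.5 * U * (n * (n - 1) + m * (m - 1))   # on-site repulsion
        if n < N:
            # <n+1, m-1| a1^dag a2 |n, m> = sqrt((n+1) * m)
            t = -J * math.sqrt((n + 1) * m)
            H[n][n + 1] = H[n + 1][n] = t                 # hopping
    return H

def ground_energy(H, iters=5000):
    """Smallest eigenvalue via power iteration on c*I - H (c: Gershgorin bound)."""
    dim = len(H)
    c = max(sum(abs(x) for x in row) for row in H)        # upper spectral bound
    v = [1.0] * dim
    for _ in range(iters):
        w = [c * v[i] - sum(H[i][j] * v[j] for j in range(dim))
             for i in range(dim)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    lam = sum(v[i] * (c * v[i] - sum(H[i][j] * v[j] for j in range(dim)))
              for i in range(dim))                        # Rayleigh quotient
    return c - lam
```

For N = 2 the symmetric sector gives the closed-form ground energy (U − √(U² + 16J²))/2, which reduces to −2J at U = 0 — a convenient check on the matrix construction.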

    Large deviations of cascade processes on graphs

    Simple models of irreversible dynamical processes such as bootstrap percolation have been successfully applied to describe cascade processes in a large variety of contexts. However, the problem of analyzing non-typical trajectories, which can be crucial for understanding out-of-equilibrium phenomena, is still considered intractable in most cases. Here we introduce an efficient method to find and analyze optimized trajectories of cascade processes. We show that for a wide class of irreversible dynamical rules, this problem can be solved efficiently on large-scale systems.
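The typical dynamics whose rare trajectories are analyzed above is simple to simulate: in bootstrap percolation a node activates, irreversibly, once at least m of its neighbors are active. A minimal sketch (the graph and threshold are illustrative):

```python
def bootstrap_percolation(neighbors, seeds, m):
    """Run bootstrap percolation to its fixed point.

    neighbors: dict node -> iterable of neighbors; seeds: initially active
    nodes; m: activation threshold. Returns the final active set.
    """
    active = set(seeds)
    changed = True
    while changed:                     # sweep until no node can activate
        changed = False
        for node, nbrs in neighbors.items():
            if node not in active and sum(k in active for k in nbrs) >= m:
                active.add(node)       # activation is irreversible
                changed = True
    return active
```

On the small graph in the test below, the seed pair {0, 1} triggers a cascade that activates every node at threshold m = 2, while a single seed activates nothing further — the sharp dependence on the initial condition that makes optimized seed trajectories interesting.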

    Input-driven unsupervised learning in recurrent neural networks

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is an attractor neural network with Hebbian learning (e.g. the Hopfield model). The model's simplicity and the locality of the synaptic update rules come at the cost of a limited storage capacity, compared with the capacity achieved with supervised learning algorithms, whose biological plausibility is questionable. Here, we present an on-line learning rule for a recurrent neural network that achieves near-optimal performance without an explicit supervisory error signal and using only locally accessible information, and which is therefore biologically plausible. The fully connected network consists of excitatory units with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the patterns to be memorized are presented on-line as strong afferent currents, producing a bimodal distribution of the neurons' synaptic inputs ('local fields'). Synapses corresponding to active inputs are modified as a function of the position of the local field with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. An additional parameter of the model allows one to trade storage capacity for robustness, i.e. an increased size of the basins of attraction. We simulated a network of 1001 excitatory neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction: our results show that, for any given basin size, our network more than doubles the storage capacity compared with a standard Hopfield network. Our learning rule is consistent with available experimental data documenting how plasticity depends on firing rate. It predicts that at high enough firing rates, no potentiation should occur.
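The three-threshold rule described above maps directly onto a small function; a sketch in which the threshold values and step size Δw are illustrative placeholders:

```python
def synaptic_update(local_field, input_active,
                    theta_low, theta_mid, theta_high, dw):
    """Plasticity rule with three thresholds on the postsynaptic local field.

    Only synapses from active inputs change. No plasticity occurs above the
    highest or below the lowest threshold; in between, the synapse is
    potentiated (depressed) when the field is above (below) theta_mid.
    Returns the weight change.
    """
    if not input_active:
        return 0.0
    if local_field > theta_high or local_field < theta_low:
        return 0.0
    return dw if local_field > theta_mid else -dw
```

A field just above the intermediate threshold potentiates, while a field far above the highest threshold changes nothing — the rule's prediction that sufficiently high firing rates produce no potentiation.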