
    Locating influential nodes via dynamics-sensitive centrality

    Locating influential nodes in complex networks is an issue of great theoretical and practical significance. In this paper, we propose a dynamics-sensitive (DS) centrality that integrates topological features and dynamical properties. The DS centrality can be directly applied to locating influential spreaders. According to the empirical results on four real networks for both susceptible-infected-recovered (SIR) and susceptible-infected (SI) spreading models, the DS centrality is much more accurate than degree, k-shell index and eigenvector centrality. Comment: 6 pages, 1 table and 2 figures
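The abstract does not give the formula, but a dynamics-sensitive score is typically built from powers of the adjacency matrix weighted by the spreading rate. A minimal sketch, assuming the score of node i is the i-th row sum of the accumulated matrix Σ_{τ=1}^{t} (βA)^τ (the exact DS definition may differ):

```python
import numpy as np

def ds_centrality(A, beta=0.1, t=3):
    """Dynamics-sensitive centrality sketch: row sums of
    sum_{tau=1}^{t} (beta*A)^tau.

    A    : symmetric adjacency matrix of an undirected network
    beta : spreading rate of the SIR/SI process (assumed parameter)
    t    : number of time steps to accumulate
    """
    n = A.shape[0]
    M = beta * A
    P = np.eye(n)            # accumulates successive powers (beta*A)^tau
    score = np.zeros(n)
    for _ in range(t):
        P = P @ M
        score += P.sum(axis=1)
    return score

# Star network: node 0 is the hub, nodes 1-4 are leaves.
A = np.zeros((5, 5))
A[0, 1:] = A[1:, 0] = 1
scores = ds_centrality(A)
print(scores.argmax())   # the hub gets the highest DS score -> 0
```

With this construction the hub of a star dominates the ranking, as any spreading-based centrality should predict.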

    Information filtering via biased heat conduction

    The heat conduction process has recently found application in personalized recommendation [T. Zhou \emph{et al.}, PNAS 107, 4511 (2010)], which offers high diversity but low accuracy. By decreasing the temperatures of small-degree objects, we present an improved algorithm, called biased heat conduction (BHC), which can simultaneously enhance accuracy and diversity. Extensive experimental analyses demonstrate that the accuracy on the MovieLens, Netflix and Delicious datasets can be improved by 43.5%, 55.4% and 19.2% compared with the standard heat conduction algorithm, while the diversity is also increased or approximately unchanged. Further statistical analyses suggest that the present algorithm can simultaneously identify users' mainstream and special tastes, resulting in better performance than the standard heat conduction algorithm. This work provides a credible way for highly efficient information filtering. Comment: 4 pages, 3 figures
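The abstract describes the standard two-step heat-conduction diffusion on a user-object bipartite network, with a bias that cools small-degree objects. A minimal sketch, assuming the bias enters as a hypothetical exponent theta on the target object's degree (theta = 1 recovers standard heat conduction; the exact BHC form is not given in the abstract):

```python
import numpy as np

def biased_heat_conduction(A, f0, theta=1.0):
    """Two-step heat-conduction diffusion on a user-object bipartite network.

    A     : binary user-object adjacency matrix, shape (n_users, n_objects)
    f0    : initial resource (temperature) on objects, e.g. the target
            user's collected items
    theta : hypothetical bias exponent on object degree; theta = 1.0
            recovers standard heat conduction (this parameterization is
            an assumption, not taken from the abstract)
    """
    k_user = A.sum(axis=1)            # user degrees
    k_obj = A.sum(axis=0)             # object degrees
    h = (A @ f0) / k_user             # step 1: objects -> users (averaging)
    f1 = (A.T @ h) / k_obj ** theta   # step 2: users -> objects, degree-biased
    return f1

# Toy network: 3 users x 4 objects; compute scores for user 0.
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)
f0 = A[0]                             # user 0's collected objects seed the heat
scores = biased_heat_conduction(A, f0, theta=1.2)
```

Raising theta above 1 further penalizes large-degree (popular) objects relative to standard heat conduction, which is how the bias can trade popularity for niche items.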

    Differentiable Programming Tensor Networks

    Differentiable programming is a fresh programming paradigm which composes parameterized algorithmic components and trains them using automatic differentiation (AD). The concept emerged from deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating a tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using AD. We present essential techniques for differentiating through tensor network contractions, including stable AD for tensor decomposition and efficient backpropagation through fixed-point iterations. As a demonstration, we compute the specific heat of the Ising model directly by taking the second-order derivative of the free energy obtained in the tensor renormalization group calculation. Next, we perform gradient-based variational optimization of infinite projected entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain state-of-the-art variational energy and magnetization with moderate effort. Differentiable programming removes laborious human effort in deriving and implementing analytical gradients for tensor network programs, which opens the door to more innovations in tensor network algorithms and applications. Comment: Typos corrected, discussion and refs added; revised version accepted for publication in PRX. Source code available at https://github.com/wangleiphy/tensorgra
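The specific-heat demonstration rests on one identity: with k_B = 1, C = β² ∂²lnZ/∂β². The paper obtains lnZ from a tensor renormalization group contraction and differentiates it with a full AD framework; as a self-contained illustration of the same idea, the sketch below computes the second derivative of lnZ for a toy two-spin Ising system by exact enumeration, using hand-rolled second-order forward-mode (hyperdual-style) numbers instead of the authors' tensor-network code:

```python
import math

class Dual2:
    """Second-order forward-mode number: tracks (f, f', f'')."""
    def __init__(self, f, d=0.0, dd=0.0):
        self.f, self.d, self.dd = f, d, dd
    def __add__(self, other):
        return Dual2(self.f + other.f, self.d + other.d, self.dd + other.dd)
    def scale(self, c):                  # multiply by a plain constant
        return Dual2(c * self.f, c * self.d, c * self.dd)

def exp(x):
    e = math.exp(x.f)
    return Dual2(e, e * x.d, e * (x.dd + x.d * x.d))

def log(x):
    return Dual2(math.log(x.f), x.d / x.f, x.dd / x.f - (x.d / x.f) ** 2)

def log_Z(beta, J=1.0):
    """ln Z for two Ising spins with energy E = -J*s1*s2, exact enumeration."""
    Z = Dual2(0.0)
    for s1 in (-1, 1):
        for s2 in (-1, 1):
            Z = Z + exp(beta.scale(J * s1 * s2))   # Boltzmann weight e^{-beta*E}
    return log(Z)

beta = 0.5
lnZ = log_Z(Dual2(beta, d=1.0))   # seed d=1 to differentiate w.r.t. beta
C = beta ** 2 * lnZ.dd            # specific heat from the second derivative
```

For this toy system lnZ = ln(4 cosh(βJ)), so the AD result can be checked against the analytic value β²J²/cosh²(βJ); the same β²∂²lnZ/∂β² recipe carries over unchanged when lnZ comes from a differentiable tensor network contraction.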

    Accumulation of Dense Core Vesicles in Hippocampal Synapses Following Chronic Inactivity.

    The morphology and function of neuronal synapses are regulated by neural activity, as manifested in activity-dependent synapse maturation and various forms of synaptic plasticity. Here we employed cryo-electron tomography (cryo-ET) to visualize synaptic ultrastructure in cultured hippocampal neurons and investigated changes in subcellular features in response to chronic inactivity, a paradigm often used for the induction of homeostatic synaptic plasticity. We observed a more than 2-fold increase in the mean number of dense core vesicles (DCVs) in the presynaptic compartment of excitatory synapses and an almost 20-fold increase in the number of DCVs in the presynaptic compartment of inhibitory synapses after 2 days of treatment with the voltage-gated sodium channel blocker tetrodotoxin (TTX). Short-term treatment with TTX and the N-methyl-D-aspartate receptor (NMDAR) antagonist amino-5-phosphonovaleric acid (AP5) caused a 3-fold increase in the number of DCVs within 100 nm of the active zone area in excitatory synapses but had no significant effect on the overall number of DCVs. In contrast, there were very few DCVs in the postsynaptic compartments of both synapse types under all conditions. These results are consistent with a role for presynaptic DCVs in activity-dependent synapse maturation. We speculate that these accumulated DCVs can be released upon reactivation and may contribute to homeostatic metaplasticity.