Locating influential nodes via dynamics-sensitive centrality
Locating influential nodes in complex networks is of great theoretical and
practical significance and remains a promising open issue. In this paper, we propose a
dynamics-sensitive (DS) centrality that integrates topological features and
dynamical properties. The DS centrality can be directly applied in locating
influential spreaders. According to the empirical results on four real networks
for both susceptible-infected-recovered (SIR) and susceptible-infected (SI)
spreading models, the DS centrality is much more accurate than degree,
k-shell index and eigenvector centrality.
Comment: 6 pages, 1 table and 2 figures
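The abstract does not state the DS formula. A common construction in this line of work scores node $i$ by its expected cumulative influence over the first $T$ steps of spreading, approximated by powers of $\beta A$ (where $A$ is the adjacency matrix and $\beta$ the spreading rate). A minimal sketch under that assumption; the paper's exact operator (e.g. recovery terms for SIR) may differ:

```python
import numpy as np

def ds_centrality(A, beta=0.1, T=3):
    """Dynamics-sensitive score: cumulative influence over T spreading steps.

    Assumed form: score = sum_{t=1..T} (beta * A)^t @ ones; a simplified
    SI-style approximation, not necessarily the paper's exact definition.
    """
    n = A.shape[0]
    M = beta * A
    P = np.eye(n)
    score = np.zeros(n)
    for _ in range(T):
        P = P @ M            # (beta * A)^t
        score += P @ np.ones(n)
    return score

# Star network: node 0 is the hub, nodes 1-3 are leaves.
A = np.zeros((4, 4))
for leaf in (1, 2, 3):
    A[0, leaf] = A[leaf, 0] = 1

scores = ds_centrality(A)
print(scores)  # the hub outranks the leaves
```

On this toy star graph the hub accumulates influence from every leaf at each step, so it scores highest, matching the intuition that DS centrality favors dynamically influential spreaders.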
Information filtering via biased heat conduction
Heat conduction process has recently found its application in personalized
recommendation [T. Zhou \emph{et al.}, PNAS 107, 4511 (2010)], which is of high
diversity but low accuracy. By decreasing the temperatures of small-degree
objects, we present an improved algorithm, called biased heat conduction (BHC),
which could simultaneously enhance the accuracy and diversity. Extensive
experimental analyses demonstrate that the accuracy on MovieLens, Netflix and
Delicious datasets could be improved by 43.5%, 55.4% and 19.2% compared with
the standard heat conduction algorithm, and the diversity is also increased or
approximately unchanged. Further statistical analyses suggest that the present
algorithm could simultaneously identify users' mainstream and special tastes,
resulting in better performance than the standard heat conduction algorithm.
This work provides a credible way for highly efficient information filtering.
Comment: 4 pages, 3 figures
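Standard heat conduction on a user-object bipartite network averages resource from objects to users and back. The abstract says BHC decreases the temperatures of small-degree objects; a minimal sketch, assuming the bias enters as a tunable exponent `lam` on the object-degree normalization (with `lam = 1` recovering plain heat conduction) -- the paper's exact parametrization may differ:

```python
import numpy as np

def biased_heat_conduction(A, user, lam=1.0):
    """Score objects for one user on a user-object bipartite network.

    A[i, a] = 1 if user i collected object a. Plain heat conduction: each
    user takes the average temperature of its objects, then each object the
    average temperature of its users. The assumed bias replaces the object
    normalization k_a by k_a**lam.
    """
    k_user = A.sum(axis=1)            # user degrees
    k_obj = A.sum(axis=0)             # object degrees
    f = A[user].astype(float)         # initial resource: the user's objects
    # Step 1: users absorb the average resource of their objects.
    h = (A * f).sum(axis=1) / np.maximum(k_user, 1)
    # Step 2: objects absorb from their users, with biased normalization.
    return (A.T @ h) / np.maximum(k_obj, 1) ** lam

# Two users, three objects; recommend for user 0.
A = np.array([[1, 1, 0],
              [1, 0, 1]])
s = biased_heat_conduction(A, user=0, lam=1.0)
print(s)
```

With `lam < 1` the penalty on high-degree objects weakens, shifting scores back toward popular (mainstream) items, which is one plausible reading of how the bias trades diversity against accuracy.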
Differentiable Programming Tensor Networks
Differentiable programming is a fresh programming paradigm which composes
parameterized algorithmic components and trains them using automatic
differentiation (AD). The concept emerged from deep learning but is not
limited to training neural networks. We present the theory and practice of
programming tensor network algorithms in a fully differentiable way. By
formulating the tensor network algorithm as a computation graph, one can
compute higher order derivatives of the program accurately and efficiently
using AD. We present essential techniques to differentiate through the tensor
network contractions, including stable AD for tensor decomposition and
efficient backpropagation through fixed point iterations. As a demonstration,
we compute the specific heat of the Ising model directly by taking the second
order derivative of the free energy obtained in the tensor renormalization
group calculation. Next, we perform gradient based variational optimization of
infinite projected entangled pair states for the quantum antiferromagnetic
Heisenberg model and obtain state-of-the-art variational energy and
magnetization with moderate efforts. Differentiable programming removes
laborious human efforts in deriving and implementing analytical gradients for
tensor network programs, which opens the door to more innovations in tensor
network algorithms and applications.
Comment: Typos corrected, discussion and refs added; revised version accepted
for publication in PRX. Source code available at
https://github.com/wangleiphy/tensorgra
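The key idea -- specific heat as a second derivative of the free energy with respect to inverse temperature -- can be illustrated on the exactly solvable 1D Ising chain, whose log partition function per site follows from the transfer-matrix eigenvalue. A sketch in which a central finite difference stands in for the AD machinery (the paper instead differentiates through a tensor renormalization group calculation of the 2D model):

```python
import math

def log_Z_per_site(beta, J=1.0):
    """1D Ising chain at zero field: log partition function per site,
    log(2*cosh(beta*J)), from the largest transfer-matrix eigenvalue."""
    return math.log(2.0 * math.cosh(beta * J))

def specific_heat(beta, J=1.0, eps=1e-4):
    """C = beta^2 * d^2(log Z)/d(beta)^2, computed here with a central
    finite difference standing in for automatic differentiation."""
    d2 = (log_Z_per_site(beta + eps, J)
          - 2.0 * log_Z_per_site(beta, J)
          + log_Z_per_site(beta - eps, J)) / eps**2
    return beta**2 * d2

# Exact result for comparison: C = (beta*J / cosh(beta*J))**2.
beta = 1.0
print(specific_heat(beta))  # ≈ 0.4200
```

With AD in place of the finite difference, the same second derivative is obtained exactly (to machine precision) and composes through arbitrarily deep contraction sequences, which is what makes the approach attractive for tensor network programs.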
Postsynaptic protein organization revealed by electron microscopy.
Neuronal synapses are key devices for transmitting and processing information in the nervous system. Synaptic plasticity, generally regarded as the cellular basis of learning and memory, involves changes of subcellular structures that take place at the nanoscale. High-resolution imaging methods, especially electron microscopy (EM), have allowed for quantitative analysis of such nanoscale structures in different types of synapses. In particular, the semi-ordered organization of neurotransmitter receptors and their interacting scaffolds in the postsynaptic density has been characterized for both excitatory and inhibitory synapses by studies using various EM techniques such as immuno-EM, electron tomography of high-pressure freezing and freeze-substituted samples, and cryo-electron tomography. These techniques, in combination with new correlative approaches, will further facilitate our understanding of the molecular organization underlying diverse functions of neuronal synapses.
Accumulation of Dense Core Vesicles in Hippocampal Synapses Following Chronic Inactivity.
The morphology and function of neuronal synapses are regulated by neural activity, as manifested in activity-dependent synapse maturation and various forms of synaptic plasticity. Here we employed cryo-electron tomography (cryo-ET) to visualize synaptic ultrastructure in cultured hippocampal neurons and investigated changes in subcellular features in response to chronic inactivity, a paradigm often used for the induction of homeostatic synaptic plasticity. We observed a more than 2-fold increase in the mean number of dense core vesicles (DCVs) in the presynaptic compartment of excitatory synapses and an almost 20-fold increase in the number of DCVs in the presynaptic compartment of inhibitory synapses after 2 days of treatment with the voltage-gated sodium channel blocker tetrodotoxin (TTX). Short-term treatment with TTX and the N-methyl-D-aspartate receptor (NMDAR) antagonist amino-5-phosphonovaleric acid (AP5) caused a 3-fold increase in the number of DCVs within 100 nm of the active zone area in excitatory synapses but had no significant effects on the overall number of DCVs. In contrast, there were very few DCVs in the postsynaptic compartments of both synapse types under all conditions. These results are consistent with a role for presynaptic DCVs in activity-dependent synapse maturation. We speculate that these accumulated DCVs can be released upon reactivation and may contribute to homeostatic metaplasticity.