177 research outputs found
A generative model for sparse, evolving digraphs
Generating graphs similar to real ones is an open problem, not least because the
notion of similarity itself is elusive and hard to formalize. In this paper, we
focus on sparse digraphs and propose SDG, an algorithm that aims to generate
graphs similar to a given real one. Since real graphs evolve, and this evolution
is important to study in order to understand the underlying dynamical system,
we also tackle the problem of generating series of graphs. We propose SEDGE, an
extension of SDG meant to generate series of graphs similar to a real series.
We consider graphs that represent software programs and show experimentally
that both algorithms outperform existing approaches.
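The abstract does not specify SDG or SEDGE themselves; as a hypothetical illustration of the problem setting only, the following sketch generates a series of sparse random digraphs in which a fixed fraction of edges is replaced at each step (all names and parameters here are assumptions, not the papers' method):

```python
import random

def sparse_digraph(n, m, rng):
    """Sample a simple digraph with n nodes and m directed edges."""
    edges = set()
    while len(edges) < m:
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            edges.add((u, v))
    return edges

def digraph_series(n, m, steps, churn, seed=0):
    """Evolve the edge set by replacing a fraction `churn` of edges per step,
    keeping the graph sparse (m edges) throughout the series."""
    rng = random.Random(seed)
    g = sparse_digraph(n, m, rng)
    series = [set(g)]
    for _ in range(steps):
        g = g - set(rng.sample(sorted(g), int(churn * m)))
        while len(g) < m:  # top the edge set back up to m edges
            u, v = rng.randrange(n), rng.randrange(n)
            if u != v:
                g.add((u, v))
        series.append(set(g))
    return series

series = digraph_series(n=50, m=100, steps=5, churn=0.1)
```

A generator aiming for similarity to a real series would replace the uniform edge sampling above with a model fitted to the observed graphs.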
Functional and spatial rewiring jointly generate convergent-divergent units in self-organizing networks
Self-organization through adaptive rewiring of random neural networks
generates brain-like topologies comprising modular small-world structures with
rich club effects, merely as the product of optimizing the network topology. In
the nervous system, spatial organization is optimized no less by rewiring,
through minimizing wiring distance and maximizing spatially aligned wiring
layouts. We show that such spatial organization principles interact
constructively with adaptive rewiring, helping to establish the networks'
connectedness and modular structures. We use an evolving neural network model
with weighted and directed connections, in which neural traffic flow is based
on consensus and advection dynamics, to show that wiring cost minimization
supports adaptive rewiring in creating convergent-divergent unit structures.
Convergent-divergent units consist of a convergent input-hub, connected to a
divergent output-hub via subnetworks of intermediate nodes, which may function
as the computational core of the unit. The prominence of minimizing wiring
distance in the dynamic evolution of the network determines the extent to which
the core is encapsulated from the rest of the network, i.e., the
context-sensitivity of its computations. This corresponds to the central role
convergent-divergent units play in establishing context-sensitivity in neuronal
information processing.
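The model's traffic flow is based on consensus and advection dynamics. The abstract does not give the exact dynamics or rewiring rule, but as a hedged sketch, discrete-time consensus on a weighted digraph can be written as Euler steps of x' = -Lx with L the graph Laplacian (all parameter choices below are assumptions for illustration):

```python
import numpy as np

def consensus_step(x, W, dt=0.1):
    """One Euler step of consensus dynamics x' = -L x, where L is the
    out-degree graph Laplacian of the weight matrix W."""
    L = np.diag(W.sum(axis=1)) - W
    return x - dt * L @ x

rng = np.random.default_rng(0)
n = 20
W = rng.random((n, n)) * (rng.random((n, n)) < 0.2)  # sparse weighted digraph
np.fill_diagonal(W, 0.0)  # no self-loops

x = rng.random(n)  # initial node activities
for _ in range(500):
    x = consensus_step(x, W)
```

On a digraph containing a rooted spanning tree, these dynamics drive node states toward agreement; adaptive rewiring schemes typically use such traffic to decide which connections to strengthen or prune.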
Local Algorithms for Finding Densely Connected Clusters
Local graph clustering is an important algorithmic technique for analysing
massive graphs, and has been widely applied in many research fields of data
science. While the objective of most (local) graph clustering algorithms is to
find a vertex set of low conductance, there has been a sequence of recent
studies that highlight the importance of the inter-connection between clusters
when analysing real-world datasets. Following this line of research, in this
work we study local algorithms for finding a pair of vertex sets defined with
respect to their inter-connection and their relationship with the rest of the
graph. The key to our analysis is a new reduction technique that relates the
structure of multiple sets to a single vertex set in the reduced graph. Among
many potential applications, we show that our algorithms successfully recover
densely connected clusters in the Interstate Disputes Dataset and the US
Migration Dataset.

Comment: This work was accepted at ICML'21 for a long presentation.
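Most local clustering algorithms, as noted above, seek a vertex set of low conductance. As a minimal illustration of that standard objective (not the paper's new reduction technique), conductance can be computed directly from an adjacency structure:

```python
def conductance(adj, S):
    """Conductance phi(S) = cut(S, V\\S) / min(vol(S), vol(V\\S)) for an
    undirected graph given as {node: set(neighbours)}."""
    S = set(S)
    cut = sum(1 for u in S for v in adj[u] if v not in S)
    vol_S = sum(len(adj[u]) for u in S)
    vol_rest = sum(len(adj[u]) for u in adj if u not in S)
    return cut / min(vol_S, vol_rest)

# Toy graph: two triangles joined by a single edge (2 -- 3).
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
phi = conductance(adj, {0, 1, 2})  # one triangle forms a good cluster
```

Here the cut between the two triangles is a single edge while each side has volume 7, so the triangle {0, 1, 2} has conductance 1/7; the paper's setting instead asks for a *pair* of such sets that are densely connected to each other.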
Controllability in complex brain networks
Complex functional brain networks are large networks of brain regions and functional brain connections. Statistical characterizations of these networks aim to quantify global and local properties of brain activity with a small number of network measures. Recently it has been proposed to characterize brain networks in terms of their "controllability", drawing on concepts and methods of control theory. This thesis reviews control theory for networks and its application in neuroscience. In particular, it highlights important limitations and some warnings and caveats in the brain controllability framework.
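The network controllability framework reviewed here rests on classical linear control theory: for dynamics x' = Ax + Bu, the pair (A, B) is controllable iff the Kalman controllability matrix has full rank. A minimal sketch of that standard test (the toy matrices are assumptions for illustration):

```python
import numpy as np

def controllability_matrix(A, B):
    """Kalman controllability matrix [B, AB, ..., A^{n-1}B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def is_controllable(A, B):
    """(A, B) is controllable iff the Kalman matrix has full rank n."""
    return np.linalg.matrix_rank(controllability_matrix(A, B)) == A.shape[0]

# Toy network: a directed chain 1 -> 2 -> 3, driven by input at node 1.
A = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
B_head = np.array([[1.0], [0.0], [0.0]])  # drive the head of the chain
B_tail = np.array([[0.0], [0.0], [1.0]])  # drive the tail instead
```

Driving the head of the chain controls the whole network, while driving the tail does not, since nothing propagates upstream. A well-known caveat of the kind the thesis discusses is that rank-based controllability can hold in theory while requiring impractically large control energy, which is why Gramian-based measures are often used instead.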