Recent Advances in Graph Partitioning
We survey recent trends in practical algorithms for balanced graph
partitioning, together with applications and future research directions.
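The survey concerns balanced partitioning, i.e. splitting a graph into equal-size parts while minimizing the edge cut. As a minimal illustration (a toy Kernighan-Lin-style swap pass on a hypothetical six-node graph, not any specific algorithm from the survey):

```python
# Toy balanced bisection: repeatedly apply the best single node swap
# that reduces the edge cut, keeping the two sides the same size.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]

def cut(side):
    # Number of edges crossing between `side` and its complement.
    return sum(1 for u, v in edges if (u in side) != (v in side))

a, b = {0, 1, 3}, {2, 4, 5}  # deliberately bad balanced start
improved = True
while improved:
    improved = False
    base, best = cut(a), None
    for u in list(a):
        for v in list(b):
            gain = base - cut((a - {u}) | {v})
            if gain > 0 and (best is None or gain > best[0]):
                best = (gain, u, v)
    if best:
        _, u, v = best
        a, b = (a - {u}) | {v}, (b - {v}) | {u}
        improved = True

print(sorted(a), sorted(b), cut(a))
```

Here the two triangles {0,1,2} and {3,4,5} are joined by the single edge (2,3), so the optimal balanced cut has size 1; practical partitioners combine such local refinement with multilevel coarsening.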
Efficient algorithms for analyzing large scale network dynamics: Centrality, community and predictability
Large-scale networks are an indispensable part of our daily life; be it biological networks, smart grids, academic collaboration networks, social networks, vehicular networks, or the networks underlying various smart environments, they are fast becoming ubiquitous. The successful realization of applications and services over them depends on efficient solutions to their computational challenges, which are compounded by network dynamics. The core challenges underlying large-scale networks, for example determining central (influential) nodes (and edges) and the interactions and contacts among nodes, are the basis of the success of applications and services. Though at first glance these challenges may seem trivial, the network characteristics affect their effective and efficient evaluation. We thus propose in this dissertation to leverage large-scale network structural characteristics and temporal dynamics in addressing these core conceptual challenges.
We propose a computationally efficient divide-and-conquer algorithm that leverages the underlying network community structure for deterministic computation of betweenness centrality indices for all nodes. As an integral part of it, we also propose a computationally efficient agglomerative hierarchical community detection algorithm. Next, we propose a novel probabilistic link prediction algorithm, based on network structure evolution, that predicts the set of links occurring over subsequent time periods with higher accuracy. To best capture the evolution process and achieve higher prediction accuracy, we propose multiple time scales with the Markov prediction model. Finally, we propose to capture the multi-periodicity of human mobility patterns with a sinusoidal intensity function of a cascaded nonhomogeneous Poisson process, to predict future contacts over mobile networks. We use real data sets and benchmarked approaches to validate the better performance of our proposed approaches. --Abstract, page iii
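The dissertation's centrality algorithm builds on exact betweenness computation. For orientation, a sketch of the standard exact method (Brandes' algorithm for unweighted graphs) that community-based divide-and-conquer schemes typically accelerate; this is background, not the dissertation's own algorithm:

```python
from collections import deque

def betweenness(adj):
    """Exact betweenness centrality (Brandes), unweighted, unnormalized.

    adj: dict mapping each node to an iterable of its neighbours.
    For an undirected graph each pair is counted in both directions.
    """
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS from s, counting shortest paths (sigma) and predecessors.
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Back-propagate pair dependencies in reverse BFS order.
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

path = {0: [1], 1: [0, 2], 2: [1]}  # path graph: 1 lies between 0 and 2
print(betweenness(path))
```

On the three-node path, only the middle node has nonzero betweenness (2.0 here, since both orderings of the pair (0, 2) are counted). A community-based scheme like the one proposed would partition the graph and reuse such per-region computations rather than running one global pass.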
Resolving structural variability in network models and the brain
Large-scale white matter pathways crisscrossing the cortex create a complex
pattern of connectivity that underlies human cognitive function. Generative
mechanisms for this architecture have been difficult to identify in part
because little is known about mechanistic drivers of structured networks. Here
we contrast network properties derived from diffusion spectrum imaging data of
the human brain with 13 synthetic network models chosen to probe the roles of
physical network embedding and temporal network growth. We characterize both
the empirical and synthetic networks using familiar diagnostics presented in
statistical form, as scatter plots and distributions, to reveal the full range
of variability of each measure across scales in the network. We focus on the
degree distribution, degree assortativity, hierarchy, topological Rentian
scaling, and topological fractal scaling---in addition to several summary
statistics, including the mean clustering coefficient, shortest path length,
and network diameter. The models are investigated in a progressive, branching
sequence, aimed at capturing different elements thought to be important in the
brain, and range from simple random and regular networks, to models that
incorporate specific growth rules and constraints. We find that synthetic
models that constrain the network nodes to be embedded in anatomical brain
regions tend to produce distributions that are similar to those extracted from
the brain. We also find that network models hardcoded to display one network
property do not in general also display a second, suggesting that multiple
neurobiological mechanisms might be at play in the development of human brain
network architecture. Together, the network models that we develop and employ
provide a potentially useful starting point for the statistical inference of
brain network structure from neuroimaging data.
Comment: 24 pages, 11 figures, 1 table, supplementary material
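Several of the summary diagnostics the paper compares across models (degree sequence, mean clustering coefficient) are simple to compute from scratch. A minimal sketch on a hypothetical toy graph, not the paper's diffusion imaging data:

```python
# Toy graph: triangle {0,1,2} with a chain 2-3-4 attached.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Degree of each node: size of its neighbourhood.
degrees = {v: len(nb) for v, nb in adj.items()}

def clustering(v):
    # Fraction of neighbour pairs of v that are themselves connected.
    nb = adj[v]
    k = len(nb)
    if k < 2:
        return 0.0
    links = sum(1 for x in nb for y in nb if x < y and y in adj[x])
    return 2 * links / (k * (k - 1))

mean_c = sum(clustering(v) for v in adj) / len(adj)
print(degrees, mean_c)
```

The paper's point is that such diagnostics are most informative in statistical form (full distributions and scatter plots across nodes) rather than as single means; the per-node `clustering(v)` values here already vary from 0 to 1 on this small example.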
The cavity approach for Steiner trees packing problems
The Belief Propagation approximation, or cavity method, has been recently
applied to several combinatorial optimization problems in its zero-temperature
implementation, the max-sum algorithm. In particular, recent developments to
solve the edge-disjoint paths problem and the prize-collecting Steiner tree
problem on graphs have shown remarkable results for several classes of graphs
and for benchmark instances. Here we propose a generalization of these
techniques for two variants of the Steiner trees packing problem where multiple
"interacting" trees have to be sought within a given graph. Depending on the
interaction among trees we distinguish the vertex-disjoint Steiner trees
problem, where trees cannot share nodes, from the edge-disjoint Steiner trees
problem, where edges cannot be shared by trees but nodes can be members of
multiple trees. Several practical problems of great interest in network
design can be mapped onto these two variants, for instance, the physical design of
Very Large Scale Integration (VLSI) chips. The formalism described here relies
on two-component edge variables that allow us to formulate a message-passing
algorithm for the V-DStP and two algorithms for the E-DStP, differing in the
scaling of the computational time with respect to some relevant parameters. We
will show that one of the two formalisms used for the edge-disjoint variant
allows us to map the max-sum update equations into a weighted maximum matching
problem over proper bipartite graphs. We developed a heuristic procedure based
on the max-sum equations that shows excellent performance in synthetic networks
(in particular outperforming standard multi-step greedy procedures by large
margins) and on large benchmark instances of VLSI for which the optimal
solution is known, on which the algorithm found the optimum in two cases and
the gap to optimality was never larger than 4%.
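The abstract states that one edge-disjoint formalism maps each max-sum update onto a weighted maximum matching over a bipartite graph. As a minimal illustration of that subproblem only (brute force over permutations on a hypothetical 3x3 weight matrix, not the paper's max-sum algorithm):

```python
from itertools import permutations

# Weighted maximum matching on a complete bipartite graph K_{3,3}:
# w[i][j] is the weight of matching left node i to right node j.
w = [[3, 1, 2],
     [2, 4, 6],
     [5, 2, 1]]
n = len(w)

# Every perfect matching corresponds to a permutation of the right nodes;
# brute force is fine at this size (n! candidates).
best = max(permutations(range(n)),
           key=lambda p: sum(w[i][p[i]] for i in range(n)))
value = sum(w[i][best[i]] for i in range(n))
print(best, value)
```

Here the optimum matches 0-1, 1-2, 2-0 for total weight 12. In practice this inner problem would be solved in polynomial time (e.g. with the Hungarian algorithm) rather than by enumeration; the reduction is what makes the E-DStP update tractable.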