2,352 research outputs found
Layered Label Propagation: A MultiResolution Coordinate-Free Ordering for Compressing Social Networks
We continue the line of research on graph compression started with WebGraph,
but we move our focus to the compression of social networks in a proper sense
(e.g., LiveJournal): the approaches that have been used for a long time to
compress web graphs rely on a specific ordering of the nodes (lexicographical
URL ordering) whose extension to general social networks is not trivial. In
this paper, we propose a solution that mixes clusterings and orders, and devise
a new algorithm, called Layered Label Propagation, that builds on previous work
on scalable clustering and can be used to reorder very large graphs (billions
of nodes). Our implementation uses overdecomposition to perform aggressively on
multi-core architectures, making it possible to reorder graphs of more than 600
million nodes in a few hours. Experiments performed on a wide array of web
graphs and social networks show that combining the order produced by the
proposed algorithm with the WebGraph compression framework provides a major
increase in compression with respect to all currently known techniques, both on
web graphs and on social networks. These improvements make it possible to
analyse significantly larger graphs in main memory.
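The abstract gives no pseudocode, but the building block that LLP iterates at several resolutions is ordinary label-propagation clustering: each node repeatedly adopts the label most common among its neighbours, and nodes are then reordered so that members of the same cluster become consecutive, which is what makes gap-based compression effective. The sketch below is an illustrative single-resolution version on a toy graph; the graph, tie-breaking rule, and update schedule are assumptions, not the paper's implementation.

```python
import random
from collections import Counter

def label_propagation(adj, max_iters=100, seed=0):
    """One resolution level of label-propagation clustering:
    each node repeatedly adopts the most frequent label among
    its neighbours until labels stabilise (or max_iters is hit)."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}      # start with unique labels
    nodes = list(adj)
    for _ in range(max_iters):
        rng.shuffle(nodes)            # randomised update order
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            freq = Counter(labels[u] for u in adj[v])
            # Most frequent neighbour label; break ties by smallest label.
            best, _ = max(freq.items(), key=lambda kv: (kv[1], -kv[0]))
            if labels[v] != best:
                labels[v] = best
                changed = True
        if not changed:
            break
    return labels

# Two triangles joined by a single bridge edge: two natural clusters.
adj = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
labels = label_propagation(adj)
# Reorder nodes so that same-cluster nodes are consecutive; feeding such
# an order to a gap-based compressor is the idea the paper builds on.
order = sorted(adj, key=lambda v: (labels[v], v))
print(order)
```

The actual algorithm runs this at multiple resolutions and combines the resulting clusterings into one ordering; that layering is what the sketch omits.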
Foam evaluation and Kronheimer--Mrowka theories
We introduce and study combinatorial equivariant analogues of the
Kronheimer--Mrowka homology theory of planar trivalent graphs.
Comment: 53 pages, 23 TikZ figures
On Counting Triangles through Edge Sampling in Large Dynamic Graphs
Traditional frameworks for dynamic graphs have relied on processing only the
stream of edges added into or deleted from an evolving graph, but not any
additional related information such as the degrees or neighbor lists of nodes
incident to the edges. In this paper, we propose a new edge sampling framework
for big-graph analytics in dynamic graphs which enhances the traditional model
by enabling the use of additional related information. To demonstrate the
advantages of this framework, we present a new sampling algorithm, called Edge
Sample and Discard (ESD). It generates an unbiased estimate of the total number
of triangles, which can be continuously updated in response to both edge
additions and deletions. We provide a comparative analysis of the performance
of ESD against two current state-of-the-art algorithms in terms of accuracy and
complexity. The results of the experiments performed on real graphs show that,
with the help of the neighborhood information of the sampled edges, the
accuracy achieved by our algorithm is substantially better. We also
characterize the impact of properties of the graph on the performance of our
algorithm by testing on several Barabási-Albert graphs.
Comment: A short version of this article appeared in Proceedings of the 2017
IEEE/ACM International Conference on Advances in Social Networks Analysis and
Mining (ASONAM 2017).
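The abstract does not spell out ESD, but the general idea it describes, sampling an arriving edge with some probability, using the neighbour lists of its endpoints to count the triangles that edge closes, and rescaling to keep the estimate unbiased, can be sketched as follows. The function name and the fixed sampling probability are assumptions for illustration, and this sketch handles edge additions only; the paper's algorithm also supports deletions.

```python
import random
from collections import defaultdict

def estimate_triangles(edge_stream, adj, p=0.1, seed=42):
    """Triangle estimate from a stream of edge additions.

    Each triangle is closed by exactly one edge (its last-arriving one),
    so summing, over sampled edges, the common neighbours of the two
    endpoints and dividing by the sampling probability p gives an
    unbiased estimate of the total triangle count. `adj` holds the
    neighbour lists, mimicking the extra information ESD exploits."""
    rng = random.Random(seed)
    estimate = 0.0
    for u, v in edge_stream:
        if rng.random() < p:
            # Triangles closed by (u, v): common neighbours already present.
            estimate += len(adj[u] & adj[v]) / p
        adj[u].add(v)
        adj[v].add(u)
    return estimate

# Toy stream containing the triangles {0,1,2} and {1,2,3}.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (1, 3)]
est = estimate_triangles(edges, defaultdict(set), p=1.0)
print(est)  # with p=1.0 every edge is inspected, so the count is exact: 2.0
```

With p < 1 the estimate becomes random but stays unbiased in expectation, which is the property the paper maintains under both additions and deletions.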
Dual graphs of exceptional divisors
Let p be a singular point of a variety. Consider a resolution where the
preimage of p is a simple normal crossing divisor E. The combinatorial
structure of E is described by a cell complex D(E), called the dual graph or
dual complex of E. It is known that the homotopy type of D(E) depends only on
p, not on the resolution chosen.
We prove that this homotopy type can be arbitrary. We also describe which
homotopy types can be obtained from rational singularities.
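For intuition, the 1-skeleton of the dual complex D(E) is simply a graph with one vertex per irreducible component of E and an edge whenever two components meet. The toy construction below, with made-up component names and intersection data, builds that graph for a cycle of three exceptional curves, whose dual graph is a triangle (homotopy equivalent to a circle):

```python
def dual_graph(components, intersections):
    """1-skeleton of the dual complex: one vertex per irreducible
    component of E, one edge per pairwise intersection."""
    vertices = list(components)
    edges = [frozenset(pair) for pair in intersections]
    return vertices, edges

# Three exceptional curves meeting in a cycle: E1-E2, E2-E3, E3-E1.
verts, edges = dual_graph(
    ["E1", "E2", "E3"],
    [("E1", "E2"), ("E2", "E3"), ("E1", "E3")],
)
print(len(verts), len(edges))  # 3 vertices, 3 edges: a triangle
```

The full dual complex also records higher-dimensional cells where three or more components meet, which this 1-skeleton sketch ignores.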