Measuring complexity with zippers
Physics concepts have often been borrowed and independently developed by
other fields of science. A significant example of this is the notion of
entropy in Information Theory. The aim of this paper is to provide a short and
pedagogical introduction to the use of data compression techniques for
estimating entropy and other relevant quantities in Information Theory and
Algorithmic Information Theory. We consider in particular the LZ77 algorithm
as a case study and discuss how a zipper can be used for information extraction.
Comment: 10 pages, 3 figures
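The idea behind the paper's approach can be sketched with an off-the-shelf zipper: zlib's DEFLATE format is built on LZ77, so the compressed size of a sequence gives an upper bound on its entropy rate in bits per symbol. This is a minimal illustrative sketch, not the paper's actual estimator; the function name and the test strings are my own.

```python
import random
import zlib

def entropy_rate_estimate(text: str) -> float:
    """Estimate the entropy rate (bits per character) of `text` by
    compressing it with zlib, whose DEFLATE format is built on LZ77.
    The compressed size upper-bounds the sequence's information content."""
    data = text.encode("utf-8")
    compressed = zlib.compress(data, 9)
    return 8 * len(compressed) / len(data)

# A highly repetitive string compresses far below 8 bits/char, while a
# random binary string over {a, b} stays near its true 1 bit/char entropy.
rng = random.Random(0)
low = entropy_rate_estimate("ab" * 5000)
high = entropy_rate_estimate("".join(rng.choice("ab") for _ in range(10000)))
```

The gap between `low` and `high` is exactly the kind of complexity signal the abstract describes: the zipper finds the repeated structure in the first string and cannot find any in the second.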
Designing a resource-efficient data structure for mobile data systems
Designing data structures for use in mobile devices requires attention to optimising data volumes, with associated benefits for data transmission, storage space and battery use. For semi-structured data, tree summarisation techniques can reduce the volume of structured elements, while dictionary compression can efficiently deal with value-based predicates. This project seeks to investigate and evaluate an integration of the two approaches. The key strength of this technique is that both structural and value predicates could be resolved within one graph, while further allowing for compression of the resulting data structure. As the current trend is towards working with larger semi-structured data sets, this work would allow much larger data sets to be utilised whilst reducing bandwidth requirements and minimising the memory necessary for both the storage and querying of the data.
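The dictionary-compression half of the proposal can be sketched as follows: repeated string values are replaced by small integer ids, so an equality predicate is resolved with one dictionary lookup followed by cheap integer comparisons over the encoded column. This is an illustrative sketch under my own naming, not the project's actual design.

```python
def dictionary_encode(values):
    """Map each distinct value to a small integer id; return the
    dictionary and the integer-encoded column."""
    dictionary = {}
    encoded = []
    for v in values:
        if v not in dictionary:
            dictionary[v] = len(dictionary)
        encoded.append(dictionary[v])
    return dictionary, encoded

def select_eq(dictionary, encoded, value):
    """Resolve the predicate `column == value` on the encoded column:
    one dictionary lookup, then integer comparisons only."""
    vid = dictionary.get(value)
    return [] if vid is None else [i for i, e in enumerate(encoded) if e == vid]

cities = ["Oslo", "Lima", "Oslo", "Pune", "Lima", "Oslo"]
d, enc = dictionary_encode(cities)
rows = select_eq(d, enc, "Oslo")  # → [0, 2, 5]
```

The encoded column stores one small integer per row instead of a repeated string, which is where the transmission, storage and battery savings the abstract mentions would come from.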
Contextualized Graph Attention Network for Recommendation with Item Knowledge Graph
Graph neural networks (GNN) have recently been applied to exploit knowledge
graph (KG) for recommendation. Existing GNN-based methods explicitly model the
dependency between an entity and its local graph context in KG (i.e., the set
of its first-order neighbors), but may not be effective in capturing its
non-local graph context (i.e., the set of most related high-order neighbors).
In this paper, we propose a novel recommendation framework, named
Contextualized Graph Attention Network (CGAT), which can explicitly exploit
both local and non-local graph context information of an entity in KG.
Specifically, CGAT captures the local context information by a user-specific
graph attention mechanism, considering a user's personalized preferences on
entities. Moreover, CGAT employs a biased random walk sampling process to
extract the non-local context of an entity, and utilizes a Recurrent Neural
Network (RNN) to model the dependency between the entity and its non-local
contextual entities. To capture the user's personalized preferences on items,
an item-specific attention mechanism is also developed to model the dependency
between a target item and the contextual items extracted from the user's
historical behaviors. Experimental results on real datasets demonstrate the
effectiveness of CGAT, compared with state-of-the-art KG-based recommendation
methods.
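The non-local context extraction step can be illustrated with a biased random walk over a toy knowledge graph. Here the walk is biased toward high-degree neighbors; this bias, the graph, and the function names are illustrative assumptions, not CGAT's exact sampling scheme.

```python
import random

def biased_random_walk(graph, start, length, bias=2.0, seed=0):
    """Sample a walk where each step picks a neighbor with probability
    proportional to degree**bias, favoring well-connected entities as
    non-local context. (Illustrative bias, not CGAT's actual scheme.)"""
    rng = random.Random(seed)
    walk = [start]
    node = start
    for _ in range(length):
        nbrs = graph.get(node, [])
        if not nbrs:
            break  # dead end: stop the walk early
        weights = [len(graph.get(n, [])) ** bias for n in nbrs]
        node = rng.choices(nbrs, weights=weights, k=1)[0]
        walk.append(node)
    return walk

# Toy item knowledge graph: entity -> neighboring entities
kg = {
    "film": ["director", "genre", "actor"],
    "director": ["film", "award"],
    "genre": ["film"],
    "actor": ["film", "award"],
    "award": ["director", "actor"],
}
context = biased_random_walk(kg, "film", length=4)
```

In the full model, the entities visited by such walks would be fed to the RNN to capture the dependency between an entity and its non-local context.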
Control Plane Compression
We develop an algorithm capable of compressing a large network into a smaller
one with similar control plane behavior: for every stable routing solution in
the large, original network, there exists a corresponding solution in the
compressed network, and vice versa. Our compression algorithm preserves a wide
variety of network properties including reachability, loop freedom, and path
length. Consequently, operators may speed up network analysis, based on
simulation, emulation, or verification, by analyzing only the compressed
network. Our approach is based on a new theory of control plane equivalence. We
implement these ideas in a tool called Bonsai and apply it to real and
synthetic networks. Bonsai can shrink real networks by over a factor of 5 and
speed up analysis by several orders of magnitude.
Comment: Extended version of the paper appearing in ACM SIGCOMM 201
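A much-simplified flavor of this kind of compression is bisimulation-style partition refinement: nodes are merged only if they agree on their class and on the classes of their neighbors. This sketch captures only the topological side; Bonsai's control plane equivalence also accounts for routing policy, and all names here are my own.

```python
def refine_partition(graph, classes):
    """Repeatedly split classes until two nodes share a class only if
    their current classes and the sorted classes of their neighbors
    agree. Nodes left in the same class behave alike and can be merged."""
    classes = dict(classes)
    while True:
        # Each node's signature: its own class plus its neighbors' classes.
        sig = {
            n: (classes[n], tuple(sorted(classes[m] for m in graph[n])))
            for n in graph
        }
        ids, new = {}, {}
        for n in graph:
            new[n] = ids.setdefault(sig[n], len(ids))
        # Refinement only splits classes, so stop once the count is stable.
        if len(set(new.values())) == len(set(classes.values())):
            return new
        classes = new

# Toy topology: r1 and r2 are interchangeable core routers; r3 also
# attaches to a distinguished edge router.
net = {
    "r1": ["r2", "r3"],
    "r2": ["r1", "r3"],
    "r3": ["r1", "r2", "edge"],
    "edge": ["r3"],
}
init = {n: 0 for n in net}
init["edge"] = 1  # seed the partition with the edge router's distinct role
result = refine_partition(net, init)
# r1 and r2 end up in the same class, so the compressed network can
# represent them with a single node.
```

Merging `r1` and `r2` shrinks the network while preserving which classes of nodes can reach which, which is the flavor of property the full algorithm preserves for reachability, loop freedom, and path length.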