Approximating Average Parameters of Graphs
Inspired by Feige (36th STOC, 2004), we initiate a study of sublinear randomized algorithms for approximating average parameters of a graph.
Specifically, we consider the average degree of a graph and the average distance between pairs of vertices in a graph.
Since our focus is on sublinear algorithms, these algorithms access the input graph via queries to an adequate oracle.
We consider two types of queries.
The first type is standard neighborhood queries (i.e., what is the i-th neighbor of vertex v?), whereas the second type consists of queries about the quantities whose average we seek (i.e., what is the degree of vertex v? and what is the distance between u and v, respectively).
Loosely speaking, our results indicate a difference between the two problems: For approximating the average degree, the standard neighbor queries suffice and in fact are preferable to degree queries. In contrast, for approximating average distances, the standard neighbor queries are of little help, whereas distance queries are crucial.
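As a rough illustration of the degree-query side of this setup (not the paper's actual algorithms, which are more careful about the variance introduced by a few high-degree vertices), a naive uniform-sampling estimator could look like:

```python
import random

def approx_average_degree(degree_query, n, num_samples=500, seed=0):
    """Estimate the average degree of an n-vertex graph by sampling
    vertices uniformly and averaging their queried degrees.
    A naive sketch only; the sublinear algorithms in the paper
    refine this idea substantially."""
    rng = random.Random(seed)
    sample = [degree_query(rng.randrange(n)) for _ in range(num_samples)]
    return sum(sample) / len(sample)

# Toy degree oracle: a path graph on n vertices (endpoints have degree 1).
n = 1000
def path_degree(v):
    return 1 if v in (0, n - 1) else 2

est = approx_average_degree(path_degree, n)
# True average degree is 2*(n-1)/n, just under 2
```

On near-regular inputs like this, uniform sampling already concentrates well; the hard instances are graphs whose average degree is dominated by a small set of high-degree vertices, which is where neighbor queries help.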
Representation Learning on Graphs: A Reinforcement Learning Application
In this work, we study value function approximation in reinforcement learning
(RL) problems with high dimensional state or action spaces via a generalized
version of representation policy iteration (RPI). We consider the limitations
of proto-value functions (PVFs) at accurately approximating the value function
in low dimensions and we highlight the importance of feature learning for an
improved low-dimensional value function approximation. We then adopt different
representation learning algorithms on graphs to learn the basis functions that
best represent the value function. We empirically show that node2vec, an
algorithm for scalable feature learning in networks, and the Variational Graph
Auto-Encoder consistently outperform the commonly used smooth proto-value
functions in low-dimensional feature space.
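A minimal sketch of the linear value-function step that such embeddings feed into (the embedding learning itself, whether node2vec or the variational graph auto-encoder, is omitted; `phi` below is a hand-made stand-in feature matrix, and a plain least-squares fit stands in for a full policy-iteration loop):

```python
import numpy as np

def fit_value_weights(phi, returns):
    """Fit weights w so that V(s) is approximated by phi[s] @ w.
    phi: (num_states, d) node-feature matrix; returns: (num_states,) targets,
    e.g. Monte-Carlo return estimates."""
    w, *_ = np.linalg.lstsq(phi, returns, rcond=None)
    return w

# Toy example: 2-D "embeddings" for 4 states and noise-free targets.
phi = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [1.0, 1.0],
                [0.5, 0.5]])
true_w = np.array([2.0, -1.0])
returns = phi @ true_w
w = fit_value_weights(phi, returns)
# With noise-free targets in the span of phi, w recovers true_w exactly
```

The quality of this approximation hinges entirely on the feature matrix, which is why the choice of graph representation learner matters.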
Approximating Subdense Instances of Covering Problems
We study approximability of subdense instances of various covering problems
on graphs, defined as instances in which the minimum or average degree is
Omega(n/psi(n)) for some function psi(n)=omega(1) of the instance size. We
design new approximation algorithms as well as new polynomial time
approximation schemes (PTASs) for those problems, and establish the first
approximation hardness results for them. Interestingly, in some cases we were
able to prove optimality of the underlying approximation ratios, under usual
complexity-theoretic assumptions. Our results for the Vertex Cover problem
depend on an improved recursive sampling method which could be of independent
interest.
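For context, the textbook 2-approximation for Vertex Cover, which takes both endpoints of every edge in a greedily built maximal matching, is the kind of baseline such subdense results improve on; the paper's recursive sampling method itself is not spelled out in the abstract, so this is only background:

```python
def vertex_cover_2approx(edges):
    """Greedy maximal-matching 2-approximation for Vertex Cover:
    whenever an edge is uncovered, add both of its endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # 4-cycle; optimum cover size is 2
cover = vertex_cover_2approx(edges)
# every edge has an endpoint in `cover`, and |cover| <= 2 * OPT
```

Beating the factor of 2 in general is a long-standing open problem, which is what makes improved ratios on dense and subdense instances interesting.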
Computing Vertex Centrality Measures in Massive Real Networks with a Neural Learning Model
Vertex centrality measures are a multi-purpose analysis tool, commonly used
in many application environments to retrieve information and unveil knowledge
from graph and network structural properties. However, the algorithms for
such metrics are expensive in terms of computational resources when run in
real-time applications or on massive real-world networks. Thus, approximation
techniques have been developed and used to compute the measures in such
scenarios. In this paper, we demonstrate and analyze the use of neural network
learning algorithms to tackle this task and compare their performance, in terms
of solution quality and computation time, with other techniques from the
literature. Our work offers several contributions. We highlight both the pros
and cons of approximating centralities through neural learning. By empirical
means and statistics, we then show that the regression model generated with a
feedforward neural network trained by the Levenberg-Marquardt algorithm is not
only the best option considering computational resources, but also achieves the
best solution quality for relevant applications and large-scale networks.
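As a toy illustration of the regression approach (a plain least-squares fit over cheap local features stands in here for the paper's Levenberg-Marquardt-trained feedforward network; the graph, features, and target are all made up for the example):

```python
import numpy as np
from collections import deque

def closeness(adj, v):
    """Exact closeness centrality of v via BFS -- the expensive
    ground truth the regression model tries to approximate."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

# Toy graph: a star with centre 0 (the centre is maximally close).
n = 8
adj = {0: list(range(1, n))}
for v in range(1, n):
    adj[v] = [0]

# Cheap local features: own degree and sum of neighbour degrees.
feats = np.array([[len(adj[v]), sum(len(adj[u]) for u in adj[v])]
                  for v in range(n)], dtype=float)
target = np.array([closeness(adj, v) for v in range(n)])

X = np.hstack([feats, np.ones((n, 1))])          # affine regression model
w, *_ = np.linalg.lstsq(X, target, rcond=None)
pred = X @ w                                     # approximated centralities
```

On real networks the gain comes from training once on a sample and then predicting centralities for all remaining vertices from local features, instead of running the exact all-pairs computation.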
Keywords: Vertex Centrality Measures, Neural Networks, Complex Network Models,
Machine Learning, Regression Model
Comment: 8 pages, 5 tables, 2 figures, version accepted at IJCNN 2018. arXiv
admin note: text overlap with arXiv:1810.1176