Fast construction on a restricted budget
We introduce a model of a controlled random graph process. In this model, the
edges of the complete graph are ordered randomly and then revealed, one
by one, to a player called Builder. He must decide, immediately and
irrevocably, whether to purchase each observed edge. The observation time is
bounded by parameter , and the total budget of purchased edges is bounded by
parameter . Builder's goal is to devise a strategy that, with high
probability, allows him to construct a graph of purchased edges possessing a
target graph property , all within the limitations of observation
time and total budget. We show the following: (a) Builder has a strategy to
achieve minimum degree at the hitting time for this property by purchasing
at most edges for an explicit ; and a strategy to achieve it
(slightly) after the threshold for minimum degree by purchasing at most
edges (which is optimal); (b) Builder has a strategy to
create a Hamilton cycle if either and , or and , for some
; similar results hold for perfect matching; (c) Builder has
a strategy to create a copy of a given -vertex tree if , and this is optimal; and (d) For or
, Builder has a strategy to create a copy of a cycle of length
if , and this is optimal.
Comment: 20 pages, 2 figures
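As a toy illustration of the setting in (a), the following sketch simulates the controlled random graph process with a simple greedy rule of our own (not the paper's strategy): Builder buys an edge exactly when it touches a vertex whose purchased degree is still zero, which achieves minimum degree 1 in K_n within budget n.

```python
import random

def builder_min_degree_one(n, seed=0):
    """Toy Builder strategy for the target property 'minimum degree 1' on K_n:
    watch the randomly ordered edges and buy an edge exactly when it touches
    a vertex whose purchased degree is still zero.
    Returns (observation_time, budget_used)."""
    rng = random.Random(seed)
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
    rng.shuffle(edges)  # the random edge ordering revealed to Builder
    deg = [0] * n       # purchased degree of each vertex
    uncovered = n       # vertices still at purchased degree 0
    bought = 0
    for t, (u, v) in enumerate(edges, start=1):
        if deg[u] == 0 or deg[v] == 0:   # immediate, irrevocable purchase
            uncovered -= (deg[u] == 0) + (deg[v] == 0)
            deg[u] += 1
            deg[v] += 1
            bought += 1
            if uncovered == 0:           # minimum degree 1 achieved
                return t, bought
    return None
```

Each purchase covers one or two degree-0 vertices, so the budget used lies between n/2 and n; the paper's results concern much sharper time/budget trade-offs than this baseline.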
Faster Random Walks By Rewiring Online Social Networks On-The-Fly
Many online social networks feature restrictive web interfaces which only
allow the query of a user's local neighborhood through the interface. To enable
analytics over such an online social network through its restrictive web
interface, many recent efforts reuse the existing Markov Chain Monte Carlo
methods such as random walks to sample the social network and support analytics
based on the samples. The problem with such an approach, however, is the large
number of queries often required (i.e., a long "mixing time") for a random walk
to reach a desired (stationary) sampling distribution.
In this paper, we consider a novel problem of enabling a faster random walk
over online social networks by "rewiring" the social network on-the-fly.
Specifically, we develop Modified TOpology (MTO)-Sampler which, by using only
information exposed by the restrictive web interface, constructs a "virtual"
overlay topology of the social network while performing a random walk, and
ensures that the random walk follows the modified overlay topology rather than
the original one. We show that MTO-Sampler not only provably enhances the
efficiency of sampling, but also achieves significant savings on query cost
over real-world online social networks such as Google Plus and Epinions.
Comment: 15 pages, 14 figures; technical report for the ICDE'2013 paper.
Appendix has all the theorems' proofs
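The Markov Chain Monte Carlo baseline that the paper improves on can be sketched as follows. Here `neighbors` stands in for the restrictive web interface (only local neighborhoods are queryable), and the Metropolis-Hastings correction, a standard technique and not MTO-Sampler itself, makes the stationary distribution uniform over nodes:

```python
import random

def neighbors(graph, v):
    """Stand-in for the restrictive web interface: only the local
    neighborhood of a node can be queried."""
    return graph[v]

def mh_random_walk(graph, start, steps, seed=0):
    """Metropolis-Hastings random walk whose stationary distribution is
    uniform over nodes. Each step costs one or two interface queries;
    shortening the mixing time of such walks is what MTO-Sampler targets."""
    rng = random.Random(seed)
    v = start
    samples = []
    for _ in range(steps):
        nbrs = neighbors(graph, v)
        u = rng.choice(nbrs)
        # accept the move with prob min(1, deg(v)/deg(u)); otherwise stay at v
        if rng.random() < min(1.0, len(nbrs) / len(neighbors(graph, u))):
            v = u
        samples.append(v)
    return samples
```

The long burn-in needed before the samples approximate the stationary distribution is exactly the query cost that on-the-fly rewiring aims to reduce.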
Recovering Sparse Signals Using Sparse Measurement Matrices in Compressed DNA Microarrays
Microarrays (DNA, protein, etc.) are massively parallel affinity-based biosensors capable of detecting and quantifying a large number of different genomic particles simultaneously. Among them, DNA microarrays comprising tens of thousands of probe spots are currently being employed to test a multitude of targets in a single experiment. In conventional microarrays, each spot contains a large number of copies of a single probe designed to capture a single target and, hence, collects only a single data point. This is a wasteful use of the sensing resources in comparative DNA microarray experiments, where a test sample is measured relative to a reference sample. Typically, only a fraction of the total number of genes represented by the two samples is differentially expressed, and thus a vast number of probe spots may not provide any useful information.

To this end, we propose an alternative design, the so-called compressed microarrays, wherein each spot contains copies of several different probes and the total number of spots is potentially much smaller than the number of targets being tested. Fewer spots translate directly to significantly lower costs due to cheaper array manufacturing, simpler image acquisition and processing, and the smaller amount of genomic material needed for experiments. To recover signals from compressed microarray measurements, we leverage ideas from compressive sampling. For sparse measurement matrices, we propose an algorithm that has significantly lower computational complexity than the widely used linear-programming-based methods and can also recover signals with less sparsity.
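As an illustration of the compressive-sampling decoding step, here is a standard Orthogonal Matching Pursuit baseline, not the paper's lower-complexity algorithm, recovering a sparse signal from far fewer measurements than unknowns:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily build the support of a
    k-sparse x satisfying y = A x, refitting the coefficients by least
    squares in each round. An illustrative compressive-sampling decoder."""
    residual = y.astype(float)
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # pick the column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x
```

In the microarray analogy, the columns of A encode which probes share each spot, y holds the spot intensities, and the sparse x is the vector of differentially expressed targets.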
Deterministic Decremental Reachability, SCC, and Shortest Paths via Directed Expanders and Congestion Balancing
Let be a weighted digraph subject to a sequence of adversarial
edge deletions. In the decremental single-source reachability problem (SSR), we
are given a fixed source and the goal is to maintain a data structure that
can answer path queries for any . In the more
general single-source shortest paths (SSSP) problem the goal is to return an
approximate shortest path to , and in the SCC problem the goal is to
maintain strongly connected components of and to answer path queries within
each component. All of these problems have been very actively studied over the
past two decades, but all the fast algorithms are randomized and, more
significantly, they can only answer path queries if they assume a weaker model:
they assume an oblivious adversary which is not adaptive and must fix the
update sequence in advance. This assumption significantly limits the use of
these data structures, most notably preventing them from being used as
subroutines in static algorithms. All the above problems are notoriously
difficult in the adaptive setting. In fact, the state-of-the-art is still the
Even and Shiloach tree, which dates back all the way to 1981 and achieves total
update time . We present the first algorithms to break through this
barrier:
1) deterministic decremental SSR/SCC with total update time ;
2) deterministic decremental SSSP with total update time .
To achieve these results, we develop two general techniques of broader
interest for working with dynamic graphs: 1) a generalization of expander-based
tools to dynamic directed graphs, and 2) a technique that we call congestion
balancing and which provides a new method for maintaining flow under
adversarial deletions. Using the second technique, we provide the first
near-optimal algorithm for decremental bipartite matching.
Comment: Reuploaded with some generalizations of previous theorems
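For contrast with the Even-Shiloach tree mentioned above, a naive decremental single-source reachability structure can be sketched as follows. This is an illustrative baseline of our own, not the paper's algorithm: it simply rebuilds the BFS tree whenever a tree edge is deleted, so it answers queries correctly against an adaptive adversary but without any amortized update-time guarantee.

```python
from collections import deque

class DecrementalReachability:
    """Naive decremental SSR sketch: keep a BFS tree from the source s
    and rebuild it only when a deleted edge was a tree edge. Undirected
    for simplicity; correct against adaptive deletions, but slow."""

    def __init__(self, n, edges, s):
        self.adj = {v: set() for v in range(n)}
        for u, v in edges:
            self.adj[u].add(v)
            self.adj[v].add(u)
        self.s = s
        self._rebuild()

    def _rebuild(self):
        # BFS from s; parent also serves as the reachable set
        self.parent = {self.s: None}
        q = deque([self.s])
        while q:
            v = q.popleft()
            for u in self.adj[v]:
                if u not in self.parent:
                    self.parent[u] = v
                    q.append(u)

    def delete(self, u, v):
        self.adj[u].discard(v)
        self.adj[v].discard(u)
        if self.parent.get(u) == v or self.parent.get(v) == u:
            self._rebuild()  # a tree edge died: recompute the BFS tree

    def reachable(self, v):
        return v in self.parent
```

The Even-Shiloach tree avoids these full rebuilds by letting vertex levels only increase, and the paper's expander-based techniques push far beyond that bound deterministically.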
Factor-of-iid balanced orientation of non-amenable graphs
We show that if a non-amenable, quasi-transitive, unimodular graph has
all degrees even then it has a factor-of-iid balanced orientation, meaning each
vertex has equal in- and outdegree. This result involves extending earlier
spectral-theoretic results on Bernoulli shifts to the Bernoulli graphings of
quasi-transitive, unimodular graphs. As a consequence, we also obtain that when
is regular (of either odd or even degree) and bipartite, it has a
factor-of-iid perfect matching. This generalizes a result of Lyons and Nazarov
beyond transitive graphs.
Comment: 24 pages, 1 figure. This is one of two papers that are replacing the
shorter arXiv submission arXiv:2101.12577v1, Factor of iid Schreier decoration
of transitive graphs
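In the finite setting, the existence of a balanced orientation for a connected graph with all degrees even is classical: orient each edge along an Eulerian circuit. The sketch below (Hierholzer's algorithm; the factor-of-iid construction in the paper is far more delicate) makes this concrete:

```python
def balanced_orientation(n, edges):
    """Orient each edge of a connected graph with all degrees even so that
    every vertex has equal in- and outdegree, by walking an Eulerian
    circuit (Hierholzer's algorithm) and orienting edges along the walk.
    Assumes vertex 0 exists and the graph is connected."""
    adj = [[] for _ in range(n)]
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    used = [False] * len(edges)   # each edge is traversed exactly once
    ptr = [0] * n                 # per-vertex scan position in adj
    stack, circuit = [0], []
    while stack:
        v = stack[-1]
        while ptr[v] < len(adj[v]) and used[adj[v][ptr[v]][1]]:
            ptr[v] += 1
        if ptr[v] == len(adj[v]):
            circuit.append(stack.pop())   # backtrack: v is exhausted
        else:
            u, eid = adj[v][ptr[v]]
            used[eid] = True
            stack.append(u)
    circuit.reverse()
    # orienting along the circuit gives indegree == outdegree everywhere
    return [(circuit[i], circuit[i + 1]) for i in range(len(circuit) - 1)]
```

Every visit to a vertex along the circuit pairs one incoming traversal with one outgoing one, which is exactly why the resulting orientation is balanced.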