The cavity method for large deviations
A method is introduced for studying large deviations in the context of
statistical physics of disordered systems. The approach, based on an extension
of the cavity method to atypical realizations of the quenched disorder, allows
us to compute exponentially small probabilities (rate functions) over different
classes of random graphs. It is illustrated with two combinatorial optimization
problems, the vertex-cover and coloring problems, for which the presence of
replica symmetry breaking phases is taken into account. Applications include
the analysis of models on adaptive graph structures.
Comment: 18 pages, 7 figures
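The exponentially small probabilities mentioned above are what a rate function captures: P(event) ~ exp(-n I) for large system size n. As a minimal, self-contained illustration of that idea (the classical Cramér setting for i.i.d. Bernoulli variables, not the cavity computation of the paper), one can compare the exact binomial tail to its large-deviation estimate:

```python
import math

def rate_function(a, p):
    """Cramer rate function I(a) for the empirical mean of Bernoulli(p)
    variables: P(S_n/n >= a) ~ exp(-n * I(a)) for a > p."""
    if a == 0.0:
        return -math.log(1 - p)
    if a == 1.0:
        return -math.log(p)
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def log_binomial_tail(n, a, p):
    """Exact log-probability that at least ceil(a*n) of n Bernoulli(p)
    trials succeed, for comparison with the large-deviation estimate."""
    k0 = math.ceil(a * n)
    total = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
                for k in range(k0, n + 1))
    return math.log(total)

p, a = 0.5, 0.75
for n in (50, 200, 800):
    # -log P / n approaches I(a) as n grows (up to subexponential corrections)
    print(n, -log_binomial_tail(n, a, p) / n, rate_function(a, p))
```

The cavity extension in the paper plays the analogous role for quantities defined over atypical random graphs, where no such closed form exists.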
Complexity Analysis and Efficient Measurement Selection Primitives for High-Rate Graph SLAM
Sparsity has been widely recognized as crucial for efficient optimization in
graph-based SLAM. Because the sparsity and structure of the SLAM graph reflect
the set of incorporated measurements, many methods for sparsification have been
proposed in hopes of reducing computation. These methods often focus narrowly
on reducing edge count without regard for structure at a global level. Such
structurally-naive techniques can fail to produce significant computational
savings, even after aggressive pruning. In contrast, simple heuristics such as
measurement decimation and keyframing are known empirically to produce
significant computation reductions. To demonstrate why, we propose a
quantitative metric called elimination complexity (EC) that bridges the
existing analytic gap between graph structure and computation. EC quantifies
the complexity of the primary computational bottleneck: the factorization step
of a Gauss-Newton iteration. Using this metric, we show rigorously that
decimation and keyframing impose favorable global structures and therefore
achieve computation reductions on the order of r^2 and r^3, respectively,
where r is the pruning rate. We additionally present numerical results
showing EC provides a good approximation of computation in both batch and
incremental (iSAM2) optimization and demonstrate that pruning methods promoting
globally-efficient structure outperform those that do not.
Comment: Pre-print accepted to ICRA 2018
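Of the two heuristics the abstract names, measurement decimation is the simplest to state: keep every k-th measurement, discarding the rest, which prunes a fraction r = 1 - 1/k of edges while leaving a regular global structure. A minimal sketch (with a made-up `Measurement` record standing in for a SLAM factor):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Measurement:
    """Toy stand-in for a SLAM factor (hypothetical fields)."""
    pose_id: int
    payload: str

def decimate(measurements: List[Measurement], k: int) -> List[Measurement]:
    """Keep every k-th measurement. This removes a fraction r = 1 - 1/k
    of the graph's edges while imposing a uniform global structure,
    unlike pruning schemes that delete edges one at a time."""
    return [m for i, m in enumerate(measurements) if i % k == 0]

meas = [Measurement(i, f"m{i}") for i in range(10)]
kept = decimate(meas, 3)
print([m.pose_id for m in kept])  # poses 0, 3, 6, 9
```

The paper's point is precisely that such structurally regular pruning, rather than the edge count alone, is what drives the factorization cost down.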
Statistical Mechanics of maximal independent sets
The graph theoretic concept of maximal independent set arises in several
practical problems in computer science as well as in game theory. A maximal
independent set is defined by the set of occupied nodes that satisfy some
packing and covering constraints. It is known that finding minimum- and
maximum-density maximal independent sets is a hard optimization problem. In
this paper, we use the cavity method of statistical physics and Monte Carlo
simulations to study the corresponding constraint satisfaction problem on
random graphs. We obtain the entropy of maximal independent sets within the
replica symmetric and one-step replica symmetry breaking frameworks, shedding
light on the metric structure of the landscape of solutions and suggesting a
class of possible algorithms. This is of particular relevance for the
application to the study of strategic interactions in social and economic
networks, where maximal independent sets correspond to pure Nash equilibria of
a graphical game of public goods allocation.
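The packing constraint (no two occupied nodes are adjacent) and covering constraint (every unoccupied node has an occupied neighbor) can be checked directly. A minimal sketch of a greedy construction on a small random graph (an illustration of the definition, not the cavity or Monte Carlo computation of the paper):

```python
import random

def greedy_mis(adj):
    """Greedily build a maximal independent set: no two chosen nodes are
    adjacent (packing), and every unchosen node has a chosen neighbor
    (covering), so no further node can be added."""
    chosen, blocked = set(), set()
    for v in adj:
        if v not in blocked:
            chosen.add(v)
            blocked.add(v)
            blocked.update(adj[v])
    return chosen

def is_maximal_independent(adj, s):
    independent = all(u not in adj[v] for v in s for u in s if u != v)
    maximal = all(v in s or any(u in s for u in adj[v]) for v in adj)
    return independent and maximal

# small Erdos-Renyi-style random graph
random.seed(0)
n, p = 12, 0.3
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v)
            adj[v].add(u)

mis = greedy_mis(adj)
print(sorted(mis), is_maximal_independent(adj, mis))
```

The hard part, which the abstract addresses, is not producing one such set but counting them at a prescribed density, i.e. computing the entropy.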
Ground state of the Bethe-lattice spin glass and running time of an exact optimization algorithm
We study the Ising spin glass on random graphs with fixed connectivity z and
with a Gaussian distribution of the couplings, with mean \mu and unit variance.
We compute exact ground states by using a sophisticated branch-and-cut method
for z=4,6 and system sizes up to N=1280 for different values of \mu. We locate
the spin-glass/ferromagnet phase transition at \mu = 0.77 +/- 0.02 (z=4) and
\mu = 0.56 +/- 0.02 (z=6). We also compute the energy and magnetization in the
Bethe-Peierls approximation with a stochastic method, and estimate the
magnitude of replica symmetry breaking corrections. Near the phase transition,
we observe a sharp change of the median running time of our implementation of
the algorithm, consistent with a change from a polynomial dependence on the
system size, deep in the ferromagnetic phase, to slower than polynomial in the
spin-glass phase.
Comment: 10 pages, RevTex, 10 eps figures. Some changes in the text
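The model above is straightforward to set up numerically. A minimal sketch (energy of Ising configurations with Gaussian couplings of mean mu on a toy graph, with the ground state found by exhaustive enumeration; the edge set and sizes here are illustrative stand-ins, not the fixed-connectivity ensembles or the branch-and-cut solver of the paper):

```python
import itertools
import random

def energy(spins, edges, J):
    """Ising spin-glass energy H = -sum_{(i,j)} J_ij s_i s_j."""
    return -sum(J[e] * spins[e[0]] * spins[e[1]] for e in edges)

random.seed(1)
n, mu = 10, 0.77
# couplings with mean mu and unit variance on a toy edge set
# (a ring plus a few chords, standing in for a random regular graph)
edges = [(i, (i + 1) % n) for i in range(n)] + [(0, 5), (2, 7), (4, 9)]
J = {e: random.gauss(mu, 1.0) for e in edges}

# exact ground state by brute force -- feasible only for tiny n,
# which is why the paper needs branch-and-cut to reach N = 1280
best = min(itertools.product((-1, 1), repeat=n),
           key=lambda s: energy(s, edges, J))
print("ground-state energy:", energy(best, edges, J))
```

At mean coupling mu well above the transition the ground state is close to the ferromagnetic all-up configuration; near the transition many competing configurations appear, which is where the paper observes the running-time slowdown.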