6 research outputs found
Satisfiability Thresholds for Regular Occupation Problems
In the last two decades the study of random instances of constraint
satisfaction problems (CSPs) has flourished across several disciplines,
including computer science, mathematics and physics. The diversity of the
methods developed, on both the rigorous and the non-rigorous side, has led to
major advances from both the theoretical and the applied viewpoints. The two
most popular types of such CSPs are the Erdős-Rényi and the random regular
CSPs.
Based on a ceteris paribus approach in terms of the density evolution
equations known from statistical physics, we focus on a specific prominent
class of problems of the latter type, the so-called occupation problems. The
regular r-in-k occupation problems form a basis of this class. By now, out of
these CSPs only the satisfiability threshold - the largest degree for which
the problem asymptotically admits a solution - has been rigorously
established, namely for the 1-in-k occupation problem. In the present work we
take a general approach towards a systematic analysis of occupation problems.
In particular, we discover a surprising and explicit connection between the
2-in-k occupation problem satisfiability threshold and the determination of
contraction coefficients, an important quantity in information theory
measuring the loss of information that occurs when communicating through a
noisy channel.
We present methods to facilitate the computation of these coefficients and use
them to establish explicitly the threshold for the 2-in-k occupation problem
for small k. Based on this result, for general k we formulate a conjecture
that pins down the exact value of the corresponding coefficient, which, if
true, is shown to determine the threshold in all these cases.
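Contraction coefficients are only described informally in the abstract. As a
minimal numerical sketch, not taken from this work, the KL contraction
coefficient of a binary symmetric channel (an assumed illustrative channel)
can be approximated by brute force and compared with the classical closed form
(1 - 2δ)²:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) in nats
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def contraction_coefficient(W, grid=150):
    # Approximate eta_KL(W) = sup_{P != Q} D(PW || QW) / D(P || Q)
    # over a grid of binary input distributions P and Q.
    best = 0.0
    for a in np.linspace(1e-3, 1 - 1e-3, grid):
        for b in np.linspace(1e-3, 1 - 1e-3, grid):
            if abs(a - b) < 1e-12:
                continue
            P, Q = np.array([a, 1 - a]), np.array([b, 1 - b])
            best = max(best, kl(P @ W, Q @ W) / kl(P, Q))
    return best

delta = 0.1  # crossover probability of the binary symmetric channel
W = np.array([[1 - delta, delta], [delta, 1 - delta]])  # row-stochastic channel
eta = contraction_coefficient(W)
# For the binary symmetric channel, eta_KL = (1 - 2*delta)^2 is classical
print(eta)
```

On a finite grid the supremum is approached by nearby input distributions, so
the estimate sits just below the closed-form value 0.64.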
Mutual Information, Information-Theoretic Thresholds and the Condensation Phenomenon at Positive Temperature
There is a vast body of recent literature on the reliability of communication
through noisy channels, the recovery of community structures in the stochastic
block model, the limiting behavior of the free entropy in spin glasses and the
solution space structure of constraint satisfaction problems. At first glance,
these topics ranging across several disciplines might seem unrelated. However,
taking a closer look, structural similarities can be easily identified.
Factor graphs exploit these similarities to model the aforementioned objects
and concepts in a unified manner. In this contribution we discuss the
asymptotic average case behavior of several quantities, where the average is
taken over sparse Erdős-Rényi type (hyper-) graphs with positive weights,
under certain assumptions. For one, we establish the limit of the mutual
information, which is used in coding theory to measure the reliability of
communication. We also determine the limit of the relative entropy, which can
be used to decide if weak recovery is possible in the stochastic block model.
Further, we prove the conjectured limit of the quenched free entropy over the
planted ensemble, which we use to obtain the preceding limits. Finally, we
describe the asymptotic behavior of the quenched free entropy (over the null
model) in terms of the limiting relative entropy.
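The mutual information studied above lives on sparse factor graphs; as a far
simpler illustration of the quantity itself (a toy binary symmetric channel,
assumed here and not the paper's setting), I(X;Y) = H(Y) - H(Y|X) can be
computed directly:

```python
import numpy as np

def entropy_bits(p):
    # Shannon entropy in bits of a probability vector
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(P_X, W):
    # I(X;Y) = H(Y) - H(Y|X) for input distribution P_X and channel matrix W
    P_Y = P_X @ W
    H_Y_given_X = sum(P_X[i] * entropy_bits(W[i]) for i in range(len(P_X)))
    return entropy_bits(P_Y) - H_Y_given_X

delta = 0.11  # crossover probability, chosen arbitrarily
W = np.array([[1 - delta, delta], [delta, 1 - delta]])
I = mutual_information(np.array([0.5, 0.5]), W)
# For uniform input on a binary symmetric channel, I(X;Y) = 1 - h(delta) bits
print(I)
```

With delta = 0.11 this evaluates to roughly 0.5 bits, i.e. about half the raw
channel output is useful for inferring the input.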
The hitting time of clique factors
In a recent paper, Kahn gave the strongest possible, affirmative, answer to
Shamir's problem, which had been open since the late 1970s: Let r ≥ 3 and let
n be divisible by r. Then, in the random r-uniform hypergraph process on n
vertices, as soon as the last isolated vertex disappears, a perfect matching
emerges. In the present work, we transfer this hitting time result to the
setting of clique factors in the random graph process: at the time that the
last vertex joins a copy of the complete graph K_r, the random graph process
contains a K_r-factor. Our proof draws on a novel sequence of couplings,
extending techniques of Riordan and the first author. An analogous result is
proved for clique factors in the s-uniform hypergraph process (s ≥ 3).
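The hitting-time statement can be illustrated, though of course not verified
(the theorem is asymptotic), by a toy simulation of the random graph process
for triangle factors (r = 3); the vertex count n = 9 and the seed are
arbitrary choices:

```python
import itertools
import random

def in_triangle(adj, v, n):
    # Does vertex v lie in at least one triangle of the current graph?
    return any(adj[v][a] and adj[v][b] and adj[a][b]
               for a, b in itertools.combinations(range(n), 2))

def has_k3_factor(adj, vertices):
    # Backtracking: can `vertices` be partitioned into triangles?
    if not vertices:
        return True
    v = min(vertices)
    for a, b in itertools.combinations(sorted(vertices - {v}), 2):
        if adj[v][a] and adj[v][b] and adj[a][b]:
            if has_k3_factor(adj, vertices - {v, a, b}):
                return True
    return False

random.seed(1)
n = 9  # must be divisible by 3 for a triangle factor to exist
adj = [[False] * n for _ in range(n)]
edges = list(itertools.combinations(range(n), 2))
random.shuffle(edges)  # the random graph process adds edges in random order

t_cover = t_factor = None
for t, (u, v) in enumerate(edges, start=1):
    adj[u][v] = adj[v][u] = True
    if t_cover is None and all(in_triangle(adj, w, n) for w in range(n)):
        t_cover = t  # the last vertex has just joined a triangle
    # A factor covers every vertex, so only check once the cover exists
    if t_cover is not None and t_factor is None and has_k3_factor(adj, set(range(n))):
        t_factor = t  # a K_3-factor has just emerged
        break

print(t_cover, t_factor)
```

Trivially t_cover ≤ t_factor; the theorem says that for large n the two
hitting times coincide with high probability.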
Inference and Mutual Information on Random Factor Graphs
Random factor graphs provide a powerful framework for the study of inference problems such as decoding problems or the stochastic block model. Information-theoretically, the key quantity of interest is the mutual information between the observed factor graph and the underlying ground truth around which the factor graph was created; in the stochastic block model, this would be the planted partition. The mutual information gauges whether and how well the ground truth can be inferred from the observable data. For a very general model of random factor graphs we verify a formula for the mutual information predicted by physics techniques. As an application we prove a conjecture about low-density generator matrix codes from [Montanari: IEEE Transactions on Information Theory 2005]. Further applications include phase transitions of the stochastic block model and the mixed k-spin model from physics.