Sharp Bounds for Optimal Decoding of Low Density Parity Check Codes
Consider communication over a binary-input memoryless output-symmetric
channel with low density parity check (LDPC) codes and maximum a posteriori
(MAP) decoding. The replica method of spin glass theory allows one to conjecture an
analytic formula for the average input-output conditional entropy per bit in
the infinite block length limit. Montanari proved a lower bound for this
entropy, in the case of LDPC ensembles with a convex check degree polynomial,
which matches the replica formula. Here we extend this lower bound to any
irregular LDPC ensemble. The new feature of our work is an analysis of the
second derivative of the conditional input-output entropy with respect to
noise. A close relation arises between this second derivative and the correlation,
or mutual information, of code bits. This allows us to extend the reach of the
interpolation method; in particular, we show how channel symmetry can be used to
control the fluctuations of the overlap parameters.
Comment: 40 pages, submitted to IEEE Transactions on Information Theory.
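For orientation, a minimal restatement of the quantity involved (the notation below is assumed here, not taken from the abstract): for a code of block length $n$ with transmitted codeword $\underline{X}$ and channel output $\underline{Y}$, the object of study is the per-bit conditional entropy and its block-length limit,
\[
h_n = \frac{1}{n}\,\mathbb{E}\, H(\underline{X} \mid \underline{Y}), \qquad h = \lim_{n\to\infty} h_n,
\]
where the expectation is over the code ensemble. The replica method produces a single-letter variational expression conjectured to equal $h$, and the interpolation argument described above shows that this expression lower-bounds $h$ for any irregular ensemble.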
Bottleneck Problems: Information and Estimation-Theoretic View
Information bottleneck (IB) and privacy funnel (PF) are two closely related
optimization problems that have found applications in machine learning, the design
of privacy algorithms, capacity problems (e.g., Mrs. Gerber's Lemma), and strong
data processing inequalities, among others. In this work, we first investigate
the functional properties of IB and PF through a unified theoretical framework.
We then connect them to three information-theoretic coding problems, namely
hypothesis testing against independence, noisy source coding and dependence
dilution. Leveraging these connections, we prove a new cardinality bound for
the auxiliary variable in IB, making its computation more tractable for
discrete random variables.
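For concreteness, one common formulation of the two problems (the variable roles below follow a standard convention and are an assumption, not a quotation from the abstract): given a joint distribution $P_{XY}$ and a randomized map $P_{U\mid X}$, so that $Y - X - U$ is a Markov chain,
\[
\mathrm{IB}(R) = \sup_{P_{U\mid X}:\, I(U;X) \le R} I(U;Y), \qquad \mathrm{PF}(r) = \inf_{P_{U\mid X}:\, I(U;X) \ge r} I(U;Y).
\]
In IB, $U$ is a compressed representation of $X$ that should retain information about the relevant variable $Y$; in PF, $U$ is a released version of $X$ that should preserve utility about $X$ while leaking little about the sensitive variable $Y$. The cardinality bound mentioned above limits the alphabet size needed for the auxiliary variable $U$.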
In the second part, we introduce a general family of optimization problems,
termed \textit{bottleneck problems}, by replacing the mutual information in IB
and PF with other notions of mutual information, namely $f$-information and
Arimoto's mutual information. We then argue that, unlike IB and PF, these
problems lead to easily interpretable guarantees in a variety of inference tasks
with statistical constraints on accuracy and privacy. Although the underlying
optimization problems are non-convex, we develop a technique to evaluate
bottleneck problems in closed form by equivalently expressing them in terms of
the lower convex or upper concave envelopes of certain functions. By applying this
technique to the binary case, we derive closed-form expressions for several
bottleneck problems.
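For reference, the two substituted measures admit standard definitions, reproduced here for context rather than quoted from the abstract. For a convex function $f$ with $f(1) = 0$, the $f$-information is the $f$-divergence between the joint distribution and the product of its marginals,
\[
I_f(X;U) = D_f\big(P_{XU} \,\|\, P_X P_U\big),
\]
while Arimoto's mutual information of order $\alpha$ is
\[
I_\alpha^{\mathrm{A}}(X;U) = H_\alpha(X) - H_\alpha^{\mathrm{A}}(X \mid U), \qquad H_\alpha^{\mathrm{A}}(X \mid U) = \frac{\alpha}{1-\alpha}\, \log \sum_{u} \Big( \sum_{x} P_{XU}(x,u)^{\alpha} \Big)^{1/\alpha},
\]
with $H_\alpha$ the R\'enyi entropy of order $\alpha$. Both recover Shannon mutual information in the appropriate limits ($f(t) = t\log t$ and $\alpha \to 1$, respectively).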
Generalization Bounds: Perspectives from Information Theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization.
Over the past decades, the PAC-Bayesian approach has been established as a
flexible framework to address the generalization capabilities of machine
learning algorithms and to design new ones. Recently, it has garnered increased
interest due to its potential applicability for a variety of learning
algorithms, including deep neural networks. In parallel, an
information-theoretic view of generalization has developed, wherein the
relation between generalization and various information measures has been
established. This framework is intimately connected to the PAC-Bayesian
approach, and a number of results have been independently discovered in both
strands. In this monograph, we highlight this strong connection and present a
unified treatment of generalization. We present techniques and results that the
two perspectives have in common, and discuss the approaches and interpretations
that differ. In particular, we demonstrate how many proofs in the area share a
modular structure, through which the underlying ideas can be intuited. We pay
special attention to the conditional mutual information (CMI) framework;
analytical studies of the information complexity of learning algorithms; and
the application of the proposed methods to deep learning. This monograph is
intended to provide a comprehensive introduction to information-theoretic
generalization bounds and their connection to PAC-Bayes, serving as a
foundation from which the most recent developments are accessible. It is aimed
broadly towards researchers with an interest in generalization and theoretical
machine learning.
Comment: 222 pages.
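As one illustration of the type of result covered, consider the well-known mutual information bound of Xu and Raginsky, restated here for context rather than quoted from the monograph: for a learning algorithm mapping an i.i.d. training sample $S = (Z_1, \dots, Z_n)$ to a hypothesis $W$, with a loss $\ell(w, Z)$ that is $\sigma$-sub-Gaussian for every $w$, the expected generalization gap satisfies
\[
\big| \mathbb{E}\big[ L_\mu(W) - L_S(W) \big] \big| \le \sqrt{\frac{2\sigma^2\, I(W;S)}{n}},
\]
where $L_\mu$ and $L_S$ denote the population and empirical risks. The CMI framework highlighted above replaces $I(W;S)$ with a conditional mutual information relative to a fixed supersample, which remains bounded even when $I(W;S)$ is infinite.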