Linear Information Coupling Problems
Many network information theory problems face a similar difficulty of
single-letterization. We argue that this is due to the lack of a geometric
structure on the space of probability distributions. In this paper, we develop
such a structure by assuming that the distributions of interest are close to
each other. Under this assumption, the K-L divergence reduces to a squared
Euclidean metric in a Euclidean space. Moreover, we construct notions of
coordinates and inner products, which will facilitate solving communication
problems. We will also present the application of this approach to the
point-to-point channel and the general broadcast channel, which demonstrates
how our technique simplifies information theory problems.
Comment: To appear, IEEE International Symposium on Information Theory, July,
201
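The local reduction of K-L divergence to a squared Euclidean metric described above can be sketched numerically. The following is a hypothetical illustration (not code from the paper): for a perturbation P = Q + eps*J with a zero-sum direction J, D(P||Q) agrees to second order with the weighted squared distance (1/2)*sum((P-Q)^2/Q).

```python
import numpy as np

# Sketch (not from the paper): when P is close to Q, the K-L divergence
# is approximated by a squared Euclidean (chi-squared-type) metric:
#   D(P || Q) ~ (1/2) * sum_i (P_i - Q_i)^2 / Q_i
def kl(p, q):
    """Exact K-L divergence D(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def local_quadratic(p, q):
    """Second-order (squared Euclidean metric) approximation of D(p || q)."""
    return 0.5 * float(np.sum((p - q) ** 2 / q))

q = np.array([0.5, 0.3, 0.2])            # reference distribution
j = np.array([0.02, -0.01, -0.01])       # small zero-sum perturbation
p = q + j                                # nearby distribution

print(kl(p, q), local_quadratic(p, q))   # the two agree to second order
```

The agreement degrades as the perturbation grows, which is why the paper's geometric structure is built under the assumption that the distributions of interest are close to each other.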
The geometry of the Toda equation
I show that solutions of the SU(infinity) Toda field equation generating a
fixed Einstein-Weyl space are governed by a linear equation on the
Einstein-Weyl space. From this, obstructions to the existence of Toda solutions
generating a given Einstein-Weyl space are found. I also give a classification
of Einstein-Weyl spaces arising from the Toda equation in more than one way.
This classification coincides with a class of spaces found by Ward and hence
clarifies some of their properties. I end by discussing the simplest examples.
Comment: AMS-LaTeX 11 pages; minor changes to title, keywords and reference
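For reference, the SU(infinity) Toda field equation discussed above is commonly written (in one standard convention, not quoted from this paper) as

```latex
u_{xx} + u_{yy} + \left(e^{u}\right)_{zz} = 0,
```

with a solution u generating a three-dimensional Einstein-Weyl space carrying the conformal metric e^{u}(dx^2 + dy^2) + dz^2; sign and coordinate conventions vary across the literature.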
Supersymmetric Quantum Mechanics and Super-Lichnerowicz Algebras
We present supersymmetric, curved space, quantum mechanical models based on
deformations of a parabolic subalgebra of osp(2p+2|Q). The dynamics are
governed by a spinning particle action whose internal coordinates are Lorentz
vectors labeled by the fundamental representation of osp(2p|Q). The states of
the theory are tensors or spinor-tensors on the curved background while
conserved charges correspond to the various differential geometry operators
acting on these. The Hamiltonian generalizes Lichnerowicz's wave/Laplace
operator. It is central, and the models are supersymmetric whenever the
background is a symmetric space, although there is an osp(2p|Q) superalgebra
for any curved background. The lowest purely bosonic example (2p,Q)=(2,0)
corresponds to a deformed Jacobi group and describes Lichnerowicz's original
algebra of constant curvature, differential geometric operators acting on
symmetric tensors. The case (2p,Q)=(0,1) is simply the {\cal N}=1 superparticle
whose supercharge amounts to the Dirac operator acting on spinors. The
(2p,Q)=(0,2) model is the {\cal N}=2 supersymmetric quantum mechanics
corresponding to differential forms. (This latter pair of models are
supersymmetric on any Riemannian background.) When Q is odd, the models apply
to spinor-tensors. The (2p,Q)=(2,1) model is distinguished by admitting a
central Lichnerowicz-Dirac operator when the background is constant curvature.
The new supersymmetric models are novel in that the Hamiltonian is not just a
square of supercharges, but rather a sum of commutators of supercharges and
commutators of bosonic charges. These models and superalgebras are a very
useful tool for any study involving high rank tensors and spinors on manifolds.
Comment: 39 pages, LaTeX, fixed typos, added refs, final version to appear in
CM
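The (2p,Q)=(0,2) case mentioned above is the standard N=2 supersymmetric quantum mechanics on differential forms. In that well-known model (a textbook presentation, not taken verbatim from this paper) the supercharges are the exterior derivative and the codifferential:

```latex
Q = d, \qquad \bar{Q} = \delta = d^{\dagger}, \qquad
Q^{2} = \bar{Q}^{2} = 0, \qquad
H = \{Q, \bar{Q}\} = d\delta + \delta d = \Delta ,
```

so the Hamiltonian is the Laplace-de Rham operator acting on forms; the Lichnerowicz wave/Laplace operator referred to in the abstract generalizes this to the other (2p,Q) models.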
The Linear Information Coupling Problems
Many network information theory problems face a similar difficulty of
single-letterization. We argue that this is due to the lack of a geometric
structure on the space of probability distributions. In this paper, we develop
such a structure by assuming that the distributions of interest are close to
each other. Under this assumption, the K-L divergence reduces to a squared
Euclidean metric in a Euclidean space. In addition, we construct notions of
coordinates and inner products, which will facilitate solving communication
problems. We will present the application of this approach to the
point-to-point channel, general broadcast channel, and the multiple access
channel (MAC) with a common source. With this approach, information theory
problems, such as single-letterization, can be reduced to linear algebra
problems. Moreover, we show that for the general
broadcast channel, transmitting the common message to receivers can be
formulated as the trade-off between linear systems. We also provide an example
to visualize this trade-off in a geometric way. Finally, for the MAC with a
common source, we observe a coherent combining gain due to the cooperation
between transmitters, and this gain can be quantified by applying our
technique.
Comment: 27 pages, submitted to IEEE Transactions on Information Theor
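The coherent combining gain mentioned above can be illustrated with a toy calculation (a hypothetical sketch, not the paper's quantification): when two transmitters send the same signal in phase, the received amplitudes add before squaring, so the received power scales as (a1 + a2)^2 rather than a1^2 + a2^2.

```python
import numpy as np

# Toy illustration of coherent combining for two cooperating transmitters
# sending a common message with unit amplitudes.
a1, a2 = 1.0, 1.0
coherent_power = (a1 + a2) ** 2          # amplitudes add, then square
independent_power = a1 ** 2 + a2 ** 2    # powers add (no cooperation)

# Gain from cooperation, in decibels.
gain_db = 10 * np.log10(coherent_power / independent_power)
print(gain_db)  # ~3.01 dB for equal unit amplitudes
```

For equal amplitudes this doubles the received power, the familiar 3 dB combining gain; the paper's contribution is quantifying the analogous gain within its linear information coupling framework.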
Recent Advances in Transfer Learning for Cross-Dataset Visual Recognition: A Problem-Oriented Perspective
This paper takes a problem-oriented perspective and presents a comprehensive
review of transfer learning methods, both shallow and deep, for cross-dataset
visual recognition. Specifically, it categorises the cross-dataset recognition
into seventeen problems based on a set of carefully chosen data and label
attributes. Such a problem-oriented taxonomy has allowed us to examine how
different transfer learning approaches tackle each problem and how well each
problem has been researched to date. This comprehensive problem-oriented
review of the advances in transfer learning has revealed not only the
challenges in transfer learning for visual recognition, but also the problems
(eight of the seventeen) that have scarcely been studied. This survey not only
presents an up-to-date technical review for
researchers, but also a systematic approach and a reference for a machine
learning practitioner to categorise a real problem and to look up a possible
solution accordingly.
Generalized Bregman Divergence and Gradient of Mutual Information for Vector Poisson Channels
We investigate connections between information-theoretic and
estimation-theoretic quantities in vector Poisson channel models. In
particular, we generalize the gradient of mutual information with respect to
key system parameters from the scalar to the vector Poisson channel model. We
also propose, as another contribution, a generalization of the classical
Bregman divergence that offers a means to encapsulate under a unifying
framework the gradient of mutual information results for scalar and vector
Poisson and Gaussian channel models. The so-called generalized Bregman
divergence is also shown to exhibit various properties akin to the properties
of the classical version. The vector Poisson channel model is drawing
considerable attention in view of its application in various domains: as an
example, the availability of the gradient of mutual information can be used in
conjunction with gradient descent methods to effect compressive-sensing
projection designs in emerging X-ray and document classification applications.
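The classical Bregman divergence that the paper generalizes can be sketched as follows (a standard textbook definition, not the paper's generalized version): D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>, which recovers the squared Euclidean distance and the K-L divergence for suitable choices of phi.

```python
import numpy as np

# Classical Bregman divergence for a convex generator phi:
#   D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
def bregman(phi, grad_phi, x, y):
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.3, 0.4, 0.3])

# phi(v) = ||v||^2 recovers the squared Euclidean distance ...
sq = bregman(lambda v: np.dot(v, v), lambda v: 2 * v, x, y)
print(sq)  # equals sum((x - y)^2)

# ... while negative entropy phi(v) = sum v log v recovers the K-L
# divergence when x and y are probability vectors (same total mass).
negentropy = lambda v: float(np.sum(v * np.log(v)))
grad_negentropy = lambda v: np.log(v) + 1
kl = bregman(negentropy, grad_negentropy, x, y)
print(kl)  # equals sum(x * log(x / y))
```

Both special cases fall under the single Bregman framework, which is the unifying role the generalized divergence plays for the Gaussian and Poisson gradient-of-mutual-information results in the paper.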