On Finsler entropy of smooth distributions and Stefan-Sussmann foliations
Using the definition of entropy of a family of increasing distances on a
compact metric set given in [10], we introduce a notion of Finsler entropy for
smooth distributions and Stefan-Sussmann foliations. This concept generalizes
most classical notions of topological entropy on a compact Riemannian manifold: the
entropy of a flow ([9]), of a regular foliation ([11]), of a regular
distribution ([5]), and of a geometrical structure ([22]). The essential results
of this paper are the nullity of the Finsler entropy for a controllable
distribution and for a singular Riemannian foliation.
Entropy of geometric structures
We give a notion of entropy for general geometric structures, which
generalizes well-known notions of topological entropy of vector fields and
geometric entropy of foliations, and which can also be applied to singular
objects, e.g. singular foliations, singular distributions, and Poisson
structures. We show some basic properties of this entropy, including the
\emph{additivity property}, analogous to the additivity of Clausius--Boltzmann
entropy in physics. In the case of Poisson structures, entropy is a new
invariant of dynamical nature, which is related to the transverse structure of
the characteristic foliation by symplectic leaves.
Comment: The results of this paper were announced in a talk last year at IMPA,
Rio (Poisson 2010)
Singular Value Decomposition and Entropy Dimension of Fractals
We analyze the singular value decomposition (SVD) and SVD entropy of Cantor
fractals produced by the Kronecker product. Our primary results show that SVD
entropy is a measure of image ``complexity dimension'' that is invariant under
the number of Kronecker-product self-iterations (i.e., fractal order). SVD
entropy is therefore similar to the fractal Hausdorff complexity dimension but
suitable for characterizing fractal wave phenomena. Our field-based
normalization (Rényi entropy index = 1) illustrates the uncommon step-shaped
and cluster-patterned distributions of the fractal singular values and their
SVD entropy. As a modal measure of complexity, SVD entropy has uses for a
variety of wireless communication, free-space optical, and remote sensing
applications.
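The invariance described above can be illustrated numerically: singular values of a Kronecker product are products of singular values, so SVD entropy is additive in the fractal order, and entropy per order is constant. The sketch below assumes a particular Cantor-like seed matrix and the normalization s_i / Σ_j s_j for illustration; the paper's exact definitions may differ.

```python
import numpy as np

def svd_entropy(M):
    """Shannon entropy of the normalized singular-value distribution of M."""
    s = np.linalg.svd(M, compute_uv=False)
    p = s / s.sum()            # one common normalization; the paper's may differ
    p = p[p > 1e-12]           # drop numerically zero singular values
    return float(-np.sum(p * np.log(p)))

# Cantor-like 3x3 seed (an illustrative choice, not the paper's fractal)
seed = np.array([[1.0, 0.0, 1.0],
                 [0.0, 1.0, 0.0],
                 [1.0, 0.0, 1.0]])

M = seed.copy()
for order in (1, 2, 3):
    # Singular values of kron(A, B) are all products s_i(A) * s_j(B), so the
    # entropy is additive across self-iterations and h / order stays constant.
    h = svd_entropy(M)
    print(order, h, h / order)
    M = np.kron(M, seed)
```

Running this shows h growing linearly with the order while h / order remains fixed, which is the kind of per-order invariant the abstract refers to as a complexity dimension.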
Distributional Property Testing in a Quantum World
A fundamental problem in statistics and learning theory is to test properties of distributions. We show that quantum computers can solve such problems with significant speed-ups. We also introduce a novel access model for quantum distributions, enabling the coherent preparation of quantum samples, and propose a general framework that can naturally handle both classical and quantum distributions in a unified manner. Our framework generalizes and improves previous quantum algorithms for testing closeness between unknown distributions, testing independence between two distributions, and estimating the Shannon / von Neumann entropy of distributions. For classical distributions our algorithms significantly improve the precision dependence of some earlier results. We also show that in our framework procedures for classical distributions can be directly lifted to the more general case of quantum distributions, and thus obtain the first speed-ups for testing properties of density operators that can be accessed coherently rather than only via sampling.
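For orientation, the classical sampling baseline that such quantum algorithms are compared against can be sketched as a plug-in (maximum-likelihood) Shannon entropy estimator over i.i.d. samples. This is only an illustrative classical sketch, not the paper's quantum procedure or access model.

```python
import numpy as np

def empirical_entropy(samples, base=2.0):
    """Plug-in Shannon entropy estimate (bits by default) from i.i.d. samples."""
    _, counts = np.unique(np.asarray(samples), return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(base))

rng = np.random.default_rng(0)
samples = rng.integers(0, 4, size=100_000)   # uniform over 4 symbols
print(empirical_entropy(samples))            # close to log2(4) = 2 bits
```

The precision of such a plug-in estimate improves only polynomially with the number of samples, which is the dependence the quantum framework is said to improve.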
On the robustness of q-expectation values and Rényi entropy
We study the robustness of functionals of probability distributions, such as
the Rényi and nonadditive S_q entropies, as well as the q-expectation values,
under small variations of the distributions. We focus on three important types
of distribution functions, namely (i) continuous bounded, (ii) discrete with a
finite number of states, and (iii) discrete with an infinite number of states. The
physical concept of robustness is contrasted with the mathematically stronger
conditions of stability and Lesche-stability for functionals. We explicitly
demonstrate that, in the case of continuous distributions, once unbounded
distributions and those leading to negative entropy are excluded, both the Rényi
and nonadditive S_q entropies, as well as the q-expectation values, are robust.
For the discrete finite case, the Rényi and nonadditive S_q entropies and the
q-expectation values are robust. For the infinite discrete case, where the
Rényi entropy and the q-expectations are known to violate Lesche-stability and
stability, respectively, we show that one can nevertheless state conditions
which guarantee physical robustness.
Comment: 6 pages, to appear in Euro Phys Let
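A minimal numerical sketch of the discrete finite case (ii): perturb a distribution slightly and observe that both the Rényi and the nonadditive S_q entropies change only by an amount of the same order as the perturbation. The specific distribution, perturbation size, and q values below are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(p, q):
    """Rényi entropy of order q (q != 1) of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def tsallis_entropy(p, q):
    """Nonadditive (Tsallis) entropy S_q of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.array([0.5, 0.25, 0.25])
eps = 1e-4                                    # small variation of the distribution
p_eps = np.array([0.5 + eps, 0.25 - eps, 0.25])

for q in (0.5, 2.0):
    dR = abs(renyi_entropy(p_eps, q) - renyi_entropy(p, q))
    dS = abs(tsallis_entropy(p_eps, q) - tsallis_entropy(p, q))
    print(q, dR, dS)                          # both changes are O(eps)
```

This is robustness in the loose numerical sense only; the Lesche-stability conditions discussed in the abstract are uniform statements over all distributions and state spaces, which a single example cannot establish.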
Optimal coding and the origins of Zipfian laws
The problem of compression in standard information theory consists of
assigning codes as short as possible to numbers. Here we consider the problem
of optimal coding -- under an arbitrary coding scheme -- and show that it
predicts Zipf's law of abbreviation, namely a tendency in natural languages for
more frequent words to be shorter. We apply this result to investigate optimal
coding also under so-called non-singular coding, a scheme where unique
segmentation of concatenated codes is not guaranteed but each number receives a distinct code. Optimal
non-singular coding predicts that the length of a word should grow
approximately as the logarithm of its frequency rank, which is again consistent
with Zipf's law of abbreviation. Optimal non-singular coding in combination
with the maximum entropy principle also predicts Zipf's rank-frequency
distribution. Furthermore, our findings on optimal non-singular coding
challenge common beliefs about random typing. It turns out that random typing
is in fact an optimal coding process, in stark contrast with the common
assumption that it is detached from cost cutting considerations. Finally, we
discuss the implications of optimal coding for the construction of a compact
theory of Zipfian laws and other linguistic laws.
Comment: in press in the Journal of Quantitative Linguistics; definition of
concordant pair corrected, proofs polished, references updated
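The optimal non-singular assignment described above can be made concrete: hand out all distinct strings over a D-letter alphabet, shortest first, to words in decreasing order of frequency. The helper below is a hypothetical illustration (binary alphabet, D = 2, assumed), showing code length growing roughly like the logarithm of the frequency rank, as in Zipf's law of abbreviation.

```python
import math

def nonsingular_code_length(rank, D=2):
    """Length of the string assigned to the word of frequency rank `rank`
    (1-based) when all distinct strings over a D-letter alphabet are
    handed out in order of increasing length (shortest codes to the most
    frequent words)."""
    # There are D**L distinct strings of length exactly L.
    L, covered = 1, 0
    while True:
        covered += D ** L
        if rank <= covered:
            return L
        L += 1

for r in (1, 2, 4, 8, 64, 1024):
    print(r, nonsingular_code_length(r), math.log(r, 2))  # length ~ log_2(rank)
```

For D = 2, ranks 1-2 get length 1, ranks 3-6 get length 2, ranks 7-14 get length 3, and so on, so the length of rank r is approximately log_2(r), consistent with the abstract's claim.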
Complex-Valued Random Vectors and Channels: Entropy, Divergence, and Capacity
Recent research has demonstrated significant achievable performance gains by
exploiting circularity/non-circularity or properness/improperness of
complex-valued signals. In this paper, we investigate the influence of these
properties on important information theoretic quantities such as entropy,
divergence, and capacity. We prove two maximum entropy theorems that strengthen
previously known results. The proof of the former theorem is based on the
so-called circular analog of a given complex-valued random vector. Its
introduction is supported by a characterization theorem that employs a minimum
Kullback-Leibler divergence criterion. In the proof of the latter theorem, on the
other hand, results about the second-order structure of complex-valued random
vectors are exploited. Furthermore, we address the capacity of multiple-input
multiple-output (MIMO) channels. Regardless of the specific distribution of the
channel parameters (noise vector and channel matrix, if modeled as random), we
show that the capacity-achieving input vector is circular for a broad range of
MIMO channels (including coherent and noncoherent scenarios). Finally, we
investigate the situation of an improper and Gaussian distributed noise vector.
We compute both capacity and capacity-achieving input vector and show that
improperness increases capacity, provided that the complementary covariance
matrix is exploited. Otherwise, a capacity loss occurs, for which we derive an
explicit expression.
Comment: 33 pages, 1 figure, slightly modified version of first paper revision
submitted to IEEE Trans. Inf. Theory on October 31, 201
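The scalar version of the proper-vs-improper comparison can be sketched with the standard augmented-covariance entropy formula for a complex Gaussian: for fixed variance, the proper (circular) case maximizes differential entropy, and any nonzero pseudo-variance strictly lowers it. The function below is an illustrative helper under this standard formula, not the paper's derivation.

```python
import numpy as np

def h_complex_gaussian(var, pvar):
    """Differential entropy (nats) of a zero-mean scalar complex Gaussian z with
    variance E|z|^2 = var and pseudo-variance E[z^2] = pvar (improper if pvar != 0)."""
    assert abs(pvar) < var, "augmented covariance must be positive definite"
    # h = log(pi e) + 0.5 * log det([[var, pvar], [conj(pvar), var]]);
    # pvar = 0 recovers the proper (circular) case h = log(pi e var).
    return float(np.log(np.pi * np.e) + 0.5 * np.log(var ** 2 - abs(pvar) ** 2))

print(h_complex_gaussian(1.0, 0.0))   # proper case: log(pi e) ~ 2.1447 nats
print(h_complex_gaussian(1.0, 0.5))   # improperness strictly lowers entropy
```

This entropy penalty is exactly why an improper noise vector can raise capacity when the receiver exploits the complementary covariance matrix, and cost capacity when it does not.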