Power-law distributions and Fisher's information measure
We show that thermodynamic uncertainties (TU) preserve their form in
passing from Boltzmann-Gibbs statistics to Tsallis statistics, provided
that the TU are expressed in terms of the appropriate variable conjugate
to the temperature in a nonextensive context.
Security of quantum bit string commitment depends on the information measure
Unconditionally secure non-relativistic bit commitment is known to be
impossible in both the classical and the quantum world. However, when
committing to a string of n bits at once, how far can we stretch the quantum
limits? In this letter, we introduce a framework of quantum schemes where Alice
commits a string of n bits to Bob, in such a way that she can only cheat on a
bits and Bob can learn at most b bits of information before the reveal phase.
Our results are two-fold: we show by an explicit construction that in the
traditional approach, where the reveal and guess probabilities form the
security criteria, no good schemes can exist: a+b is at least n. If, however,
we use a more liberal criterion of security, the accessible information, we
construct schemes where a=4 log n+O(1) and b=4, which is impossible
classically. Our findings significantly extend known no-go results for quantum
bit commitment.
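As a hedged sketch of how the two parameters a and b are typically formalized (the precise definitions appear in quant-ph/0504078; the notation below is assumed, not quoted from the paper):

```latex
% Binding (Alice): her probabilities of successfully revealing the
% various strings x may sum to at most 2^a, i.e. she can cheat on at
% most a bits:
\sum_{x \in \{0,1\}^n} p_x^{\text{reveal}} \le 2^{a} .
% Concealing (Bob): before the reveal phase, his knowledge of the
% committed string X is bounded; under the liberal criterion this is
% the accessible information of his quantum system B:
I_{\text{acc}}(X;B) \le b .
```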
Information measure for financial time series: quantifying short-term market heterogeneity
A well-interpretable measure of information has been recently proposed based
on a partition obtained by intersecting a random sequence with its moving
average. The partition yields disjoint sets of the sequence, which are then
ranked according to their size to form a probability distribution function and
finally fed into the expression of the Shannon entropy. In this work, this
entropy measure is applied to the time series of prices and volatilities of
six financial markets. The analysis is performed on tick-by-tick data
sampled every minute over six years, from 1999 to 2004, for a broad range
of moving average windows and volatility horizons. The study shows that the
entropy of the volatility series depends on the individual market, while the
entropy of the price series is practically invariant across the six
markets. Finally, a cumulative information measure, the `Market Heterogeneity
Index', is derived from the integral of the proposed entropy measure. The
values of the Market Heterogeneity Index are discussed as possible tools for
optimal portfolio construction and compared with those obtained by using the
Sharpe ratio, a traditional risk-diversity measure.
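The partition-entropy construction described above lends itself to a short sketch. The following Python fragment is one plausible reading of the abstract (the function name, the simple-moving-average choice, and the ranking-based normalization are assumptions, not the authors' code):

```python
import numpy as np

def ma_crossing_entropy(x, window):
    """Shannon entropy of the partition obtained by intersecting a
    sequence with its moving average (one reading of the construction
    described in the abstract; not the authors' reference code)."""
    x = np.asarray(x, dtype=float)
    # Simple moving average; the exact averaging scheme is an assumption.
    ma = np.convolve(x, np.ones(window) / window, mode="valid")
    tail = x[window - 1:]              # align the series with its MA
    sign = np.sign(tail - ma)
    # Boundaries where the series crosses its moving average.
    crossings = np.where(np.diff(sign) != 0)[0] + 1
    # Disjoint segments of the sequence between consecutive crossings.
    bounds = np.concatenate(([0], crossings, [tail.size]))
    lengths = np.diff(bounds)
    lengths = lengths[lengths > 0]
    # Rank the segment sizes, normalize them into a distribution, and
    # feed it into the Shannon entropy.
    p = np.sort(lengths)[::-1] / lengths.sum()
    return float(-np.sum(p * np.log(p)))
```

A call such as `ma_crossing_entropy(prices, window=60)` would give the entropy for one moving-average window; scanning `window` corresponds to the "broad range of moving average windows" mentioned in the study.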
Net Fisher information measure versus ionization potential and dipole polarizability in atoms
The net Fisher information measure, defined as the product of position and
momentum Fisher information measures and derived from the non-relativistic
Hartree-Fock wave functions for atoms with Z=1-102, is found to correlate well
with the inverse of the experimental ionization potential. Strong direct
correlations of the net Fisher information are also reported for the static
dipole polarizability of atoms with Z=1-88. The complexity measure, defined as
the ratio of the net Onicescu information measure and net Fisher information,
exhibits clearly marked regions corresponding to the periodicity of the atomic
shell structure. The reported correlations highlight the need for using the net
information measures in addition to either the position or momentum space
analogues. With reference to the correlation of the experimental properties
considered here, the net Fisher information measure is found to be superior
to the net Shannon information entropy.
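For concreteness, the quantities named above can be written out as follows (standard definitions; the symbols are assumed rather than taken from the paper):

```latex
% Position- and momentum-space Fisher information of the electron
% densities \rho(\mathbf{r}) and \gamma(\mathbf{p}):
I_\rho = \int \frac{|\nabla\rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\, d\mathbf{r},
\qquad
I_\gamma = \int \frac{|\nabla\gamma(\mathbf{p})|^{2}}{\gamma(\mathbf{p})}\, d\mathbf{p}.
% Net Fisher information: the product of the two single-space measures.
I = I_\rho\, I_\gamma .
% Onicescu information (disequilibrium) in each space and its net form:
E_\rho = \int \rho^{2}(\mathbf{r})\, d\mathbf{r}, \quad
E_\gamma = \int \gamma^{2}(\mathbf{p})\, d\mathbf{p}, \quad
E = E_\rho\, E_\gamma .
% Complexity measure used above: ratio of net Onicescu to net Fisher.
C = E / I .
```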
Unique additive information measures - Boltzmann-Gibbs-Shannon, Fisher and beyond
It is proved that the only additive and isotropic information measure that
can depend on the probability distribution and also on its first derivative is
a linear combination of the Boltzmann-Gibbs-Shannon and Fisher information
measures. Power-law equilibrium distributions are found as a result of the
interaction of the two terms. The case of second-order derivative dependence
is investigated and a corresponding additive information measure is given.
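A sketch of the form singled out by the uniqueness result, with \lambda_1 and \lambda_2 as assumed coefficient names:

```latex
% The only additive, isotropic measure depending on p and \nabla p:
% a linear combination of the Boltzmann-Gibbs-Shannon and Fisher terms.
S[p] = \lambda_1 \int p(\mathbf{x}) \ln p(\mathbf{x})\, d\mathbf{x}
     + \lambda_2 \int \frac{|\nabla p(\mathbf{x})|^{2}}{p(\mathbf{x})}\, d\mathbf{x}.
```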