A survey of uncertainty principles and some signal processing applications
The goal of this paper is to review the main trends in the domain of
uncertainty principles and localization, emphasize their mutual connections, and
investigate practical consequences. The discussion is strongly oriented
towards, and motivated by, signal processing problems, in which significant
advances have been made recently. Relations with sparse approximation and
coding problems are emphasized.
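The link between uncertainty principles and sparsity mentioned above can be illustrated with the discrete (Donoho–Stark) uncertainty principle, a standard result in this area: a nonzero length-N signal cannot be sparse in both time and frequency, since the counts of nonzero time samples and nonzero DFT coefficients satisfy N_t · N_f ≥ N. A minimal numerical check (the "picket fence" signal used here is the classic extremal example, not taken from the paper):

```python
import numpy as np

# Donoho-Stark discrete uncertainty principle: for a nonzero signal of
# length N, the number of nonzero time samples N_t and nonzero DFT
# coefficients N_f satisfy N_t * N_f >= N.
N = 64
x = np.zeros(N)
x[::8] = 1.0  # "picket fence": 8 equally spaced spikes, so N_t = 8

X = np.fft.fft(x)  # DFT of a spike comb is again a spike comb
N_t = np.count_nonzero(np.abs(x) > 1e-9)
N_f = np.count_nonzero(np.abs(X) > 1e-9)

print(N_t, N_f, N_t * N_f >= N)  # the picket fence saturates the bound
```

For this signal the DFT has exactly N/8 = 8 nonzero coefficients, so the bound N_t · N_f ≥ N holds with equality, which is why such signals appear as extremal cases in the sparse-approximation literature.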
Measurement uncertainty relations
Measurement uncertainty relations are quantitative bounds on the errors in an
approximate joint measurement of two observables. They can be seen as a
generalization of the error/disturbance tradeoff first discussed heuristically
by Heisenberg. Here we prove such relations for the case of two canonically
conjugate observables like position and momentum, and establish a close
connection with the more familiar preparation uncertainty relations
constraining the sharpness of the distributions of the two observables in the
same state. Both sets of relations are generalized to means of arbitrary order
rather than the usual quadratic means, and we show that the optimal constants
are the same for preparation and for measurement uncertainty. The constants are
determined numerically and compared with some bounds in the literature. In both
cases the near-saturation of the inequalities entails that the state (resp.
observable) is uniformly close to a minimizing one.
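As a rough orientation (standard notation, not taken from this paper), the preparation relation constrains the spreads of the two observables in a single state, while the measurement relation constrains the errors of an approximate joint measurement; in the quadratic case both take the familiar form:

```latex
% Preparation uncertainty: spreads of x and p in the same state \rho
\Delta_\rho x \,\Delta_\rho p \;\ge\; \frac{\hbar}{2}

% Measurement uncertainty: \epsilon(x), \epsilon(p) are error measures for
% the two marginals of an approximate joint measurement of x and p
\epsilon(x)\,\epsilon(p) \;\ge\; \frac{\hbar}{2}
```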
Continuous-variable entropic uncertainty relations
Uncertainty relations are central to quantum physics. While they were
originally formulated in terms of variances, they have later been successfully
expressed with entropies following the advent of Shannon information theory.
Here, we review recent results on entropic uncertainty relations involving
continuous variables, such as position and momentum. This includes the
generalization to arbitrary (not necessarily canonically-conjugate) variables
as well as entropic uncertainty relations that take correlations between the variables into
account and admit all Gaussian pure states as minimum uncertainty states. We
emphasize that these continuous-variable uncertainty relations can be
conveniently reformulated in terms of entropy power, a central quantity in the
information-theoretic description of random signals, which makes a bridge with
variance-based uncertainty relations. In this review, we take the quantum
optics viewpoint and consider uncertainties on the amplitude and phase
quadratures of the electromagnetic field, which are isomorphic to position and momentum,
but the formalism applies to all such variables (and linear combinations
thereof) regardless of their physical meaning. Then, in the second part of this
paper, we move on to new results and introduce a tighter entropic uncertainty
relation for two arbitrary vectors of intercommuting continuous variables that
take correlations into account. It is proven conditionally on reasonable
assumptions. Finally, we present some conjectures for new entropic uncertainty
relations involving more than two continuous variables.
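As a point of reference (these are standard results in the continuous-variable literature, not new claims of the paper), the variance-based relation, its entropic counterpart, and the entropy power mentioned above read:

```latex
% Heisenberg (variance) relation for conjugate quadratures x, p:
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}

% Bialynicki-Birula--Mycielski entropic relation, with h(\cdot) the
% differential Shannon entropy of the measured distribution:
h(x) + h(p) \;\ge\; \ln(\pi e \hbar)

% Entropy power of a random variable X; for a Gaussian it equals the
% variance, and in general N(X) \le \sigma_X^2:
N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}
```

Rewriting the entropic relation as a product of entropy powers, N(x) N(p) ≥ ħ²/4, makes the bridge to the variance-based relation explicit, since each entropy power is dominated by the corresponding variance.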
Hirschman optimal transform least mean square adaptive filters.
Abstract not available
Entropic Uncertainty Relations in Quantum Physics
Uncertainty relations have become the trademark of quantum theory since they
were formulated by Bohr and Heisenberg. This review covers various
generalizations and extensions of the uncertainty relations in quantum theory
that involve the R\'enyi and the Shannon entropies. The advantages of these
entropic uncertainty relations are pointed out and their more direct connection
to the observed phenomena is emphasized. Several remaining open problems are
mentioned.
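A representative example of the relations this review covers is the Maassen–Uffink bound for two observables A and B with eigenbases {|a_j⟩} and {|b_k⟩}, stated here in its standard form with Rényi entropies of conjugate orders:

```latex
% Maassen--Uffink entropic uncertainty relation:
H_\alpha(A) + H_\beta(B) \;\ge\; -2 \ln c,
\qquad \frac{1}{\alpha} + \frac{1}{\beta} = 2,
\qquad c = \max_{j,k} \left| \langle a_j | b_k \rangle \right|
```

Taking α = β = 1 recovers the Shannon-entropy version; the bound depends only on the maximal overlap c of the two bases, not on the state, which is one of the advantages over variance-based relations.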
Measurement uncertainty relations for position and momentum: Relative entropy formulation
Heisenberg's uncertainty principle has recently led to general measurement
uncertainty relations for quantum systems: incompatible observables can be
measured jointly or in sequence only with some unavoidable approximation, which
can be quantified in various ways. The relative entropy is the natural
theoretical quantifier of the information loss when a `true' probability
distribution is replaced by an approximating one. In this paper, we provide a
lower bound for the amount of information that is lost by replacing the
distributions of the sharp position and momentum observables, as they could be
obtained with two separate experiments, by the marginals of any smeared joint
measurement. The bound is obtained by introducing an entropic error function,
and optimizing it over a suitable class of covariant approximate joint
measurements. We work out two cases of target observables in full: (1)
position and momentum vectors in arbitrary dimension; (2) two components of position
and momentum along different directions. In (1), we connect the quantum bound
to the dimension of the vectors; in (2), going from parallel to orthogonal directions, we
show the transition from highly incompatible observables to compatible ones.
For simplicity, we develop the theory only for Gaussian states and
measurements.
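For reference, the error quantifier used here is the standard relative entropy (Kullback–Leibler divergence) of the "true" distribution p with respect to its approximation q; it vanishes exactly when the approximating distribution reproduces the true one:

```latex
% Relative entropy of p with respect to q (nonnegative, zero iff p = q):
D(p \,\|\, q) = \int p(x) \,\ln \frac{p(x)}{q(x)} \, dx \;\ge\; 0
```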