Measurement uncertainty relations for position and momentum: Relative entropy formulation
Heisenberg's uncertainty principle has recently led to general measurement
uncertainty relations for quantum systems: incompatible observables can be
measured jointly or in sequence only with some unavoidable approximation, which
can be quantified in various ways. The relative entropy is the natural
theoretical quantifier of the information loss when a `true' probability
distribution is replaced by an approximating one. In this paper, we provide a
lower bound for the amount of information that is lost by replacing the
distributions of the sharp position and momentum observables, as they could be
obtained with two separate experiments, by the marginals of any smeared joint
measurement. The bound is obtained by introducing an entropic error function,
and optimizing it over a suitable class of covariant approximate joint
measurements. We fully work out two cases of target observables: (1)
n-dimensional position and momentum vectors; (2) two components of position
and momentum along different directions. In case (1), we connect the quantum
bound to the dimension n; in case (2), going from parallel to orthogonal
directions, we
show the transition from highly incompatible observables to compatible ones.
For simplicity, we develop the theory only for Gaussian states and
measurements.
Comment: 33 pages
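As a minimal numerical illustration of the entropic quantifier described above (not the paper's actual optimization over covariant measurements): for Gaussian statistics, the relative entropy between the sharp position distribution and a noise-smeared marginal has a closed form. The variances used below are illustrative values, not taken from the paper.

```python
import math

def kl_gaussian(var_true, var_approx):
    """Relative entropy D(N(0, var_true) || N(0, var_approx)) in nats."""
    return 0.5 * (math.log(var_approx / var_true)
                  + var_true / var_approx - 1.0)

# Sharp position statistics of a Gaussian state: variance sigma^2.
sigma2 = 1.0
# A smeared joint measurement convolves them with Gaussian noise of
# variance s2, so its position marginal has variance sigma^2 + s2.
s2 = 0.5
loss = kl_gaussian(sigma2, sigma2 + s2)  # information lost on the position side
```

The loss vanishes only when no noise is added (s2 = 0); a measurement uncertainty relation of the kind discussed above forbids making the corresponding losses for position and momentum vanish simultaneously.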
Entropic measurement uncertainty relations for all the infinite components of a spin vector
The information-theoretic formulation of quantum measurement uncertainty
relations (MURs), based on the notion of relative entropy between measurement
probabilities, is extended to the set of all the spin components for a generic
spin s. For an approximate measurement of a spin vector, which gives
approximate joint measurements of the spin components, we define the device
information loss as the maximum loss of information per observable occurring in
approximating the ideal incompatible components with the joint measurement at
hand. By optimizing on the measuring device, we define the notion of minimum
information loss. By using these notions, we show how to give a significant
formulation of state independent MURs in the case of infinitely many target
observables. The same construction works as well for finitely many observables,
and we study the related MURs for two and three orthogonal spin components. The
minimum information loss also plays the role of a measure of incompatibility,
and in this respect it allows us to compare quantitatively the incompatibility
of various sets of spin observables, with different numbers of involved
components and different values of s.
Comment: 33 pages
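A sketch of the "device information loss" defined above, in the simplest finite case: a spin-1/2 with two orthogonal components. The shrinking of the Bloch components by 1/sqrt(2) is the standard optimal covariant joint measurement of two orthogonal qubit components; it is used here only as a plausible stand-in for the measuring devices the paper optimizes over.

```python
import math

def kl_bernoulli(p, q):
    """Relative entropy D(p||q) between two Bernoulli distributions (nats)."""
    out = 0.0
    if p > 0:
        out += p * math.log(p / q)
    if p < 1:
        out += (1 - p) * math.log((1 - p) / (1 - q))
    return out

def device_information_loss(x, z, eta=1 / math.sqrt(2)):
    """Max information loss per observable for a qubit with Bloch
    components x, z, approximating sharp sigma_x, sigma_z measurements
    by the marginals of a joint measurement that shrinks the Bloch
    vector by eta (eta = 1/sqrt(2): optimal covariant joint device)."""
    p_x, p_z = (1 + x) / 2, (1 + z) / 2          # ideal +1 probabilities
    q_x, q_z = (1 + eta * x) / 2, (1 + eta * z) / 2  # joint-measurement marginals
    return max(kl_bernoulli(p_x, q_x), kl_bernoulli(p_z, q_z))
```

Maximizing this quantity over states and minimizing over devices would give a state-independent MUR in the spirit of the abstract; for the maximally mixed state (x = z = 0) the loss is zero, since both marginals already reproduce the ideal statistics.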
Least singular value and condition number of a square random matrix with i.i.d. rows
We consider a square random matrix made by i.i.d. rows with any distribution
and prove that, for any given dimension, the probability for the least singular
value to be in [0, ε) is at least of order ε. This allows us
to generalize a result about the expectation of the condition number that was
proved in the case of centered Gaussian i.i.d. entries: such an expectation is
always infinite. Moreover, we obtain some additional results for several
well-known random matrix ensembles, in particular for the isotropic
log-concave case, which is proved to behave best in terms of well conditioning
- …
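A quick Monte Carlo sketch of the heavy lower tail of the least singular value that drives the infinite expected condition number. Gaussian i.i.d. entries are assumed here purely for illustration (the result above covers any i.i.d.-row distribution); the values of n, trials and eps are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def least_singular_value(n, rng):
    # Square matrix with i.i.d. rows (here: i.i.d. standard normal entries).
    a = rng.standard_normal((n, n))
    # numpy returns singular values in descending order; take the last.
    return np.linalg.svd(a, compute_uv=False)[-1]

n, trials, eps = 20, 2000, 0.05
svals = np.array([least_singular_value(n, rng) for _ in range(trials)])
# A lower bound P(s_min < eps) >= c * eps makes E[1/s_min], and hence the
# expected condition number, infinite.
frac_below_eps = np.mean(svals < eps)
```

Empirically, the fraction of samples below eps shrinks roughly linearly with eps, which is consistent with a small-ball bound of order ε and hence with a divergent expected condition number.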