Critique of 'Elements of Quantum Probability'
We analyse the thesis of Kümmerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension of classical probability known as quantum probability. We show that their argument depends on hidden assumptions and a highly restrictive view of the scope of classical probability. A careful probabilistic analysis shows, on the contrary, that it is classical deterministic physical thinking which cannot cope with the Aspect experiment and therefore needs revision. The ulterior aim of the paper is to help mathematical statisticians and probabilists to find their way into the fascinating world of quantum probability (thus the same aim as that of Kümmerer and Maassen) by dismantling the bamboo curtain between ordinary and quantum probability which has been built up over the years as physicists and pure mathematicians have repeated to one another Feynman's famous dictum that quantum probability is a different kind of probability.
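The classical bound that the Aspect experiment violates can be illustrated with a small simulation. The sketch below (function name and parameter values are my own, not from the paper) draws, for each trial, predetermined outcomes for both detector settings on each side, which is exactly what a local hidden-variable model amounts to; the CHSH combination of correlations then cannot exceed 2 in absolute value, whereas quantum mechanics reaches 2√2.

```python
import random

def chsh_local_realism(n_trials=100_000, seed=1):
    """Simulate CHSH correlations under a local hidden-variable model.

    Each trial fixes deterministic +/-1 outcomes for both possible
    settings on each side before the settings are chosen; the CHSH
    statistic S = E(0,0) + E(0,1) + E(1,0) - E(1,1) then satisfies
    |S| <= 2 (here the outcomes are uniform, so S is near 0).
    """
    rng = random.Random(seed)
    sums = {(i, j): 0 for i in (0, 1) for j in (0, 1)}
    counts = {(i, j): 0 for i in (0, 1) for j in (0, 1)}
    for _ in range(n_trials):
        # hidden variable: predetermined outcomes for both settings per side
        a = (rng.choice((-1, 1)), rng.choice((-1, 1)))
        b = (rng.choice((-1, 1)), rng.choice((-1, 1)))
        i, j = rng.randrange(2), rng.randrange(2)  # randomised settings
        sums[(i, j)] += a[i] * b[j]
        counts[(i, j)] += 1
    E = {k: sums[k] / counts[k] for k in sums}
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

S = chsh_local_realism()
assert abs(S) <= 2  # the classical CHSH bound
```

Any simulated violation of the bound, such as the one Accardi claims in the paper discussed below, must therefore exploit something outside this model, e.g. letting outcomes depend on the other side's setting.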
Nonparametric estimation under censoring and passive registration
The classical random censorship model assumes that we follow an individual continuously up to the time of failure or censoring, so observing this time as well as the indicator of its type. Under passive registration we only get information on the state of the individual at random observation or registration times. In this paper we assume that these registration times are the times of events in an independent Poisson process stopped at failure or censoring; the time of failure is also observed if not censored. This problem turns up in historical demography, where the survival time of interest is the life-length, censoring is by emigration, and the observation times are the times of births of children and other life-events: church registers contain dates of births, marriages and deaths, but not emigrations. The model is shown to be related to the problem of estimating a density known to be monotone. This leads to an explicit description of the non-parametric maximum likelihood estimator of the survival function based on i.i.d. observations from this model, and to an analysis of its large sample properties.
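The connection to monotone density estimation mentioned above runs through isotonic regression: the Grenander-type maximum likelihood estimator of a decreasing density is a least-squares fit of a non-increasing sequence, computable by the pool-adjacent-violators algorithm (PAVA). A minimal sketch of PAVA (the interface is my own, not the paper's):

```python
def pava_decreasing(y, w=None):
    """Pool Adjacent Violators: weighted least-squares fit of a
    non-increasing sequence to y. The Grenander estimator of a
    monotone decreasing density is an isotonic fit of this kind
    applied to empirical histogram heights.
    """
    n = len(y)
    w = list(w) if w is not None else [1.0] * n
    level, weight, size = [], [], []   # merged blocks: value, weight, length
    for yi, wi in zip(y, w):
        level.append(yi); weight.append(wi); size.append(1)
        # merge adjacent blocks while the non-increasing order is violated
        while len(level) > 1 and level[-2] < level[-1]:
            tot = weight[-2] + weight[-1]
            level[-2] = (level[-2] * weight[-2] + level[-1] * weight[-1]) / tot
            weight[-2] = tot
            size[-2] += size[-1]
            level.pop(); weight.pop(); size.pop()
    out = []
    for lv, sz in zip(level, size):
        out.extend([lv] * sz)
    return out

# violating pair (1, 2) is pooled to its mean 1.5
assert pava_decreasing([3, 1, 2]) == [3, 1.5, 1.5]
```

Pooling replaces each violating run by its weighted mean, which is what the explicit description of the NPMLE in the paper reduces to in the simplest case.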
Asymptotics in quantum statistics
Observations or measurements taken of a quantum system (a small number of fundamental particles) are inherently random. If the state of the system depends on unknown parameters, then the distribution of the outcome depends on these parameters too, and statistical inference problems result. Often one has a choice of what measurement to take, corresponding to different experimental set-ups or settings of the measurement apparatus. This leads to a design problem: which measurement is best for a given statistical problem? This paper gives an introduction to this field in the simplest of settings, that of estimating the state of a spin-half particle given n independent copies of the particle. We show how in some cases asymptotically optimal measurements can be constructed. Other cases present interesting open problems, connected to the fact that for some models quantum Fisher information is in some sense non-additive. In physical terms, we have non-locality without entanglement.
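A toy version of the spin-half estimation problem: measuring sigma_z on a qubit whose Bloch vector has z-component theta gives outcomes ±1 with P(+1) = (1 + theta)/2, the sample mean is the MLE, and the classical Fisher information of that fixed measurement is 1/(1 - theta²) per copy. The sketch below (parameter values are illustrative, not from the paper) checks the estimator's variance against the resulting Cramér-Rao bound:

```python
import random

def estimate_bloch_z(theta, n, rng):
    """MLE of the Bloch z-component theta from n sigma_z measurements.

    Outcomes are +/-1 with P(+1) = (1 + theta) / 2, so the sample
    mean is unbiased for theta with variance (1 - theta**2) / n,
    which is exactly the Cramer-Rao bound for this measurement.
    """
    p_plus = (1 + theta) / 2
    outcomes = [1 if rng.random() < p_plus else -1 for _ in range(n)]
    return sum(outcomes) / n

rng = random.Random(0)
theta, n = 0.6, 10_000
estimates = [estimate_bloch_z(theta, n, rng) for _ in range(200)]
var = sum((e - theta) ** 2 for e in estimates) / len(estimates)
# var should be near the Cramer-Rao bound (1 - theta**2) / n = 6.4e-5
```

The design question in the paper is which such measurement (here: the axis) to choose, and whether measuring copies jointly can beat any sequence of separate measurements.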
Product integration
This is a brief survey of product-integration for biostatisticians.
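To give a flavour of the subject: the survival function is the product-integral of the cumulative hazard, and in discrete time this reduces to a finite product over hazard increments, which is the form taken by the Kaplan-Meier estimator. A minimal sketch (the function name is hypothetical, not from the survey):

```python
def product_integral(hazard_increments):
    """Discrete product-integral of a cumulative hazard.

    Given hazard increments dL_k, returns the survival curve
    S_k = prod_{j<=k} (1 - dL_j). With dL_k = d_k / n_k
    (deaths over number at risk), this is the Kaplan-Meier estimator.
    """
    s, curve = 1.0, []
    for dL in hazard_increments:
        s *= (1.0 - dL)
        curve.append(s)
    return curve

# two jumps, each killing half the remaining mass
assert product_integral([0.5, 0.5]) == [0.5, 0.25]
```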
Accardi contra Bell (cum mundi): the impossible coupling
An experimentally observed violation of Bell's inequality is supposed to show the failure of local realism to deal with quantum reality. However, finite statistics and the time-sequential nature of real experiments still allow a loophole for local realism. We show that the randomised design of the Aspect experiment closes this loophole. Our main tool is van de Geer's (2000) supermartingale version of the classical Bernstein (1924) inequality, guaranteeing, at the root-n scale, a not-heavier-than-Gaussian tail of the distribution of a sum of bounded supermartingale differences. The results are used to specify a protocol for a public bet between the author and L. Accardi, who in recent papers (Accardi and Regoli, 2000a,b, 2001) has claimed to have produced a suite of computer programmes, to be run on a network of computers, which will simulate a violation of Bell's inequalities. At a sample size of thirty thousand, both error probabilities are guaranteed smaller than one in a million, provided we adhere to the sequential randomized design.
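The kind of tail bound driving the error-probability claim is easy to evaluate numerically. The sketch below uses the classical Bernstein form exp(-t²/(2(v + mt/3))) for a sum of differences bounded by m with total variance at most v; the parameter values are illustrative only, not the paper's actual betting protocol:

```python
import math

def bernstein_tail(t, v, m):
    """Bernstein-type tail bound for a sum S of zero-mean bounded
    (super)martingale differences: with |X_i| <= m and total
    predictable variance <= v,
        P(S >= t) <= exp(-t**2 / (2 * (v + m * t / 3))).
    """
    return math.exp(-t ** 2 / (2 * (v + m * t / 3)))

# Illustrative: n = 30,000 trials, differences bounded by m = 2,
# variance proxy v = n, deviation t = 0.05 * n at the root-n scale.
n = 30_000
bound = bernstein_tail(t=0.05 * n, v=n, m=2)
assert bound < 1e-6  # comfortably below one in a million
```

The point of the supermartingale version is that the bound needs no independence across trials, only the sequential randomisation of the settings, which is exactly what the proposed protocol enforces.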