Testing Reactive Probabilistic Processes
We define a testing equivalence in the spirit of De Nicola and Hennessy for
reactive probabilistic processes, i.e. for processes where the internal
nondeterminism is due to random behaviour. We characterize the testing
equivalence in terms of ready-traces. From the characterization it follows that
the equivalence is insensitive to the exact moment at which an internal
probabilistic choice occurs, a property inherited from the original testing
equivalence of De Nicola and Hennessy. We also show decidability of the testing
equivalence for finite systems for which the complete model may not be known.
Optimal testing of equivalence hypotheses
In this paper we consider the construction of optimal tests of equivalence
hypotheses. Specifically, assume X_1,..., X_n are i.i.d. with distribution
P_{\theta}, with \theta \in R^k. Let g(\theta) be some real-valued parameter of
interest. The null hypothesis asserts g(\theta)\notin (a,b) versus the
alternative g(\theta)\in (a,b). For example, such hypotheses occur in
bioequivalence studies where one may wish to show two drugs, a brand name and a
proposed generic version, have the same therapeutic effect. Little optimal
theory is available for such testing problems, and it is the purpose of this
paper to provide an asymptotic optimality theory. Thus, we provide asymptotic
upper bounds for what is achievable, as well as asymptotically uniformly most
powerful test constructions that attain the bounds. The asymptotic theory is
based on Le Cam's notion of asymptotically normal experiments. In order to
approximate a general problem by a limiting normal problem, a UMP equivalence
test is obtained for testing the mean of a multivariate normal distribution.
Comment: Published at http://dx.doi.org/10.1214/009053605000000048 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
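As a quick restatement, using only the quantities named in the abstract, the equivalence-testing problem above can be written as follows; the bioequivalence reading of g(\theta) and the margin (-\delta, \delta) in the closing comment are illustrative, not taken from the paper:

    % Equivalence hypotheses for a real-valued parameter g(\theta),
    % with X_1, ..., X_n i.i.d. from P_\theta, \theta \in \mathbb{R}^k:
    \[
      H_0 : g(\theta) \notin (a, b)
      \qquad \text{versus} \qquad
      H_1 : g(\theta) \in (a, b).
    \]
    % In a bioequivalence study, g(\theta) might be the difference in mean
    % therapeutic effect between a brand-name drug and its proposed generic,
    % with (a, b) a pre-specified equivalence margin such as (-\delta, \delta).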
On the Complexity of Nondeterministically Testable Hypergraph Parameters
The paper proves the equivalence of the notions of nondeterministic and
deterministic parameter testing for uniform dense hypergraphs of arbitrary
order. It generalizes the result previously known only for the case of simple
graphs. By a similar method we establish also the equivalence between
nondeterministic and deterministic hypergraph property testing, answering the
open problem in the area. We introduce a new notion of a cut norm for
hypergraphs of higher order, and employ regularity techniques combined with the
ultralimit method.
Comment: 33 pages
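For readers unfamiliar with the norm being generalized here: in the simple-graph (graphon) case, which the paper extends to hypergraphs of higher order, the cut norm is typically defined as below. This is a background sketch, not reproduced from the paper:

    % Cut norm of a bounded measurable kernel W : [0,1]^2 -> R
    % (the simple-graph case; the paper introduces a higher-order analogue).
    \[
      \|W\|_{\square}
      = \sup_{S,\,T \subseteq [0,1]}
        \left| \int_{S \times T} W(x, y)\, \mathrm{d}x\, \mathrm{d}y \right|,
    \]
    % with the supremum taken over measurable subsets S, T of [0,1].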
Full abstraction for fair testing in CCS (expanded version)
In previous work with Pous, we defined a semantics for CCS which may both be
viewed as an innocent form of presheaf semantics and as a concurrent form of
game semantics. We define in this setting an analogue of fair testing
equivalence, which we prove fully abstract w.r.t. standard fair testing
equivalence. The proof relies on a new algebraic notion called playground,
which represents the `rule of the game'. From any playground, we derive two
languages equipped with labelled transition systems, as well as a strong,
functional bisimulation between them.
Comment: 80 pages
Testing Einstein's Weak Equivalence Principle With Gravitational Waves
A conservative constraint on the Einstein Weak Equivalence Principle (WEP)
can be obtained under the assumption that the observed time delay between
correlated particles from astronomical sources is dominated by the
gravitational fields through which they move. Current limits on the WEP are
mainly based on the observed time delays of photons with different energies. It
is highly desirable to develop more accurate tests that include the
gravitational wave (GW) sector. The detection of GWs by the advanced
LIGO/VIRGO systems will provide attractive candidates for constraining
the WEP, extending such tests to gravitational interactions, with
potentially higher accuracy. Considering the capabilities of the advanced
LIGO/VIRGO network and the source direction uncertainty, we show that the joint
detection of GWs and electromagnetic signals could probe the WEP to an accuracy
down to , which is one order of magnitude tighter than previous
limits, and seven orders of magnitude tighter than the multi-messenger (photons
and neutrinos) results from supernova 1987A.
Comment: Accepted for publication in PR
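For context, and as a standard background sketch rather than material from the paper: multi-messenger WEP tests of this kind usually bound the difference in the parametrized post-Newtonian (PPN) parameter \gamma between two messengers via the relative Shapiro time delay:

    % Relative Shapiro delay between two messengers with PPN parameters
    % gamma_1, gamma_2 traversing the same gravitational potential U(r).
    \[
      \Delta t_{\mathrm{gra}}
      = \frac{\gamma_1 - \gamma_2}{c^{3}}
        \int_{r_o}^{r_e} U(r)\, \mathrm{d}r ,
      \qquad
      |\gamma_1 - \gamma_2|
      \le \frac{c^{3}\, \Delta t_{\mathrm{obs}}}
               {\left| \int_{r_o}^{r_e} U(r)\, \mathrm{d}r \right|},
    \]
    % where r_e and r_o are the emission and observation points, and
    % \Delta t_obs is a conservative bound on the observed arrival-time
    % difference between, e.g., the GW and an associated electromagnetic signal.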
Testing for equivalence: an intersection-union permutation solution
The notion of testing for equivalence of two treatments is widely used in
clinical trials, pharmaceutical experiments, bioequivalence, and quality control.
It is essentially approached within the intersection-union (IU) principle.
According to this principle the null hypothesis is stated as the set of effects
lying outside a suitably established interval and the alternative as the set of
effects lying inside that interval. The solutions provided in the literature
are mostly based on likelihood techniques, which in turn are rather difficult
to handle, except for cases lying within the regular exponential family and the
invariance principle. The main goal of the present paper is to go beyond most of
the limitations of likelihood based methods, i.e. to work in a nonparametric
setting within the permutation frame. To obtain practical solutions, a new IU
permutation test is presented and discussed. A simple simulation study for
evaluating its main properties, and three application examples are also
presented.
Comment: 21 pages, 2 figures
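To illustrate the intersection-union idea in a two-sample setting (a minimal sketch, not the authors' procedure: the function name, the margin delta, and the shifted-permutation scheme below are illustrative assumptions):

    import numpy as np

    def iu_equivalence_test(x, y, delta, n_perm=10_000, alpha=0.05, seed=0):
        """TOST-style intersection-union equivalence test for two samples.

        Null: the mean difference lies outside (-delta, delta).
        Alternative: it lies inside. Each one-sided sub-hypothesis gets a
        permutation p-value; equivalence is claimed only if both are
        rejected, so the overall p-value is the maximum of the two.
        """
        rng = np.random.default_rng(seed)
        x, y = np.asarray(x, float), np.asarray(y, float)
        obs = x.mean() - y.mean()
        pooled = np.concatenate([x, y])
        n = len(x)

        # Permutation distribution of the mean difference (centred at 0
        # under exchangeability of the pooled observations).
        diffs = np.empty(n_perm)
        for i in range(n_perm):
            perm = rng.permutation(pooled)
            diffs[i] = perm[:n].mean() - perm[n:].mean()

        # One-sided p-values at the two boundaries of the interval:
        #   H0a: diff <= -delta  vs  H1a: diff > -delta
        #   H0b: diff >= +delta  vs  H1b: diff < +delta
        p_lower = np.mean(diffs - delta >= obs)
        p_upper = np.mean(diffs + delta <= obs)

        # Intersection-union principle: reject only if every partial test rejects.
        p_value = max(p_lower, p_upper)
        return p_value, p_value <= alpha

Calling iu_equivalence_test(x, y, delta=0.5), for example, returns the IU p-value together with the level-0.05 equivalence decision.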