Meta-analyses and adaptive group sequential designs in the clinical development process
Sequential Implementation of Monte Carlo Tests with Uniformly Bounded Resampling Risk
This paper introduces an open-ended sequential algorithm for computing the
p-value of a test using Monte Carlo simulation. It guarantees that the
resampling risk, the probability of a different decision than the one based on
the theoretical p-value, is uniformly bounded by an arbitrarily small constant.
Previously suggested sequential or non-sequential algorithms, using a bounded
sample size, do not have this property. Although the algorithm is open-ended,
the expected number of steps is finite, except when the p-value is on the
threshold between rejecting and not rejecting. The algorithm is suitable as a
standard for implementing tests that require (re-)sampling. It can also be used
in other situations: to check whether a test is conservative, to iteratively
implement double bootstrap tests, and to determine the sample size required for
a certain power.
Comment: Major revision; 15 pages, 4 figures
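The open-ended stopping idea can be sketched in a few lines. The sketch below is not the paper's algorithm (which bounds the resampling risk uniformly via a more careful construction); it is a simplified, hypothetical variant that stops once a crude normal-approximation band around the running p-value estimate clears the test level. `test_stat_exceeds` stands in for one resampling draw:

```python
import math
import random

def sequential_mc_pvalue(test_stat_exceeds, alpha=0.05, max_steps=100_000):
    """Sequential Monte Carlo p-value with an early-stopping rule (sketch).

    test_stat_exceeds() performs one resampling draw and returns True if
    the resampled statistic is at least as extreme as the observed one.
    We stop once a crude ~3-sigma band around the running p-value
    estimate lies entirely below or above alpha.
    """
    hits = 0
    for n in range(1, max_steps + 1):
        hits += test_stat_exceeds()
        p_hat = hits / n
        # Variance floor keeps the band honest while hits == 0.
        var = max(p_hat * (1 - p_hat), alpha * (1 - alpha))
        half = 3.0 * math.sqrt(var / n)
        if p_hat + half < alpha:
            return "reject", p_hat, n
        if p_hat - half > alpha:
            return "accept", p_hat, n
    return "undecided", p_hat, max_steps

# Example: the true p-value is 0.01 < alpha, so the loop should stop early
# and reject, rather than simulating a fixed, worst-case number of draws.
random.seed(0)
decision, p_hat, n_used = sequential_mc_pvalue(lambda: random.random() < 0.01)
```

Unlike a fixed-sample-size implementation, the number of draws adapts to how far the p-value is from the threshold, which mirrors the paper's point that the expected number of steps is finite away from the boundary.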
Sequential methods for random-effects meta-analysis
Although meta-analyses are typically viewed as retrospective activities, they are increasingly being applied prospectively to provide up-to-date evidence on specific research questions. When meta-analyses are updated, account should be taken of the possibility of false-positive findings due to repeated significance tests. We discuss the use of sequential methods for meta-analyses that incorporate random effects to allow for heterogeneity across studies. We propose a method that uses an approximate semi-Bayes procedure to update evidence on the among-study variance, starting with an informative prior distribution that might be based on findings from previous meta-analyses. We compare our method with other approaches, including the traditional method of cumulative meta-analysis, in a simulation study and observe that it has Type I and Type II error rates close to the nominal levels. We illustrate the method using an example in the treatment of bleeding peptic ulcers. Copyright © 2010 John Wiley & Sons, Ltd.
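As a point of reference for the cumulative approach the paper compares against, a standard DerSimonian-Laird random-effects step can be written down directly. This is a minimal sketch of classical cumulative meta-analysis, not the semi-Bayes updating procedure the abstract proposes; the effect sizes and variances below are hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects meta-analysis with the DerSimonian-Laird tau^2 estimate."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    mu_fw = sum(wi * yi for wi, yi in zip(w, effects)) / sw  # fixed-effect mean
    q = sum(wi * (yi - mu_fw) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)  # among-study variance, floored at 0
    w_re = [1.0 / (v + tau2) for v in variances]
    mu = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return mu, se, tau2

# Cumulative meta-analysis: re-estimate the pooled effect as studies accrue.
effects = [0.30, 0.10, 0.25, 0.40]      # hypothetical log odds ratios
variances = [0.04, 0.05, 0.03, 0.06]    # hypothetical within-study variances
history = [dersimonian_laird(effects[:k], variances[:k])
           for k in range(2, len(effects) + 1)]
```

Repeated significance testing on `history` is exactly what inflates the Type I error rate the sequential methods are designed to control.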
Calibration of Low-Frequency, Wide-Field Radio Interferometers Using Delay/Delay-Rate Filtering
We present a filtering technique that can be applied to individual baselines
of wide-bandwidth, wide-field interferometric data to geometrically select
regions on the celestial sphere that contain primary calibration sources. The
technique relies on the Fourier transformation of wide-band frequency spectra
from a given baseline to obtain one-dimensional "delay images", and then the
transformation of a time-series of delay images to obtain two-dimensional
"delay/delay-rate images." Source selection is possible in these images given
appropriate combinations of baseline, bandwidth, integration time and source
location. Strong and persistent radio frequency interference (RFI) limits the
effectiveness of this source selection owing to the removal of data by RFI
excision algorithms. A one-dimensional, complex CLEAN algorithm has been
developed to compensate for RFI-excision effects. This approach allows CLEANed,
source-isolated data to be used to estimate bandpass and primary beam gain
functions. These techniques are applied to data from the Precision Array for
Probing the Epoch of Reionization (PAPER) as a demonstration of their value in
calibrating a new generation of low-frequency radio interferometers with wide
relative bandwidths and large fields of view.
Comment: 17 pages, 6 figures, 2009AJ....138..219
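The two Fourier steps are easy to demonstrate on synthetic data. A minimal numpy sketch (with hypothetical delay and fringe-rate values in FFT-bin units, not PAPER data):

```python
import numpy as np

# Synthetic single-baseline visibilities: a point source appears as a
# fringe whose phase is linear in frequency (delay) and in time (rate).
n_t, n_f = 64, 128                       # integrations x frequency channels
t = np.arange(n_t)[:, None]
f = np.arange(n_f)[None, :]
delay_bin, rate_bin = 10, 3              # hypothetical values, in FFT-bin units
vis = np.exp(2j * np.pi * (delay_bin * f / n_f + rate_bin * t / n_t))

# Step 1: Fourier transform each frequency spectrum into a 1-D "delay image".
delay_img = np.fft.fft(vis, axis=1)
# Step 2: transform the time series of delay images into a 2-D
# "delay/delay-rate image"; the source is compact in this plane.
ddr_img = np.fft.fft(delay_img, axis=0)
peak = np.unravel_index(np.argmax(np.abs(ddr_img)), ddr_img.shape)
```

Because the source collapses to a compact peak in the delay/delay-rate plane, a geometric region around that peak can be selected or filtered per baseline, which is the basis of the calibration technique described above.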
Comments on the tethered galaxy problem
In a recent paper, Davis et al. make the counterintuitive assertion that a
galaxy held `tethered' at a fixed distance from our own could emit blueshifted
light. Moreover, this effect may be derived from the simplest
Friedmann-Robertson-Walker spacetimes, including the (Ω_M, Ω_Λ) = (0.3, 0.7)
case, which is believed to be a good late-time model of our own universe.
In this paper we recover the previous authors' results in a more transparent
form. We show how their results rely on a choice of cosmological distance scale
and revise the calculations in terms of observable quantities which are
coordinate independent. By this method we see that, although such a tethering
would reduce the redshift of a receding object, it would not do so sufficiently
to cause the proposed blueshift. The effect is also demonstrated to be much
smaller than conjectured below the largest intergalactic scales. We also
discuss some important issues, raised by this scenario, relating to the
interpretation of redshift and distance in relativistic cosmology.
Comment: 6 pages, 3 figures; submitted to Am. J. Phys.
Displacement- and Timing-Noise Free Gravitational-Wave Detection
Motivated by a recently invented scheme of displacement-noise-free
gravitational-wave detection, we demonstrate the existence of
gravitational-wave detection schemes that are insusceptible to both
displacement and timing (laser) noises, and are thus realizable by
shot-noise-limited laser interferometry. This is possible for two reasons:
first, gravitational waves and displacement disturbances contribute to light
propagation times in different manners; second, for an N-detector system, the
number of signal channels is of the order O(N^2), while the total number of
timing- and displacement-noise channels is of the order O(N).
Comment: 4 pages, 3 figures; mistake corrected
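The counting argument behind the second point can be illustrated numerically. The exact channel counts in the paper differ (the factor of two in the noise count below is an assumption for the sketch); the essential feature is the O(N^2)-vs-O(N) scaling:

```python
def channel_counts(n):
    """Count measurement vs. noise channels in an n-detector array (sketch)."""
    signal = n * (n - 1) // 2   # one light-travel-time channel per detector pair
    noise = 2 * n               # assumed: one timing + one displacement channel each
    return signal, noise

# The O(N^2) signal count overtakes the O(N) noise count at a finite N,
# leaving linear combinations of the data that are free of both noises.
n = 2
while channel_counts(n)[0] <= channel_counts(n)[1]:
    n += 1
```

Once the number of independent measurements exceeds the number of noise degrees of freedom, linear combinations can be formed that cancel all timing and displacement noises while retaining gravitational-wave sensitivity.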
The Importance of Phase in Nulling Interferometry and a Three Telescope Closure-Phase Nulling Interferometer Concept
We discuss the theory of the Bracewell nulling interferometer and explicitly
demonstrate that the phase of the "white light" null fringe is the same as the
phase of the bright output from an ordinary stellar interferometer. As a
consequence, a "closure phase" exists for a nulling interferometer with three or
more telescopes. We calculate the phase offset as a function of baseline length
for an Earth-like planet around the Sun at 10 pc, with a contrast ratio of
at 10 m. The magnitude of the phase due to the planet is radians, assuming the star is at the phase center of the array.
Although this is small, this phase may be observable in a three-telescope
nulling interferometer that measures the closure phase. We propose a simple
non-redundant three-telescope nulling interferometer that can perform this
measurement. This configuration is expected to have improved characteristics
compared to other nulling interferometer concepts, such as a relaxation of
pathlength tolerances, through the use of the "ratio of wavelengths" technique,
a closure phase, and better discrimination between exozodiacal dust and planets.
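The property the concept relies on — that telescope-based phase errors cancel when baseline phases are summed around a closed triangle — can be checked directly. This is the generic closure-phase identity, not a simulation of a nulling beam combiner:

```python
import numpy as np

rng = np.random.default_rng(1)

# True source phases on the three baselines (1,2), (2,3), (3,1) ...
phi_true = rng.uniform(-np.pi, np.pi, size=3)
# ... and one unknown phase error per telescope (pathlength, atmosphere).
err = rng.uniform(-np.pi, np.pi, size=3)

# A measured baseline phase phi_ij is corrupted by err_i - err_j.
phi_12 = phi_true[0] + err[0] - err[1]
phi_23 = phi_true[1] + err[1] - err[2]
phi_31 = phi_true[2] + err[2] - err[0]

# Summing around the closed triangle cancels every telescope-based error,
# so the measured closure phase equals the sum of the true source phases.
closure_meas = phi_12 + phi_23 + phi_31
closure_true = phi_true.sum()
```

This cancellation is why a three-telescope nulling interferometer can measure the small planetary phase signal despite per-telescope pathlength errors far larger than the signal itself.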
Boundary crossing Random Walks, clinical trials and multinomial sequential estimation
A sufficient condition for the uniqueness of multinomial sequential unbiased
estimators is provided, generalizing a classical result for binomial samples.
Unbiased estimators are applied to infer the parameters of multidimensional or
multinomial random walks which are observed until they reach a boundary. An
application to clinical trials is presented.
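The classical binomial result being generalized is Haldane's: under inverse binomial sampling — observe Bernoulli trials until the success count hits a boundary r — the estimator (r-1)/(n-1) is exactly unbiased for the success probability, even though n is random. A quick sketch with hypothetical parameters:

```python
import random

def haldane_estimate(p_true, r, rng):
    """Observe Bernoulli(p) trials until the success count reaches the
    boundary r (inverse binomial sampling); Haldane's (r-1)/(n-1) is
    then an exactly unbiased estimator of p despite the random n."""
    n = successes = 0
    while successes < r:
        n += 1
        successes += rng.random() < p_true
    return (r - 1) / (n - 1)

rng = random.Random(7)
estimates = [haldane_estimate(0.3, r=5, rng=rng) for _ in range(20_000)]
mean = sum(estimates) / len(estimates)   # clusters around the true p = 0.3
```

The naive estimator successes/n is biased under this stopping rule; the boundary-crossing setting in the abstract extends the unbiasedness question to multinomial random walks.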
Optimal sequential fingerprinting: Wald vs. Tardos
We study sequential collusion-resistant fingerprinting, where the
fingerprinting code is generated in advance but accusations may be made between
rounds, and show that in this setting both the dynamic Tardos scheme and
schemes building upon Wald's sequential probability ratio test (SPRT) are
asymptotically optimal. We further compare these two approaches to sequential
fingerprinting, highlighting differences between the two schemes. Based on
these differences, we argue that Wald's scheme should in general be preferred
over the dynamic Tardos scheme, even though both schemes have their merits. As
a side result, we derive an optimal sequential group testing method for the
classical model, which can easily be generalized to different group testing
models.
Comment: 12 pages, 10 figures
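Wald's SPRT, the building block of the second family of schemes, accumulates a log-likelihood ratio between two simple hypotheses and stops at approximate error-controlled thresholds. A minimal Bernoulli sketch — the hypotheses and data here are hypothetical, not the fingerprinting score functions of the paper:

```python
import math
import random

def sprt_bernoulli(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs. H1: p = p1 on Bernoulli data.

    Accumulate the log-likelihood ratio and stop at Wald's approximate
    thresholds log((1-beta)/alpha) and log(beta/(1-alpha)), which keep
    the two error probabilities near alpha and beta.
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    n = 0
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", n

rng = random.Random(42)
data = (rng.random() < 0.5 for _ in range(10_000))   # true p = 0.5 (H0 holds)
decision, n_used = sprt_bernoulli(data, p0=0.5, p1=0.8)
```

The appeal in the sequential-fingerprinting setting is the same as here: the test typically stops after far fewer observations than any fixed-sample test with the same error guarantees.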