    Station-Keeping Requirements for Constellations of Free-Flying Collectors Used for Astronomical Imaging in Space

    The accuracy requirements on station-keeping for constellations of free-flying collectors coupled as (future) imaging arrays in space for astrophysics applications are examined. The basic imaging element of these arrays is the two-element interferometer. Accurate knowledge of two quantities is required: the projected baseline length, which is the distance between the two interferometer elements projected on the plane transverse to the line of sight to the target; and the optical path difference, which is the difference in the distances from that transverse plane to the beam combiner. "Rules of thumb" are determined for the typical accuracy required on these parameters. The requirement on the projected baseline length is a knowledge requirement and depends on the angular size of the targets of interest; it is generally at a level of half a meter for typical stellar targets, decreasing to perhaps a few centimeters only for the widest attainable fields of view. The requirement on the optical path difference is a control requirement and is much tighter, depending on the bandwidth of the signal; it is at a level of half a wavelength for narrow (few percent) signal bands, decreasing to approximately 0.2λ for the broadest bandwidths expected to be useful. Translation of these requirements into engineering requirements on station-keeping accuracy depends on the specific details of the collector constellation geometry. Several examples are provided to guide future application of the criteria presented here. Some implications for the design of such collector constellations and for the methods used to transform the information acquired into images are discussed.
    Comment: 13 pages, 6 figures; accepted 6/29/07 for the August 2007 issue of PASP
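
    As a concrete illustration of the two quantities defined above, the following NumPy sketch computes both from collector and beam-combiner positions. It is a minimal sketch under assumed conventions (positions in metres, line-of-sight unit vector pointing toward the target, external delay measured from a common plane transverse to that line of sight); the function name and argument layout are hypothetical, not from the paper.

        import numpy as np

        def baseline_geometry(r1, r2, r_comb, s_hat):
            """Projected baseline length and optical path difference for a
            two-element interferometer (a sketch; all positions in metres)."""
            s_hat = s_hat / np.linalg.norm(s_hat)    # unit vector toward target
            B = r2 - r1                              # baseline vector
            B_proj = B - np.dot(B, s_hat) * s_hat    # part transverse to the line of sight
            # Optical path from the transverse reference plane to the combiner
            # via each collector: external delay plus the internal path.
            d1 = np.linalg.norm(r_comb - r1) - np.dot(r1, s_hat)
            d2 = np.linalg.norm(r_comb - r2) - np.dot(r2, s_hat)
            return np.linalg.norm(B_proj), d2 - d1

    By the rules of thumb above, the first return value needs to be known to roughly half a meter, while the second must be actively controlled to a fraction of a wavelength.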

    Adaptive and non-adaptive group sequential tests

    Meta-analyses and adaptive group sequential designs in the clinical development process

    Sequential Implementation of Monte Carlo Tests with Uniformly Bounded Resampling Risk

    This paper introduces an open-ended sequential algorithm for computing the p-value of a test using Monte Carlo simulation. It guarantees that the resampling risk, the probability of reaching a different decision than the one based on the theoretical p-value, is uniformly bounded by an arbitrarily small constant. Previously suggested sequential or non-sequential algorithms that use a bounded sample size do not have this property. Although the algorithm is open-ended, the expected number of steps is finite, except when the p-value lies exactly on the threshold between rejecting and not rejecting. The algorithm is suitable as a standard for implementing tests that require (re-)sampling. It can also be used in other situations: to check whether a test is conservative, to implement double bootstrap tests iteratively, and to determine the sample size required for a certain power.
    Comment: Major revision; 15 pages, 4 figures
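
    The paper's exact boundaries are not reproduced here, but the following Python sketch captures the idea under simplifying assumptions: Monte Carlo draws accrue one at a time, a Clopper-Pearson interval for the true p-value is checked against the threshold at each step, and an error-spending sequence eps_n = eps/(n(n+1)) union-bounds the total resampling risk by eps. The resample argument (a hypothetical user-supplied function returning one simulated test statistic) and this boundary construction are illustrative, not the paper's scheme.

        import numpy as np
        from scipy.stats import beta

        def sequential_mc_pvalue(observed, resample, alpha=0.05, eps=1e-3, max_n=10**6):
            """Open-ended sequential Monte Carlo test (simplified sketch).

            Stops once a Clopper-Pearson interval for the true p-value lies
            entirely below or above alpha; spending eps_n = eps/(n*(n+1))
            bounds the total probability of a wrong decision by eps."""
            hits = 0
            for n in range(1, max_n + 1):
                hits += resample() >= observed      # one Monte Carlo draw
                e_n = eps / (n * (n + 1))
                lo = beta.ppf(e_n / 2, hits, n - hits + 1) if hits > 0 else 0.0
                hi = beta.ppf(1 - e_n / 2, hits + 1, n - hits) if hits < n else 1.0
                if hi < alpha:
                    return "reject", hits / n, n    # p-value confidently below alpha
                if lo > alpha:
                    return "accept", hits / n, n    # p-value confidently above alpha
            return "undecided", hits / n, max_n     # truncated: p-value near alpha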

    Sequential methods for random-effects meta-analysis

    Although meta-analyses are typically viewed as retrospective activities, they are increasingly being applied prospectively to provide up-to-date evidence on specific research questions. When meta-analyses are updated, account should be taken of the possibility of false-positive findings due to repeated significance tests. We discuss the use of sequential methods for meta-analyses that incorporate random effects to allow for heterogeneity across studies. We propose a method that uses an approximate semi-Bayes procedure to update evidence on the among-study variance, starting with an informative prior distribution that might be based on findings from previous meta-analyses. We compare our method with other approaches, including the traditional method of cumulative meta-analysis, in a simulation study and observe that it has Type I and Type II error rates close to the nominal level. We illustrate the method using an example in the treatment of bleeding peptic ulcers.
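
    For readers unfamiliar with the repeated-analysis setting, the sketch below runs a cumulative DerSimonian-Laird random-effects meta-analysis, recomputing the pooled effect and the among-study variance tau^2 each time a new study accrues. It illustrates the sequence of analyses that motivates sequential monitoring; the paper's semi-Bayes updating of the among-study variance is not reproduced here.

        import numpy as np

        def cumulative_re_meta(y, v):
            """Cumulative random-effects meta-analysis (DerSimonian-Laird).

            y, v: per-study effect estimates and within-study variances, in
            accrual order. Returns (pooled estimate, its variance, tau^2)
            after each update from the second study onwards."""
            out = []
            for k in range(2, len(y) + 1):
                yk, vk = np.asarray(y[:k]), np.asarray(v[:k])
                w = 1.0 / vk                              # fixed-effect weights
                y_fe = np.sum(w * yk) / np.sum(w)
                Q = np.sum(w * (yk - y_fe) ** 2)          # Cochran's Q
                c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
                tau2 = max(0.0, (Q - (k - 1)) / c)        # DL among-study variance
                w_re = 1.0 / (vk + tau2)                  # random-effects weights
                mu = np.sum(w_re * yk) / np.sum(w_re)
                out.append((mu, 1.0 / np.sum(w_re), tau2))
            return out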

    Calibration of Low-Frequency, Wide-Field Radio Interferometers Using Delay/Delay-Rate Filtering

    We present a filtering technique that can be applied to individual baselines of wide-bandwidth, wide-field interferometric data to geometrically select regions on the celestial sphere that contain primary calibration sources. The technique relies on the Fourier transformation of wide-band frequency spectra from a given baseline to obtain one-dimensional "delay images", and then on the transformation of a time series of delay images to obtain two-dimensional "delay/delay-rate images". Source selection is possible in these images given appropriate combinations of baseline, bandwidth, integration time, and source location. Strong and persistent radio frequency interference (RFI) limits the effectiveness of this source selection owing to the removal of data by RFI-excision algorithms. A one-dimensional, complex CLEAN algorithm has been developed to compensate for the effects of RFI excision. This approach allows CLEANed, source-isolated data to be used to derive bandpass and primary beam gain functions. These techniques are applied to data from the Precision Array for Probing the Epoch of Reionization (PAPER) as a demonstration of their value in calibrating a new generation of low-frequency radio interferometers with wide relative bandwidths and large fields of view.
    Comment: 17 pages, 6 figures, 2009AJ....138..219
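
    In array terms, the two transforms are nested FFTs over one baseline's dynamic spectrum. A minimal NumPy sketch, with an assumed window function and with the RFI-excision and CLEAN steps described above omitted:

        import numpy as np

        def delay_delay_rate(vis, window=np.blackman):
            """Form a delay/delay-rate image from one baseline's visibilities.

            vis: complex array of shape (n_times, n_freqs), i.e. a time series
            of wide-band frequency spectra. An FFT over frequency yields 1-D
            "delay images"; a second FFT over time yields the 2-D
            delay/delay-rate image in which celestial sources occupy compact,
            geometrically predictable loci."""
            nt, nf = vis.shape
            w = np.outer(window(nt), window(nf))  # taper to suppress sidelobes
            dly = np.fft.fftshift(np.fft.fft(vis * w, axis=1), axes=1)  # freq -> delay
            return np.fft.fftshift(np.fft.fft(dly, axis=0), axes=0)     # time -> delay rate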

    Comments on the tethered galaxy problem

    In a recent paper, Davis et al. make the counterintuitive assertion that a galaxy held 'tethered' at a fixed distance from our own could emit blueshifted light. Moreover, this effect may be derived from the simplest Friedmann-Robertson-Walker spacetimes, including the (Ω_M, Ω_Λ) = (0.3, 0.7) case that is believed to be a good late-time model of our own universe. In this paper we recover the previous authors' results in a more transparent form. We show how their results rely on a choice of cosmological distance scale, and we revise the calculations in terms of observable quantities, which are coordinate independent. By this method we see that, although such a tethering would reduce the redshift of a receding object, it would not do so sufficiently to cause the proposed blueshift. The effect is also demonstrated to be much smaller than conjectured below the largest intergalactic scales. We also discuss some important issues, raised by this scenario, relating to the interpretation of redshift and distance in relativistic cosmology.
    Comment: 6 pages, 3 figures; submitted to Am. J. Phys.
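
    For orientation, the argument turns on the standard decomposition of an observed redshift into a cosmological factor and a special-relativistic Doppler factor from the peculiar velocity (a textbook relation, not a formula taken from the paper):

        1 + z_obs = (1 + z_cos)(1 + z_pec),  with  1 + z_pec = \sqrt{(1 + v_pec/c)/(1 - v_pec/c)}.

    A tethered galaxy has v_pec < 0 (it does not recede with the Hubble flow), so the Doppler factor is a blueshift that offsets part of the accumulated cosmological redshift; whether the product can drop below unity is precisely where the choice of distance scale enters.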

    Displacement- and Timing-Noise Free Gravitational-Wave Detection

    Motivated by a recently invented scheme of displacement-noise-free gravitational-wave detection, we demonstrate the existence of gravitational-wave detection schemes that are insusceptible to both displacement and timing (laser) noises, and are thus realizable by shot-noise-limited laser interferometry. This is possible for two reasons: first, gravitational waves and displacement disturbances contribute to light propagation times in different ways; second, for an N-detector system, the number of signal channels is of order O(N^2), while the total number of timing- and displacement-noise channels is of order O(N).
    Comment: 4 pages, 3 figures; mistake corrected
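
    The counting argument can be tallied explicitly. In the sketch below, the assumption of three displacement components and one clock error per device is made purely for illustration and is not a figure from the paper; the point is only that O(N^2) signal channels eventually outnumber O(N) noise channels, leaving noise-free combinations.

        # Tally signal vs. noise channels for an N-device system (a sketch).
        for N in range(2, 9):
            signals = N * (N - 1)   # one-way light-travel measurements: O(N^2)
            noises = 4 * N          # assumed 3 displacement dof + 1 clock per device: O(N)
            free = max(0, signals - noises)   # combinations that can cancel all noise
            print(f"N={N}: signals={signals}, noises={noises}, noise-free={free}")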

    The Importance of Phase in Nulling Interferometry and a Three Telescope Closure-Phase Nulling Interferometer Concept

    We discuss the theory of the Bracewell nulling interferometer and explicitly demonstrate that the phase of the "white light" null fringe is the same as the phase of the bright output from an ordinary stellar interferometer. As a consequence, a "closure phase" exists for a nulling interferometer with three or more telescopes. We calculate the phase offset as a function of baseline length for an Earth-like planet around the Sun at 10 pc, with a contrast ratio of 10^{-6} at 10 μm. The magnitude of the phase due to the planet is ~10^{-6} radians, assuming the star is at the phase center of the array. Although this is small, this phase may be observable in a three-telescope nulling interferometer that measures the closure phase. We propose a simple non-redundant three-telescope nulling interferometer that can perform this measurement. This configuration is expected to have improved characteristics compared to other nulling interferometer concepts, such as a relaxation of pathlength tolerances, through the use of the "ratio of wavelengths" technique, a closure phase, and better discrimination between exozodiacal dust and planets.
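
    The quoted magnitude is easy to reproduce. In the one-dimensional sketch below, the star sits at the phase centre, the planet contributes a fringe at contrast 10^{-6}, and the closure phase is the argument of the triple product of the three baseline visibilities; the baseline lengths are illustrative assumptions, not the proposed configuration.

        import numpy as np

        def closure_phase(b12, b23, theta, contrast, lam):
            """Closure phase (radians) of a star + faint companion, 1-D sketch.

            b12, b23: baselines in metres (b31 = -(b12 + b23) closes the
            triangle); theta: companion offset from the star in radians;
            contrast: companion/star flux ratio; lam: wavelength in metres."""
            def vis(b):  # star at phase centre plus companion fringe
                return 1.0 + contrast * np.exp(-2j * np.pi * b * theta / lam)
            return np.angle(vis(b12) * vis(b23) * vis(-(b12 + b23)))

        theta = 0.1 / 206265.0  # 1 AU at 10 pc = 0.1 arcsec, in radians
        print(closure_phase(50.0, 75.0, theta, 1e-6, 10e-6))  # of order 1e-6 rad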

    Boundary crossing Random Walks, clinical trials and multinomial sequential estimation

    A sufficient condition for the uniqueness of multinomial sequential unbiased estimators is provided, generalizing a classical result for binomial samples. Unbiased estimators are applied to infer the parameters of multidimensional or multinomial random walks that are observed until they reach a boundary. An application to clinical trials is presented.
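
    The classical binomial case that this result generalizes is easy to demonstrate. The sketch below uses inverse binomial sampling: Bernoulli trials continue until the s-th success, i.e. until a one-dimensional random walk crosses a boundary, and Haldane's estimator (s - 1)/(N - 1) of the success probability is unbiased under this stopping rule. The parameter values are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def inverse_binomial_estimate(p, s=10):
            """Sample Bernoulli(p) trials until the s-th success and return
            Haldane's unbiased estimator (s - 1)/(N - 1), where N is the
            number of trials consumed at the boundary crossing."""
            n = successes = 0
            while successes < s:
                n += 1
                successes += rng.random() < p
            return (s - 1) / (n - 1)

        estimates = [inverse_binomial_estimate(0.3) for _ in range(20000)]
        print(np.mean(estimates))   # close to 0.3: the estimator is unbiased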