
    MObile Technology for Improved Family Planning: update to randomised controlled trial protocol.

    BACKGROUND: This update outlines changes to the MObile Technology for Improved Family Planning study statistical analysis plan and plans for long-term follow-up. These changes result from obtaining additional funding and the decision to restrict the primary analysis to participants with available follow-up data. The changes were agreed prior to finalising the statistical analysis plan and sealing the dataset. METHODS/DESIGN: The primary analysis will now be restricted to subjects with data on the primary outcome at 4-month follow-up. The extreme-case scenario, where all those lost to follow-up are counted as non-adherent, will be used in a sensitivity analysis. In addition to the secondary outcomes outlined in the protocol, we will assess the effect of the intervention on long-acting contraception (implant, intra-uterine device and permanent methods). To assess the long-term effect of the intervention, we plan to conduct additional 12-month follow-up by telephone self-report for all the primary and secondary outcomes used at 4 months. All participants provided informed consent for this additional follow-up when recruited to the trial. Outcome measures and analysis at 12 months will be similar to those at the 4-month follow-up. The primary outcomes of the trial will be the use of an effective modern contraceptive method at 4 months and at 12 months post-abortion. Secondary outcomes will include long-acting contraception use, self-reported pregnancy, repeat abortion and contraception use over the 12-month post-abortion period. DISCUSSION: Restricting the primary analysis to those with follow-up data is the standard approach for trial analysis and will facilitate comparison with other trials of interventions designed to increase contraception uptake or use. Undertaking 12-month trial follow-up will allow us to evaluate the long-term effect of the intervention. TRIAL REGISTRATION: ClinicalTrials.gov NCT01823861
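    The two analyses described above differ only in the denominator used for the adherence rate. A minimal numeric sketch (all counts below are invented for illustration, not trial data):

```python
# Illustrative sketch of the complete-case primary analysis versus the
# extreme-case sensitivity analysis, in which every participant lost to
# follow-up is counted as non-adherent. All numbers are made up.

def adherence_rates(users, completers, randomised):
    """Return (complete-case rate, extreme-case rate) for one trial arm."""
    complete_case = users / completers    # restricted to those with 4-month data
    extreme_case = users / randomised     # lost-to-follow-up counted non-adherent
    return complete_case, extreme_case

# Hypothetical arm: 300 randomised, 240 with 4-month data, 150 method users.
cc, ec = adherence_rates(users=150, completers=240, randomised=300)
print(f"complete-case: {cc:.3f}, extreme-case: {ec:.3f}")
```

    The extreme-case rate is always the lower bound of the two, which is what makes it a conservative sensitivity check.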

    Random template banks and relaxed lattice coverings

    Template-based searches for gravitational waves are often limited by the computational cost associated with searching large parameter spaces. The study of efficient template banks, in the sense of using the smallest number of templates, is therefore of great practical interest. The "traditional" approach to template-bank construction requires every point in parameter space to be covered by at least one template, which rapidly becomes inefficient at higher dimensions. Here we study an alternative approach, where any point in parameter space is covered only with a given probability < 1. We find that by giving up complete coverage in this way, large reductions in the number of templates are possible, especially at higher dimensions. The prime examples studied here are "random template banks", in which templates are placed randomly with uniform probability over the parameter space. In addition to its obvious simplicity, this method turns out to be surprisingly efficient. We analyze the statistical properties of such random template banks, and compare their efficiency to traditional lattice coverings. We further study "relaxed" lattice coverings (using Z^n and A_n^* lattices), which similarly cover any signal location only with probability < 1. The relaxed A_n^* lattice is found to yield the most efficient template banks at low dimensions (n < 10), while random template banks increasingly outperform any other method at higher dimensions. Comment: 13 pages, 10 figures, submitted to PR
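    The random-template-bank idea can be sketched with a Monte Carlo toy model. The unit hypercube and Euclidean distance below are stand-ins for the paper's metric parameter space and mismatch, chosen purely for illustration:

```python
import math
import random

def covered_fraction(n_templates, radius, dim, n_trials=2000, seed=1):
    """Monte Carlo estimate of the coverage probability of a random
    template bank: templates are drawn uniformly in the unit hypercube,
    and a signal point counts as covered if some template lies within
    `radius` (Euclidean distance standing in for the mismatch metric)."""
    rng = random.Random(seed)
    templates = [[rng.random() for _ in range(dim)]
                 for _ in range(n_templates)]
    hits = 0
    for _ in range(n_trials):
        signal = [rng.random() for _ in range(dim)]
        if any(math.dist(signal, t) <= radius for t in templates):
            hits += 1
    return hits / n_trials

# Coverage grows with the number of templates but never reaches 1 exactly,
# which is the trade-off random banks exploit.
print(covered_fraction(200, 0.05, dim=2))
print(covered_fraction(1000, 0.05, dim=2))
```

    For an interior point, the miss probability is roughly (1 − V_ball/V_space)^N, so the template count needed for a fixed coverage probability grows far more slowly than for guaranteed complete coverage.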

    Generalised gravitational burst generation with Generative Adversarial Networks

    We introduce the use of conditional generative adversarial networks for generalised gravitational wave burst generation in the time domain. Generative adversarial networks are generative machine learning models that produce new data based on the features of the training data set. We condition the network on five classes of time-series signals that are often used to characterise gravitational wave burst searches: sine-Gaussian, ringdown, white noise burst, Gaussian pulse and binary black hole merger. We show that the model can replicate the features of these standard signal classes and, in addition, produce generalised burst signals through interpolation and class mixing. We also present an example application where a convolutional neural network classifier is trained on burst signals generated by our conditional generative adversarial network. We show that a convolutional neural network classifier trained only on the standard five signal classes has a poorer detection efficiency than a convolutional neural network classifier trained on a population of generalised burst signals drawn from the combined signal class space.
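    For reference, the first of the signal classes named above, the sine-Gaussian, is simply a sinusoid under a Gaussian envelope. A minimal sketch (parameter values are arbitrary, not taken from the paper):

```python
import math

def sine_gaussian(t, f0, tau, t0=0.0, amplitude=1.0):
    """Sine-Gaussian burst: a sinusoid at frequency f0 modulated by a
    Gaussian envelope of width tau, centred on time t0."""
    envelope = math.exp(-((t - t0) ** 2) / (2.0 * tau ** 2))
    return amplitude * envelope * math.sin(2.0 * math.pi * f0 * (t - t0))

# Sample one waveform on a 1024 Hz grid centred on the burst (arbitrary values).
dt = 1.0 / 1024.0
waveform = [sine_gaussian(i * dt - 0.5, f0=100.0, tau=0.05) for i in range(1024)]
print(max(abs(x) for x in waveform))
```

    The other analytic classes (ringdown, Gaussian pulse) differ only in the envelope: a one-sided exponential decay and a bare Gaussian, respectively.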

    The very faint X-ray binary IGR J17062-6143: a truncated disc, no pulsations, and a possible outflow

    We present a comprehensive X-ray study of the neutron star low-mass X-ray binary IGR J17062-6143, which has been accreting at low luminosities since its discovery in 2006. Analysing NuSTAR, XMM–Newton, and Swift observations, we investigate the very faint nature of this source through three approaches: modelling the relativistic reflection spectrum to constrain the accretion geometry, performing high-resolution X-ray spectroscopy to search for an outflow, and searching for the recently reported millisecond X-ray pulsations. We find a strongly truncated accretion disc at 77 (+22, −18) gravitational radii (∼164 km) assuming a high inclination, although a low inclination and a disc extending to the neutron star cannot be excluded. The high-resolution spectroscopy reveals evidence for oxygen-rich circumbinary material, possibly resulting from a blueshifted, collisionally ionized outflow. Finally, we do not detect any pulsations. We discuss these results in the broader context of possible explanations for the persistent faint nature of weakly accreting neutron stars. The results are consistent with both an ultra-compact binary orbit and a magnetically truncated accretion flow, although both cannot be unambiguously inferred. We also discuss the nature of the donor star and conclude that it is likely a CO or O–Ne–Mg white dwarf, consistent with recent multiwavelength modelling.
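    The quoted truncation radius can be checked to order of magnitude with the gravitational radius R_g = GM/c². Assuming a canonical 1.4 solar-mass neutron star (an assumption; the paper's adopted mass may differ slightly, which would account for the small offset from the quoted ∼164 km):

```python
# Order-of-magnitude check of the truncation radius quoted above,
# assuming a canonical 1.4 solar-mass neutron star.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m s^-1
M_sun = 1.989e30     # solar mass, kg

M = 1.4 * M_sun
r_g = G * M / c**2                  # one gravitational radius, in metres
r_trunc_km = 77 * r_g / 1000.0      # best-fit 77 R_g, in km
print(round(r_g), "m per R_g ->", round(r_trunc_km), "km at 77 R_g")
```

    This gives roughly 2.1 km per gravitational radius, so 77 R_g lands near 160 km, consistent in scale with the value in the abstract.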

    Advances in Small Particle Handling of Astromaterials in Preparation for OSIRIS-REx and Hayabusa2: Initial Developments

    The Astromaterials Acquisition and Curation office at NASA Johnson Space Center has established an Advanced Curation program that is tasked with developing procedures, technologies, and data sets necessary for the curation of future astromaterials collections as envisioned by NASA exploration goals. One particular objective of the Advanced Curation program is the development of new methods for the collection, storage, handling and characterization of small (less than 100 micrometer) particles. Astromaterials Curation currently maintains four small particle collections: cosmic dust that has been collected in Earth's stratosphere by ER-2 and WB-57 aircraft, Comet 81P/Wild 2 dust returned by NASA's Stardust spacecraft, interstellar dust that was returned by Stardust, and asteroid Itokawa particles that were returned by JAXA's Hayabusa spacecraft. NASA Curation is currently preparing for the anticipated return of two new astromaterials collections - asteroid Ryugu regolith to be collected by the Hayabusa2 spacecraft in 2021 (samples will be provided by JAXA as part of an international agreement), and asteroid Bennu regolith to be collected by the OSIRIS-REx spacecraft and returned in 2023. A substantial portion of these returned samples are expected to consist of small particle components, and mission requirements necessitate the development of new processing tools and methods in order to maximize the scientific yield from these valuable acquisitions. Here we describe initial progress towards the development of applicable sample handling methods for the successful curation of future small particle collections.

    Hierarchical Bayesian method for detecting continuous gravitational waves from an ensemble of pulsars

    When looking for gravitational wave signals from known pulsars, targets have so far been treated using independent searches. Here we use a hierarchical Bayesian framework to combine observations from individual sources for two purposes: to produce a detection statistic for the whole ensemble of sources within a search, and to estimate the hyperparameters of the underlying distribution of pulsar ellipticities. Both purposes require us to assume some functional form of the ellipticity distribution, and as a proof of principle we take two toy distributions. One is an exponential distribution, defined by its mean, and the other is a half-Gaussian distribution defined by its width. We show that by incorporating a common parameterized prior ellipticity distribution we can be more efficient at detecting gravitational waves from the whole ensemble of sources than trying to combine observations with a simpler non-hierarchical method. This may allow us to detect gravitational waves from the ensemble before there is a confident detection of any single source. We also apply this method using data for 92 pulsars from LIGO's sixth science run. No evidence for a signal was seen, but 90% upper limits of 3.9 × 10⁻⁸ and 4.7 × 10⁻⁸ were set on the mean of an assumed exponential ellipticity distribution and the width of an assumed half-Gaussian ellipticity distribution, respectively.
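    The hierarchical step described above can be sketched numerically: marginalise each per-pulsar likelihood over ellipticity under a common exponential prior with mean mu, multiply across pulsars, and scan mu. The per-pulsar likelihoods below are invented Gaussians, not LIGO results:

```python
import math

# Toy sketch of a hierarchical hyperparameter scan: combine per-pulsar
# ellipticity likelihoods under a shared exponential prior of mean mu.
# The per-pulsar likelihoods are invented for illustration only.

def hyper_likelihood(mu, likelihoods, eps_grid):
    """Log of the product over pulsars of the per-pulsar likelihood
    marginalised over ellipticity, weighted by an exponential prior
    with mean mu (trapezoid-free Riemann sum on eps_grid)."""
    d_eps = eps_grid[1] - eps_grid[0]
    log_l = 0.0
    for lik in likelihoods:
        integrand = [lik(e) * math.exp(-e / mu) / mu for e in eps_grid]
        log_l += math.log(sum(integrand) * d_eps)
    return log_l

def gaussian(mean, sigma):
    """Invented per-pulsar ellipticity likelihood (unnormalised Gaussian)."""
    return lambda e: math.exp(-0.5 * ((e - mean) / sigma) ** 2)

likelihoods = [gaussian(1.0, 0.5), gaussian(0.5, 0.4), gaussian(2.0, 0.8)]
eps_grid = [i * 0.01 for i in range(1, 1000)]   # ellipticity grid (arbitrary units)

# Scan the hyperparameter mu; the peak is the ensemble's preferred mean.
mus = [0.2 * k for k in range(1, 30)]
best_mu = max(mus, key=lambda m: hyper_likelihood(m, likelihoods, eps_grid))
print(best_mu)
```

    A detection statistic for the ensemble would then compare the evidence under this signal-plus-prior model against a noise-only model, which is the comparison the non-hierarchical approach cannot make jointly.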

    Avoiding selection bias in gravitational wave astronomy

    When searching for gravitational waves in the data from ground-based gravitational wave detectors, it is common to use a detection threshold to reduce the number of background events which are unlikely to be the signals of interest. However, imposing such a threshold will also discard some real signals with low amplitude, which can potentially bias any inferences drawn from the population of detected signals. We show how this selection bias is naturally avoided by using the full information from the search, considering both the selected data and our ignorance of the data that are thrown away, and considering all relevant signal and noise models. This approach produces unbiased estimates of parameters even in the presence of false alarms and incomplete data. It can be seen as an extension of previous methods into the high false alarm rate regime, where we are able to show that the quality of parameter inference can be optimised by lowering thresholds and increasing the false alarm rate. Comment: 13 pages, 2 figures
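    The bias described above is the standard truncation effect, which a toy model makes concrete: keep only samples above a threshold, and the naive sample mean overestimates the population mean, while a likelihood that models the selection recovers it. This is a generic illustration with synthetic numbers, not the paper's method:

```python
import math
import random

# Toy demonstration of selection bias: observe only events above a
# detection threshold, then estimate the population mean two ways.
# Synthetic data; standard truncated-Gaussian illustration.

random.seed(42)
TRUE_MEAN, SIGMA, THRESHOLD = 1.0, 1.0, 1.5

population = [random.gauss(TRUE_MEAN, SIGMA) for _ in range(5000)]
detected = [x for x in population if x > THRESHOLD]

naive_mean = sum(detected) / len(detected)   # biased above TRUE_MEAN

def truncated_loglike(mu):
    """Log-likelihood of the detected sample under a Gaussian truncated
    at THRESHOLD: each term is normalised by the detection probability
    P(x > THRESHOLD | mu), which is what removes the bias."""
    p_detect = 0.5 * math.erfc((THRESHOLD - mu) / (SIGMA * math.sqrt(2)))
    return sum(-0.5 * ((x - mu) / SIGMA) ** 2 - math.log(p_detect)
               for x in detected)

# Grid-search maximum-likelihood estimate accounting for the threshold.
grid = [i * 0.01 for i in range(-100, 300)]
mle_mean = max(grid, key=truncated_loglike)

print(f"naive: {naive_mean:.2f}, corrected: {mle_mean:.2f}")
```

    The corrected estimate converges on the true mean because the normalisation term penalises hypotheses that would have produced many undetected events, encoding exactly the "ignorance of the data that are thrown away".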