
    The optimal search for an astrophysical gravitational-wave background

    Roughly every 2-10 minutes, a pair of stellar-mass black holes merges somewhere in the Universe. A small fraction of these mergers are detected as individually resolvable gravitational-wave events by advanced detectors such as LIGO and Virgo. The rest contribute to a stochastic background. We derive the statistically optimal search strategy for a background of unresolved binaries. Our method applies Bayesian parameter estimation to all available data. Using Monte Carlo simulations, we demonstrate that the search is both "safe" and effective: it is not fooled by instrumental artefacts such as glitches, and it recovers simulated stochastic signals without bias. Given realistic assumptions, we estimate that the search can detect the binary black hole background with about one day of design-sensitivity data, versus ≈ 40 months using the traditional cross-correlation search. The framework independently constrains the merger rate and the black hole mass distribution, breaking a degeneracy present in the cross-correlation approach. It also provides a unified framework for population studies of compact binaries, cast in terms of hyper-parameter estimation. We discuss a number of extensions and generalizations, including application to other sources (such as binary neutron stars and continuous-wave sources), simultaneous estimation of a continuous Gaussian background, and applications to pulsar timing.
    Comment: 16 pages, 9 figures
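The core idea of such a search, combining per-segment Bayesian evidence under a duty-cycle mixture so that segments containing unresolved signals still contribute, can be sketched with a toy model. The white Gaussian noise likelihood, the fixed known template, and the duty-cycle parameter `xi` below are illustrative assumptions, not the paper's actual waveform model:

```python
import numpy as np

def segment_log_likelihood_ratio(data, template, sigma):
    """Toy per-segment log Bayes factor: white Gaussian noise, known template."""
    return (np.dot(data, template) - 0.5 * np.dot(template, template)) / sigma**2

def total_log_likelihood(segments, template, sigma, xi):
    """Duty-cycle mixture: each segment contains a signal with probability xi.

    Per segment, relative to the noise-only likelihood:
      log( xi * L_signal + (1 - xi) * L_noise ) - log L_noise
        = logaddexp( log(xi) + log_BF, log(1 - xi) )
    """
    total = 0.0
    for d in segments:
        log_bf = segment_log_likelihood_ratio(d, template, sigma)
        total += np.logaddexp(np.log(xi) + log_bf, np.log1p(-xi))
    return total

# Simulate 100 segments in which 20% contain the (toy) signal.
rng = np.random.default_rng(1)
template = np.ones(32)
segments = [rng.normal(0.0, 1.0, 32) + (template if i % 5 == 0 else 0.0)
            for i in range(100)]

# The mixture likelihood prefers the true duty cycle over a near-zero one.
ll_true = total_log_likelihood(segments, template, sigma=1.0, xi=0.2)
ll_tiny = total_log_likelihood(segments, template, sigma=1.0, xi=1e-4)
```

Because the noise-only hypothesis is recovered continuously as `xi → 0`, the same machinery yields an upper limit when no signal is present.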

    A Mock Data and Science Challenge for Detecting an Astrophysical Stochastic Gravitational-Wave Background with Advanced LIGO and Advanced Virgo

    The purpose of this mock data and science challenge is to prepare the data analysis and science interpretation for the second generation of gravitational-wave experiments, Advanced LIGO and Advanced Virgo, in the search for a stochastic gravitational-wave background signal of astrophysical origin. We present a series of signal and data challenges of increasing complexity, whose aim is to test the ability of current data analysis pipelines to detect an astrophysically produced gravitational-wave background, to test parameter estimation methods, and to interpret the results. We describe the production of these mock data sets, which include a realistic observing-scenario data set accounting for the different sensitivities of the advanced detectors as they are continuously upgraded toward their design sensitivity. After analysing these with the standard isotropic cross-correlation pipeline, we find that we are able to recover the injected gravitational-wave background energy density to within 2σ for all of the data sets, and we present the results from the parameter estimation. The results of this mock data and science challenge show that Advanced LIGO and Virgo will be ready and able to make a detection of an astrophysical gravitational-wave background within a few years of operation of the advanced detectors, given a high enough rate of compact binary coalescences.
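The isotropic cross-correlation statistic exploits the fact that detector noise is uncorrelated between sites while a stochastic background is common to both. A minimal time-domain sketch, assuming white noise and unit signal power (amplitudes are illustrative, not calibrated to real LIGO/Virgo sensitivities):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
# Toy model: a common stochastic background plus independent detector noise.
h = rng.normal(0.0, 1.0, n)          # common GW background, power = 1
s1 = h + rng.normal(0.0, 5.0, n)     # detector 1: signal + its own noise
s2 = h + rng.normal(0.0, 5.0, n)     # detector 2: independent noise

# Noise is uncorrelated between detectors, so the mean of s1*s2 is an
# unbiased estimator of the common signal power even when the noise in
# each detector is far louder than the signal.
Y = np.mean(s1 * s2)
sigma_Y = np.std(s1 * s2) / np.sqrt(n)
```

The real pipeline works in the frequency domain with an optimal filter weighted by the detector overlap reduction function and noise spectra, but the estimator's logic (correlated signal survives averaging, uncorrelated noise does not) is the same.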

    Gravitational waves: search results, data analysis and parameter estimation

    The Amaldi 10 Parallel Session C2 on gravitational-wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate-mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data of the advanced detector network, including methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

    Stochastic superspace phenomenology at the Large Hadron Collider

    We analyse restrictions on the stochastic superspace parameter space arising from 1 fb⁻¹ of LHC data, together with bounds on sparticle masses, the cold dark matter relic density, and the branching ratio of the process B_s → μ⁺μ⁻. A region of parameter space consistent with these limits is found where the stochasticity parameter ξ takes values in the range −2200 GeV < ξ < −900 GeV, provided the cutoff scale is O(10¹⁸) GeV.
    Comment: 9 pages, 13 figures

    The Block Point Process Model for Continuous-Time Event-Based Dynamic Networks

    We consider the problem of analyzing timestamped relational events between a set of entities, such as messages between users of an online social network. Such data are often analyzed using static or discrete-time network models, which discard a significant amount of information by aggregating events over time to form network snapshots. In this paper, we introduce a block point process model (BPPM) for continuous-time event-based dynamic networks. The BPPM is inspired by the well-known stochastic block model (SBM) for static networks. We show that networks generated by the BPPM follow an SBM in the limit of a growing number of nodes. We use this property to develop principled and efficient local search and variational inference procedures initialized by regularized spectral clustering. We fit BPPMs with exponential Hawkes processes to analyze several real network data sets, including a Facebook wall post network with over 3,500 nodes and 130,000 events.
    Comment: To appear at The Web Conference 201
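The generative idea, event streams on node pairs whose rates depend only on the pair's block memberships, can be sketched with a simplified simulator. For brevity this toy uses homogeneous Poisson processes per pair rather than the exponential Hawkes processes the paper fits; the block assignments and rate matrix are made-up values:

```python
import numpy as np

def simulate_bppm(block_of, rates, T, rng):
    """Toy block point process: events on each ordered node pair (u, v) follow
    a homogeneous Poisson process with rate rates[block(u), block(v)].
    Returns a time-sorted list of (time, sender, receiver) events."""
    events = []
    n = len(block_of)
    for u in range(n):
        for v in range(n):
            if u == v:
                continue
            lam = rates[block_of[u], block_of[v]]
            k = rng.poisson(lam * T)                 # event count on this pair
            times = np.sort(rng.uniform(0.0, T, k))  # uniform given the count
            events.extend((t, u, v) for t in times)
    events.sort()
    return events

rng = np.random.default_rng(0)
block_of = [0, 0, 1, 1]                  # two blocks of two nodes each
rates = np.array([[2.0, 0.1],            # within-block pairs are much more
                  [0.1, 2.0]])           # active than between-block pairs
events = simulate_bppm(block_of, rates, T=10.0, rng=rng)
```

Aggregating the event counts per pair recovers an ordinary SBM-like weighted network, which is the intuition behind the paper's limit result.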

    Credit Assignment in Adaptive Evolutionary Algorithms

    In this paper, a new method for assigning credit to search operators is presented. Starting with the principle of optimizing search bias, search operators are selected based on their ability to create solutions that are historically linked to future generations. Using a novel framework for defining performance measurements, distributing credit for performance, and statistically interpreting this credit, a new adaptive method is developed and shown to outperform a variety of adaptive and non-adaptive competitors.
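A minimal sketch of credit-based adaptive operator selection, using simple probability matching with decayed credit; this is a generic illustration of the idea, not the paper's specific credit-assignment scheme:

```python
import random

class CreditBasedOperatorSelector:
    """Operators earn credit when they produce improvements; selection
    probabilities track accumulated credit (probability matching)."""

    def __init__(self, operators, decay=0.9, floor=0.05):
        self.operators = list(operators)
        self.credit = {op: 1.0 for op in self.operators}
        self.decay = decay   # forget old credit so adaptation stays responsive
        self.floor = floor   # minimum selection probability per operator

    def probabilities(self):
        total = sum(self.credit.values())
        k = len(self.operators)
        # Mix credit-proportional selection with a uniform floor so no
        # operator is ever starved of trials; probabilities sum to 1.
        return {op: self.floor + (1 - k * self.floor) * c / total
                for op, c in self.credit.items()}

    def select(self, rng):
        probs = self.probabilities()
        return rng.choices(self.operators,
                           weights=[probs[o] for o in self.operators])[0]

    def reward(self, op, improvement):
        for o in self.operators:
            self.credit[o] *= self.decay
        self.credit[op] += max(0.0, improvement)

rng = random.Random(7)
sel = CreditBasedOperatorSelector(["mutation", "crossover"])
for _ in range(20):
    sel.reward("crossover", 1.0)  # crossover keeps producing improvements
chosen = sel.select(rng)          # now strongly biased toward crossover
probs = sel.probabilities()
```

The floor parameter plays the usual exploration role: without it, an operator whose credit decays to near zero would never get a chance to prove itself useful again later in the run.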

    Observability of Dark Matter Substructure with Pulsar Timing Correlations

    Dark matter substructure on small scales is currently weakly constrained, and its study may shed light on the nature of the dark matter. In this work we study the gravitational effects of dark matter substructure on measured pulsar phases in pulsar timing arrays (PTAs). Due to the stability of pulse phases observed over several years, dark matter substructure around the Earth-pulsar system can imprint discernible signatures in gravitational Doppler and Shapiro delays. We compute pulsar phase correlations induced by general dark matter substructure and project constraints for several models, such as monochromatic primordial black holes (PBHs) and Cold Dark Matter (CDM)-like NFW subhalos. This work extends our previous analysis, which focused on static or single transiting events, to a stochastic analysis of multiple transiting events. We find that stochastic correlations, in a PTA similar to the Square Kilometer Array (SKA), are uniquely powerful for constraining subhalos as light as ∼10⁻¹³ M⊙, with concentrations as low as those predicted by standard CDM.
    Comment: 45 pages, 12 figures
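The Doppler-delay mechanism, in which a transiting subhalo accelerates the pulsar along the line of sight and the accumulated velocity perturbation integrates twice into a timing residual, can be illustrated numerically. The mass, impact parameter, velocity, and simple perpendicular-transit geometry below are arbitrary toy values, not the paper's benchmarks:

```python
import numpy as np

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
M = 1e20        # toy point-mass subhalo, kg (~5e-11 solar masses)
b = 1e13        # impact parameter, m
v = 2e5         # transit velocity, m/s (of order galactic virial velocity)

# Time grid spanning ~30 years around closest approach; the line of sight
# is taken perpendicular to the transit velocity for simplicity.
t = np.linspace(-5e8, 5e8, 20001)
x = v * t
r = np.sqrt(b**2 + x**2)

# Line-of-sight acceleration of the pulsar toward the point mass.
a_los = G * M * b / r**3

# First integration: velocity perturbation; second: timing residual (s).
dt_step = t[1] - t[0]
dv = np.cumsum(a_los) * dt_step
residual = np.cumsum(dv / c) * dt_step
```

Even for this tiny mass the residual builds up secularly over the observation span, which is why long-baseline phase stability, rather than instantaneous timing precision alone, drives the PTA sensitivity to light subhalos.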