
    Enhancing the significance of gravitational wave bursts through signal classification

    The quest to observe gravitational waves challenges our ability to discriminate signals from detector noise. This issue is especially relevant for searches for transient gravitational waves that adopt a robust, eyes-wide-open approach, the so-called all-sky burst searches. Here we show how signal classification methods inspired by broad astrophysical characteristics can be implemented in all-sky burst searches while preserving their generality. In our case study, we apply a multivariate analysis based on artificial neural networks to classify waves emitted in compact binary coalescences. We enhance by orders of magnitude the significance of signals belonging to this broad astrophysical class against the noise background. Alternatively, at a given level of mis-classification of noise events, we can detect about 1/4 more of the total signal population. We also show that a more general strategy of signal classification can be performed, by testing the ability of artificial neural networks to discriminate between different signal classes. The possible impact on future observations by the LIGO-Virgo network of detectors is discussed by analysing recoloured noise from previous LIGO-Virgo data with coherent WaveBurst, one of the flagship pipelines dedicated to all-sky searches for transient gravitational waves.
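    The classification step described above can be pictured as a small feed-forward network that re-ranks triggers by a signal-versus-noise score. The sketch below is only illustrative: the feature set, toy data, and scikit-learn classifier are assumptions, not the actual coherent WaveBurst trigger attributes or network architecture used in the paper.

```python
# Hedged sketch: re-rank burst triggers with a small neural network classifier.
# Feature names and data are illustrative placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy feature vectors: [central frequency, duration, bandwidth, network SNR]
n = 2000
signals = rng.normal(loc=[150.0, 0.5, 80.0, 12.0], scale=[40.0, 0.2, 20.0, 3.0], size=(n, 4))
noise   = rng.normal(loc=[300.0, 0.1, 150.0, 6.0], scale=[80.0, 0.05, 40.0, 2.0], size=(n, 4))
X = np.vstack([signals, noise])
y = np.hstack([np.ones(n), np.zeros(n)])          # 1 = CBC-like signal, 0 = noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

# Re-rank triggers by the network's signal probability; thresholding this
# score plays the role of the classification cut discussed in the abstract.
p_signal = clf.predict_proba(X_test)[:, 1]
print("mean signal score (true signals):", p_signal[y_test == 1].mean())
print("mean signal score (noise):       ", p_signal[y_test == 0].mean())
```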

    Searching for periodic sources with LIGO. II: Hierarchical searches

    The detection of quasi-periodic sources of gravitational waves requires the accumulation of signal-to-noise over long observation times. If not removed, Earth-motion-induced Doppler modulations and intrinsic variations of the gravitational-wave frequency make the signals impossible to detect. These effects can be corrected (removed) using a parameterized model for the frequency evolution. We compute the number of independent corrections $N_p(\Delta T, N)$ required for incoherent search strategies which use stacked power spectra: a demodulated time series is divided into $N$ segments of length $\Delta T$, each segment is FFTed, the power is computed, and the $N$ spectra are summed. We estimate that the sensitivity of an all-sky search that uses incoherent stacks is a factor of 2-4 better than would be achieved using coherent Fourier transforms; incoherent methods are computationally efficient at exploring large parameter spaces. A two-stage hierarchical search yields another 20-60% improvement in sensitivity in all-sky searches for old (>= 1000 yr) slow pulsars and for young (<= 40 yr) fast (<= 1000 Hz) pulsars. Assuming $10^{12}$ flops of effective computing power for data analysis, enhanced LIGO interferometers should be sensitive to: (i) Galactic core pulsars with gravitational ellipticities of $\epsilon \gtrsim 5 \times 10^{-6}$ at 200 Hz, (ii) gravitational waves emitted by the unstable r-modes of newborn neutron stars out to distances of ~8 Mpc, and (iii) neutron stars in LMXBs with X-ray fluxes which exceed $2 \times 10^{-8}\,\mathrm{erg/(cm^2\,s)}$. Moreover, gravitational waves from the neutron star in Sco X-1 should be detectable if the interferometer is operated in a signal-recycled, narrow-band configuration. Comment: 22 pages, 13 figures.
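    The stacked-power-spectra statistic quoted above is simple to prototype. The following sketch, with made-up signal and sampling parameters, shows only the incoherent combination step: split a demodulated series into $N$ segments of length $\Delta T$, FFT each, and sum the power spectra.

```python
# Minimal sketch of the incoherent stacked-power-spectra statistic: a
# demodulated time series is split into N segments of length dT, each segment
# is FFTed, and the per-segment power spectra are summed. All parameters and
# the toy monochromatic signal are illustrative.
import numpy as np

fs = 1024.0                      # sampling rate [Hz]
dT = 64.0                        # segment length Delta T [s]
N = 16                           # number of segments
t = np.arange(0, N * dT, 1.0 / fs)

f0 = 100.25                      # toy signal frequency after demodulation [Hz]
x = 0.1 * np.sin(2 * np.pi * f0 * t) + np.random.default_rng(1).normal(size=t.size)

seg_len = int(dT * fs)
segments = x[: N * seg_len].reshape(N, seg_len)

# FFT each segment, take the power, and sum the N spectra incoherently.
power = np.abs(np.fft.rfft(segments, axis=1)) ** 2
stacked = power.sum(axis=0)

freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
print("loudest stacked-power bin: %.3f Hz" % freqs[np.argmax(stacked[1:]) + 1])
```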

    Matched filtering of gravitational waves from inspiraling compact binaries: Computational cost and template placement

    We estimate the number of templates, computational power, and storage required for a one-step matched filtering search for gravitational waves from inspiraling compact binaries. These estimates should serve as benchmarks for the evaluation of more sophisticated strategies such as hierarchical searches. We use waveform templates based on the second post-Newtonian approximation for binaries composed of nonspinning compact bodies in circular orbits. We present estimates for six noise curves: LIGO (three configurations), VIRGO, GEO600, and TAMA. To search for binaries with components more massive than $0.2\,M_\odot$ while losing no more than 10% of events due to coarseness of template spacing, initial LIGO will require about $1\times 10^{11}$ flops (floating point operations per second) for data analysis to keep up with data acquisition. This is several times higher than estimated in previous work by Owen, in part because of the improved family of templates and in part because we use more realistic (higher) sampling rates. Enhanced LIGO, GEO600, and TAMA will require computational power similar to initial LIGO. Advanced LIGO will require $8\times 10^{11}$ flops, and VIRGO will require $5\times 10^{12}$ flops. If the templates are stored rather than generated as needed, storage requirements range from $1.5\times 10^{11}$ real numbers for TAMA to $6\times 10^{14}$ for VIRGO. We also sketch and discuss an algorithm for placing the templates in the parameter space. Comment: 15 pages, 4 figures, submitted to Phys. Rev.
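    The template-count estimates above follow from covering the parameter space with a lattice at a chosen minimal match. A minimal sketch of that bookkeeping is given below; the metric determinant and coordinate volume are illustrative placeholders, not the second post-Newtonian chirp-time metric computed in the paper.

```python
# Minimal sketch of the template-count bookkeeping for a hypercubic lattice of
# templates at a given minimal match. Numbers below are toy placeholders.
import numpy as np

def n_templates(det_g, coord_volume, minimal_match, dim):
    """Templates needed to cover the proper volume coord_volume * sqrt(det_g)
    with a hypercubic lattice whose worst-case mismatch is 1 - minimal_match."""
    dl = 2.0 * np.sqrt((1.0 - minimal_match) / dim)   # lattice spacing (proper units)
    return coord_volume * np.sqrt(det_g) / dl**dim

# Toy example: 2-D parameter space, constant metric, 3% maximum mismatch.
n = n_templates(det_g=1.0e4, coord_volume=25.0, minimal_match=0.97, dim=2)
print("estimated number of templates: %.2e" % n)
```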

    A Solution to the Galactic Foreground Problem for LISA

    Low-frequency gravitational wave detectors, such as the Laser Interferometer Space Antenna (LISA), will have to contend with large foregrounds produced by millions of compact galactic binaries. While these galactic signals are interesting in their own right, the unresolved component can obscure other sources. The science yield for the LISA mission can be improved if the brighter and more isolated foreground sources can be identified and regressed from the data. Since the signals overlap with one another, we are faced with a "cocktail party" problem of picking out individual conversations in a crowded room. Here we present and implement an end-to-end solution to the galactic foreground problem that is able to resolve tens of thousands of sources from across the LISA band. Our algorithm employs a variant of the Markov Chain Monte Carlo (MCMC) method, which we call the Blocked Annealed Metropolis-Hastings (BAM) algorithm. Following a description of the algorithm and its implementation, we give several examples ranging from searches for a single source to searches for hundreds of overlapping sources. Our examples include data sets from the first round of Mock LISA Data Challenges. Comment: 19 pages, 27 figures.
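    The blocked, annealed Metropolis-Hastings idea can be illustrated on a toy problem: update one source's parameter block at a time while cooling a temperature schedule. Everything in the sketch below (the sinusoid-plus-white-noise model, proposal scales, annealing schedule) is an assumption for illustration, not the LISA response model or tuning used by the BAM algorithm.

```python
# Hedged sketch of a blocked, annealed Metropolis-Hastings update: each block
# is one source's parameters, blocks are updated one at a time, and a simple
# temperature schedule anneals from hot to cold.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 2048)

def model(params):
    """params: array of shape (n_sources, 2) holding (amplitude, frequency)."""
    return sum(a * np.sin(2 * np.pi * f * t) for a, f in params)

true = np.array([[1.0, 12.0], [0.8, 31.0]])
data = model(true) + 0.5 * rng.normal(size=t.size)

def log_like(params, sigma=0.5):
    r = data - model(params)
    return -0.5 * np.sum(r**2) / sigma**2

params = np.array([[0.8, 11.5], [0.6, 31.5]])               # rough starting point
logl = log_like(params)

for step in range(20000):
    temperature = max(1.0, 10.0 * (1.0 - step / 15000.0))   # anneal from 10 to 1
    block = step % len(params)                              # update one source per step
    proposal = params.copy()
    proposal[block] += rng.normal(scale=[0.05, 0.05])
    logl_new = log_like(proposal)
    if np.log(rng.uniform()) < (logl_new - logl) / temperature:
        params, logl = proposal, logl_new

print("final chain state (amplitude, frequency) per source:\n", params)
```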

    A Proposed Search for the Detection of Gravitational Waves from Eccentric Binary Black Holes

    Most compact binary systems are expected to circularize before the frequency of their emitted gravitational waves (GWs) enters the sensitivity band of ground-based interferometric detectors. However, several mechanisms have been proposed for the formation of binary systems that retain eccentricity throughout their lifetimes. Since no matched-filtering algorithm has been developed to extract continuous GW signals from compact binaries on orbits with low to moderate eccentricity, and since the available algorithms for detecting binaries on quasi-circular orbits are sub-optimal for recovering these events, in this paper we propose a search method for the detection of gravitational waves produced by the coalescences of eccentric binary black holes (eBBH). We study the search sensitivity and the false alarm rates on a segment of data from the second joint science run of the LIGO and Virgo detectors, and discuss the implications of the eccentric binary search for the advanced GW detectors.
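    The abstract does not spell out how the false alarm rates were measured; one standard technique in LIGO-Virgo searches is to time-shift the data between detectors and count accidental coincidences. The sketch below illustrates that generic time-slide bookkeeping on toy trigger lists; it is an assumption for illustration, not the procedure of the eBBH search itself.

```python
# Hedged sketch of a time-slide background estimate: shift one detector's
# triggers by unphysical offsets and count accidental coincidences above a
# combined-SNR threshold. Trigger lists, slide step, and threshold are toys.
import numpy as np

rng = np.random.default_rng(5)

# Toy single-detector trigger times (s) and SNRs over a 10^5 s segment.
T_obs = 1.0e5
t_h = np.sort(rng.uniform(0, T_obs, 500)); snr_h = rng.exponential(2.0, 500) + 5.0
t_l = np.sort(rng.uniform(0, T_obs, 500)); snr_l = rng.exponential(2.0, 500) + 5.0

coinc_window = 0.015          # s, of order the inter-site light travel time
slide_step = 100.0            # s, much larger than any real coincidence
n_slides = 200

def count_coincs(shift, threshold):
    """Count coincident trigger pairs above a combined-SNR threshold after
    circularly shifting the second detector's triggers in time."""
    t_shifted = (t_l + shift) % T_obs
    n = 0
    for t, s in zip(t_h, snr_h):
        mask = np.abs(t_shifted - t) < coinc_window
        if np.any(np.sqrt(s**2 + snr_l[mask]**2) > threshold):
            n += 1
    return n

threshold = 10.0
background = sum(count_coincs(k * slide_step, threshold) for k in range(1, n_slides + 1))
far = background / (n_slides * T_obs)     # false alarms per second at this threshold
print("estimated false alarm rate: %.2e Hz" % far)
```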

    Sliding coherence window technique for hierarchical detection of continuous gravitational waves

    A novel hierarchical search technique is presented for all-sky surveys for continuous gravitational-wave sources, such as rapidly spinning nonaxisymmetric neutron stars. Analyzing yearlong detector data sets over realistic ranges of parameter space using fully coherent matched filtering is computationally prohibitive, so more efficient, so-called hierarchical techniques are essential. Traditionally, the standard hierarchical approach consists of dividing the data into nonoverlapping segments, each of which is coherently analyzed, after which the matched-filter outputs from all segments are combined incoherently. The present work proposes to break the data into subsegments shorter than the desired maximum coherence time span (the size of the coherence window). Matched-filter outputs from the different subsegments are then efficiently combined by sliding the coherence window in time: subsegments whose timestamps are closer than the coherence-window size are combined coherently, otherwise incoherently. Compared to the standard scheme at the same coherence time baseline, data sets longer by about 50-100% would have to be analyzed to achieve the same search sensitivity as with the sliding coherence window approach. Numerical simulations attest to the analytically estimated improvement. Comment: 11 pages, 4 figures.
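    The combination rule is easy to prototype: per-subsegment matched-filter outputs are summed coherently inside a window that slides in steps of one subsegment, and the resulting window powers are added incoherently. The numbers and the flat toy signal in the sketch below are illustrative assumptions, not the detection statistic defined in the paper.

```python
# Hedged sketch of a sliding-coherence-window combination: complex per-
# subsegment matched-filter outputs are summed coherently within one coherence
# window, and the window powers are then added incoherently.
import numpy as np

rng = np.random.default_rng(3)

T_sub = 900.0                     # subsegment length [s]
T_coh = 3600.0                    # coherence window [s]
n_sub = 64
timestamps = T_sub * np.arange(n_sub)

# Toy complex matched-filter outputs: a common signal phase plus unit noise.
signal = 0.7 * np.exp(1j * 0.3)
x = signal + (rng.normal(size=n_sub) + 1j * rng.normal(size=n_sub)) / np.sqrt(2)

per_window = int(round(T_coh / T_sub))      # subsegments per coherence window

# Slide the window one subsegment at a time: coherent sum inside the window...
coherent_sums = np.array([x[i:i + per_window].sum()
                          for i in range(n_sub - per_window + 1)])
# ...then combine the window powers incoherently.
detection_stat = np.sum(np.abs(coherent_sums) ** 2)

print("sliding-window statistic:", detection_stat)
```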

    Gravitational waves: search results, data analysis and parameter estimation

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters and 34 posters. The talks and posters covered a wide range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate-mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data of the advanced detector network. This included methods to identify deviations of the signals from what is expected in the context of General Relativity.

    Use of the MultiNest algorithm for gravitational wave data analysis

    We describe an application of the MultiNest algorithm to gravitational wave data analysis. MultiNest is a multimodal nested sampling algorithm designed to efficiently evaluate the Bayesian evidence and return posterior probability densities for likelihood surfaces containing multiple secondary modes. The algorithm employs a set of live points which are updated by partitioning the set into multiple overlapping ellipsoids and sampling uniformly from within them. This set of live points climbs up the likelihood surface through nested iso-likelihood contours, and the evidence and posterior distributions can be recovered from the evolution of the point set. The algorithm is model-independent in the sense that the specific problem being tackled enters only through the likelihood computation and does not change how the live point set is updated. In this paper, we consider the use of the algorithm for gravitational wave data analysis by searching a simulated LISA data set containing two non-spinning supermassive black hole binary signals. The algorithm is able to rapidly identify all the modes of the solution and recover the true parameters of the sources to high precision. Comment: 18 pages, 4 figures, submitted to Class. Quantum Grav.; v2 includes various changes in light of the referee's comments.
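    For orientation, the sketch below implements a bare-bones nested sampling loop on a toy bimodal likelihood. The naive draw-from-the-prior replacement step is precisely where MultiNest instead samples from multiple overlapping ellipsoids fitted to the live points; the likelihood, prior box, and live-point count here are illustrative assumptions, not the LISA analysis described in the paper.

```python
# Bare-bones nested sampling sketch on a toy bimodal 2-D likelihood. Real
# MultiNest replaces the naive draw-from-the-prior step below with ellipsoidal
# sampling; all numbers here are illustrative only.
import numpy as np

rng = np.random.default_rng(4)

def log_like(theta):
    # Two well-separated Gaussian peaks (width^2 = 0.25).
    p1 = -0.5 * np.sum((theta - np.array([2.0, 2.0]))**2) / 0.25
    p2 = -0.5 * np.sum((theta + np.array([2.0, 2.0]))**2) / 0.25
    return np.logaddexp(p1, p2)

n_live, n_iter = 200, 1200
live = rng.uniform(-5.0, 5.0, size=(n_live, 2))       # uniform prior box
live_logl = np.array([log_like(p) for p in live])

log_z = -np.inf                                       # evidence accumulator
log_x = 0.0                                           # log remaining prior volume

for i in range(n_iter):
    worst = np.argmin(live_logl)
    log_x_new = -(i + 1) / n_live                     # expected volume shrinkage
    log_w = np.log(np.exp(log_x) - np.exp(log_x_new)) + live_logl[worst]
    log_z = np.logaddexp(log_z, log_w)
    log_x = log_x_new

    # Replace the worst live point with a prior draw above its likelihood
    # (naive rejection; MultiNest's ellipsoidal sampling makes this efficient).
    while True:
        candidate = rng.uniform(-5.0, 5.0, size=2)
        if log_like(candidate) > live_logl[worst]:
            live[worst], live_logl[worst] = candidate, log_like(candidate)
            break

# Add the contribution of the remaining live points at termination.
lmax = live_logl.max()
log_z = np.logaddexp(log_z, log_x + np.log(np.mean(np.exp(live_logl - lmax))) + lmax)
print("log-evidence estimate:", log_z)
```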