
    A Virtual Data Grid for LIGO

    GriPhyN (Grid Physics Network) is a large US collaboration building grid services for large physics experiments, one of which is LIGO, a gravitational-wave observatory. This paper explains the physics and computing challenges of LIGO, and the tools that GriPhyN will build to address them. A key component needed to implement the data pipeline is a virtual data service: a system that dynamically creates the data products requested during the various stages. A requested product may already have been processed in the desired way, it may sit in a file on a storage system, it may be cached, or it may need to be created through computation. The full elaboration of this system will allow complex data pipelines to be set up as virtual data objects, with existing data being transformed in diverse ways.
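
    As a rough illustration of the resolution logic such a virtual data service implies, the Python sketch below checks a cache, then persistent storage, and only then computes and caches the product. The paths and function names are hypothetical, not GriPhyN's actual interfaces.

        from pathlib import Path

        # Hypothetical resolution order for a virtual data service
        # (illustrative only, not GriPhyN's API).
        CACHE = Path("/tmp/vds-cache")      # assumed cache location
        STORAGE = Path("/data/products")    # assumed storage system

        def materialize(name, transform, *inputs):
            """Return a path to the named product: cached, stored, or computed."""
            cached = CACHE / name
            if cached.exists():             # 1. already cached?
                return cached
            stored = STORAGE / name
            if stored.exists():             # 2. already in a file on storage?
                return stored
            CACHE.mkdir(parents=True, exist_ok=True)
            cached.write_bytes(transform(*inputs))  # 3. create through computation
            return cached

    Because each product is addressed by name, chained calls of this kind suffice to express a whole pipeline as virtual data objects.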

    A First Comparison Between LIGO and Virgo Inspiral Search Pipelines

    This article reports on a project that is the first step the LIGO Scientific Collaboration and the Virgo Collaboration have taken to prepare for a joint search for inspiral signals. The project compared the analysis pipelines of the two collaborations on data sets prepared by both sides, containing simulated noise and injected events. The ability of the pipelines to detect the injected events was checked, and a first comparison of how the parameters of the events were recovered has been completed.
    Comment: GWDAW-9 proceedings
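
    The found/missed bookkeeping behind such an injection study can be sketched as below: an injection counts as detected if the pipeline produced a trigger within a small time window of it. The 50 ms window and the function name are illustrative assumptions, not the collaborations' actual criteria.

        import numpy as np

        def injection_efficiency(injected_times, trigger_times, window=0.05):
            """Fraction of injections with a trigger within `window` seconds.

            Simplified found/missed logic; assumes at least one trigger.
            """
            injected = np.asarray(injected_times)
            triggers = np.sort(np.asarray(trigger_times))
            idx = np.searchsorted(triggers, injected)
            below = triggers[np.clip(idx - 1, 0, len(triggers) - 1)]
            above = triggers[np.clip(idx, 0, len(triggers) - 1)]
            nearest = np.minimum(np.abs(injected - below), np.abs(injected - above))
            return float(np.mean(nearest <= window))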

    Data Access for LIGO on the OSG

    During 2015 and 2016, the Laser Interferometer Gravitational-Wave Observatory (LIGO) conducted a three-month observing campaign. These observations delivered the first direct detection of gravitational waves from binary black hole mergers. To search for these signals, the LIGO Scientific Collaboration uses the PyCBC search pipeline. To deliver science results in a timely manner, LIGO collaborated with the Open Science Grid (OSG) to distribute the required computation across a series of dedicated, opportunistic, and allocated resources. To deliver the petabytes of data necessary for such a large-scale computation, our team deployed a distributed data access infrastructure based on the XRootD server suite and the CernVM File System (CVMFS). This data access strategy grew from simply accessing remote storage to a POSIX-based interface underpinned by distributed, secure caches across the OSG.
    Comment: 6 pages, 3 figures, submitted to PEARC17
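
    Schematically, the end state described, a POSIX-like interface with a remote fallback, might look like the sketch below. The repository path and redirector URL are placeholders, not the production LIGO/OSG endpoints.

        import os

        # Illustrative endpoints: the CVMFS repository and XRootD redirector
        # below are placeholders, not the production LIGO/OSG names.
        POSIX_ROOT = "/cvmfs/ligo.example.org/frames"
        XROOTD_URL = "root://redirector.example.org//frames"

        def open_frame(relpath):
            """Prefer the POSIX path exposed by CVMFS; fall back to XRootD."""
            local = os.path.join(POSIX_ROOT, relpath)
            if os.path.exists(local):
                return open(local, "rb")   # CVMFS behaves like a local filesystem
            from XRootD import client      # requires the xrootd Python bindings
            remote = client.File()
            status, _ = remote.open(f"{XROOTD_URL}/{relpath}")
            if not status.ok:
                raise IOError(status.message)
            return remote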

    SciTokens: Capability-Based Secure Access to Remote Scientific Data

    The management of security credentials (e.g., passwords, secret keys) for computational science workflows is a burden for scientists and information security officers. Problems with credentials (e.g., expiration, privilege mismatch) cause workflows to fail to fetch needed input data or store valuable scientific results, distracting scientists from their research by requiring them to diagnose the problems, re-run their computations, and wait longer for their results. In this paper, we introduce SciTokens, open source software to help scientists manage their security credentials more reliably and securely. We describe the SciTokens system architecture, design, and implementation, addressing use cases from the Laser Interferometer Gravitational-Wave Observatory (LIGO) Scientific Collaboration and the Large Synoptic Survey Telescope (LSST) projects. We also present our integration with widely used software that supports distributed scientific computing, including HTCondor, CVMFS, and XRootD. SciTokens uses IETF-standard OAuth tokens for capability-based secure access to remote scientific data. The access tokens convey the specific authorizations needed by the workflows, rather than general-purpose authentication impersonation credentials, to address the risks of scientific workflows running on distributed infrastructure including NSF resources (e.g., LIGO Data Grid, Open Science Grid, XSEDE) and public clouds (e.g., Amazon Web Services, Google Cloud, Microsoft Azure). By improving the interoperability and security of scientific workflows, SciTokens 1) enables use of distributed computing for scientific domains that require greater data protection and 2) enables use of more widely distributed computing resources by reducing the risk of credential abuse on remote systems.
    Comment: 8 pages, 6 figures, PEARC '18: Practice and Experience in Advanced Research Computing, July 22--26, 2018, Pittsburgh, PA, USA
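
    A toy sketch of the capability-token idea using plain JWTs (PyJWT): the scope claim grants a specific permission such as read access to one directory tree, and tokens expire quickly. Real SciTokens deployments use the scitokens library and asymmetric (RS256) issuer keys rather than the shared secret used here for brevity.

        import time
        import jwt  # PyJWT; production SciTokens use the scitokens library

        # Demo only: real SciTokens are signed with asymmetric (RS256) keys,
        # not a shared secret, and carry additional claims.
        SECRET = "demo-secret"

        def issue_token(subject, scope, issuer, lifetime=600):
            """Mint a short-lived capability token in the SciTokens style."""
            now = int(time.time())
            return jwt.encode(
                {
                    "sub": subject,
                    "scope": scope,        # e.g. "read:/ligo/frames": a capability,
                    "iss": issuer,         # not a general identity assertion
                    "iat": now,
                    "exp": now + lifetime, # short lifetime limits credential abuse
                },
                SECRET,
                algorithm="HS256",
            )

        def authorize(token, required_scope, issuer):
            """Accept the request only if the token grants the capability."""
            claims = jwt.decode(token, SECRET, algorithms=["HS256"], issuer=issuer)
            return required_scope in claims["scope"].split()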

    BOSS-LDG: A Novel Computational Framework that Brings Together Blue Waters, Open Science Grid, Shifter and the LIGO Data Grid to Accelerate Gravitational Wave Discovery

    We present a novel computational framework that connects Blue Waters, the NSF-supported, leadership-class supercomputer operated by NCSA, to the Laser Interferometer Gravitational-Wave Observatory (LIGO) Data Grid via Open Science Grid technology. To enable this computational infrastructure, we configured, for the first time, a LIGO Data Grid Tier-1 Center that can submit heterogeneous LIGO workflows using Open Science Grid facilities. In order to enable a seamless connection between the LIGO Data Grid and Blue Waters via Open Science Grid, we utilize Shifter to containerize LIGO's workflow software. This work represents the first time Open Science Grid, Shifter, and Blue Waters are unified to tackle a scientific problem and, in particular, it is the first time a framework of this nature is used in the context of large-scale gravitational wave data analysis. This new framework has been used in the last several weeks of LIGO's second discovery campaign to run the most computationally demanding gravitational wave search workflows on Blue Waters, and to accelerate discovery in the emergent field of gravitational wave astrophysics. We discuss the implications of this novel framework for a wider ecosystem of High Performance Computing users.
    Comment: 10 pages, 10 figures. Accepted as a Full Research Paper to the 13th IEEE International Conference on eScience
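
    Schematically, running one containerized workflow step under Shifter could be wrapped as below; the image name and command are placeholders, not the actual BOSS-LDG configuration.

        import subprocess

        # Placeholder image and command, not the actual BOSS-LDG setup.
        IMAGE = "docker:ligo/workflow:latest"

        def run_containerized(cmd):
            """Run one workflow step inside a Shifter container on a compute node."""
            return subprocess.run(["shifter", f"--image={IMAGE}", *cmd], check=True)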

    Best network chirplet-chain: Near-optimal coherent detection of unmodeled gravitational wave chirps with a network of detectors

    Searches for impulsive gravitational waves (GW) in the data of ground-based interferometers focus essentially on two types of waveforms: short unmodeled bursts and chirps from inspiralling compact binaries. There is room for other types of searches based on different models. Our objective is to fill this gap. More specifically, we are interested in GW chirps with an arbitrary phase/frequency vs. time evolution. These unmodeled GW chirps may be considered as the generic signature of orbiting/spinning sources. We expect the quasi-periodic nature of the waveform to be preserved independently of the physics which governs the source motion. Several methods have been introduced to address the detection of unmodeled chirps using the data of a single detector, including the best chirplet chain (BCC) algorithm introduced by the authors. In the coming years, several detectors will be in operation. The joint coherent analysis of GW by multiple detectors can improve the detection horizon, the estimation of the source location, and the wave polarization angles. Here, we extend the BCC search to the multiple-detector case. The method amounts to searching for salient paths in the combined time-frequency representation of two synthetic streams. The latter are time series which combine the data from each detector linearly, in such a way that all the GW signatures received are added constructively. We give a proof of principle of the full-sky blind search in a simplified situation, showing that the joint estimation of the source sky location and chirp frequency is possible.
    Comment: 22 pages, revtex4, 6 figures
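
    A simplified sketch of the synthetic-stream construction: for a trial sky position, the antenna-pattern matrix is factored (here via an SVD, a common stand-in for the paper's construction) to obtain the linear combinations in which the GW signatures add constructively, and their spectrogram energies are summed into the combined time-frequency map that the chirplet-chain search then traverses. This illustrates the idea, not the authors' implementation.

        import numpy as np
        from scipy.signal import stft

        def synthetic_streams(data, fplus, fcross):
            """Combine detector strains so the GW signatures add constructively.

            `data` is (n_det, n_samples) of whitened strains; `fplus`/`fcross`
            hold each detector's antenna-pattern response for a trial sky
            position. An SVD of the response matrix gives the two dominant
            combinations (a simplified stand-in for the paper's construction).
            """
            F = np.column_stack([fplus, fcross])   # (n_det, 2) response matrix
            U, _, _ = np.linalg.svd(F, full_matrices=False)
            return U.T @ data                      # two synthetic time series

        def combined_tf_map(streams, fs):
            """Sum of spectrogram energies; the search then traces salient paths."""
            total = 0.0
            for s in streams:
                _, _, Z = stft(s, fs=fs, nperseg=256)
                total = total + np.abs(Z) ** 2
            return total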

    Improving the efficiency of the detection of gravitational wave signals from inspiraling compact binaries: Chebyshev interpolation

    Inspiraling compact binaries are promising sources of gravitational waves for ground- and space-based laser interferometric detectors. The time-dependent signature of these sources in the detectors is a well-characterized function of a relatively small number of parameters; thus, the favored analysis technique makes use of matched filtering and maximum likelihood methods. Current analysis methodology samples the matched filter output at parameter values chosen so that the correlation between successive samples is 97%. Here we describe a straightforward and practical way of using interpolation to take advantage of the correlation between the matched filter output associated with nearby points in the parameter space, significantly reducing the number of matched filter evaluations without sacrificing the efficiency with which real signals are recognized. Because the computational cost of the analysis is driven almost exclusively by the matched filter evaluations, this translates directly into an increase in computational efficiency, which in turn translates into an increase in the size of the parameter space that can be analyzed and, thus, the science that can be accomplished with the data. As a demonstration, we compare the present "dense sampling" analysis methodology with our proposed "interpolation" methodology, restricted to one dimension of the multi-dimensional analysis problem. We find that the interpolated search reduces by 25% the number of filter evaluations required by the dense search (with 97% correlation) to achieve the same detection efficiency at a given false alarm probability. Generalizing to the higher-dimensional parameter space of a generic binary, including spins, suggests an order-of-magnitude increase in computational efficiency.
    Comment: 23 pages, 5 figures, submitted to Phys. Rev.
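
    The core trick can be illustrated in one dimension with NumPy's Chebyshev utilities: evaluate the expensive matched filter only at Chebyshev nodes, then maximize the cheap interpolant instead of the filter itself. The degree and grid size below are illustrative choices, not the paper's tuned values.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        def interpolated_maximum(matched_filter, a, b, deg=16):
            """Peak of a matched-filter output over one parameter, via interpolation.

            The costly filter is evaluated only at deg + 1 Chebyshev nodes;
            the resulting interpolant is maximized on a fine, cheap grid.
            """
            unit = C.chebpts1(deg + 1)                 # nodes on [-1, 1]
            params = (unit + 1) * (b - a) / 2 + a      # nodes mapped to [a, b]
            samples = np.array([matched_filter(p) for p in params])  # costly step
            coeffs = C.chebfit(unit, samples, deg)     # exact interpolation
            grid = np.linspace(-1, 1, 4001)            # cheap dense evaluation
            values = C.chebval(grid, coeffs)
            k = int(np.argmax(values))
            return (grid[k] + 1) * (b - a) / 2 + a, float(values[k])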