3,597,438 research outputs found

    Software for Data Analysis


    A GPU-based survey for millisecond radio transients using ARTEMIS

    Astrophysical radio transients are excellent probes of extreme physical processes originating from compact sources within our Galaxy and beyond. Radio frequency signals emitted from these objects provide a means to study the intervening medium through which they travel. Next-generation radio telescopes are designed to explore the vast unexplored parameter space of high time resolution astronomy, but require High Performance Computing (HPC) solutions to process the enormous volumes of data they produce. We have developed a combined software/hardware solution (code-named ARTEMIS) for real-time searches for millisecond radio transients, which uses GPU technology to remove interstellar dispersion and detect millisecond radio bursts from astronomical sources in real time. Here we present an introduction to ARTEMIS. We give a brief overview of the software pipeline, then focus specifically on the intricacies of performing incoherent de-dispersion. We present results from two brute-force algorithms. The first is a GPU-based algorithm designed to exploit the L1 cache of the NVIDIA Fermi GPU. Our second algorithm is CPU-based and exploits the new AVX units in Intel Sandy Bridge CPUs. Comment: 4 pages, 7 figures. To appear in the proceedings of ADASS XXI, ed. P. Ballester and D. Egret, ASP Conf. Ser.
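    As a concrete illustration of the incoherent de-dispersion step discussed above, the sketch below shifts each frequency channel by its dispersion delay and sums across channels for a grid of trial dispersion measures. It is a plain NumPy sketch of the brute-force sum, not the ARTEMIS GPU or AVX implementation; the band, sampling time, and DM grid are assumed values for illustration.

```python
# Brute-force incoherent de-dispersion: for every trial DM, shift each
# frequency channel by its dispersion delay and sum across channels.
# Shapes, frequencies and the DM grid are illustrative assumptions.
import numpy as np

K_DM = 4.148808e3  # dispersion constant in MHz^2 pc^-1 cm^3 s

def dedisperse(block, freqs_mhz, t_samp_s, dm_trials):
    """block: (n_chan, n_samp) filterbank data -> (n_dm, n_samp) time series."""
    f_ref = freqs_mhz.max()                       # highest channel as reference
    out = np.zeros((len(dm_trials), block.shape[1]))
    for i, dm in enumerate(dm_trials):
        # per-channel delay (in samples) relative to the reference frequency
        delays = K_DM * dm * (freqs_mhz**-2 - f_ref**-2) / t_samp_s
        for chan, shift in enumerate(np.rint(delays).astype(int)):
            # align the delayed low-frequency signal with the high-frequency one
            out[i] += np.roll(block[chan], -shift)  # roll wraps; real code would truncate
    return out

# toy usage: 64 channels over 1200-1400 MHz, 64 us sampling, DMs 0-95 pc cm^-3
rng = np.random.default_rng(0)
data = rng.normal(size=(64, 4096))
freqs = np.linspace(1400.0, 1200.0, 64)
series = dedisperse(data, freqs, 64e-6, np.arange(0.0, 100.0, 5.0))
```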

    Software Challenges For HL-LHC Data Analysis

    The high energy physics community is discussing where investment is needed to prepare software for the HL-LHC and its unprecedented challenges. The ROOT project has been one of the central software players in high energy physics for decades. From its experience and expectations, the ROOT team has distilled a comprehensive set of areas that should see research and development in the context of data analysis software, to make the best use of the HL-LHC's physics potential. This work shows what these areas could be, why the ROOT team believes investing in them is needed, what gains are expected, and where related work is ongoing. It can serve as a guide for future research proposals and collaborations.

    Field tests for the ESPRESSO data analysis software

    The data analysis software (DAS) for VLT ESPRESSO aims to set a new benchmark in the treatment of spectroscopic data towards the extremely-large-telescope era, providing carefully designed, fully interactive recipes to take care of complex analysis operations (e.g. radial velocity estimation in stellar spectra, interpretation of the absorption features in quasar spectra). A few months away from the instrument's first light, the DAS is now mature for science validation, with most algorithms already implemented and operational. In this paper, I will showcase the DAS features that are currently employed on high-resolution HARPS and UVES spectra to assess the scientific reliability of the recipes and their range of application. I will give a glimpse of the science that will be possible when ESPRESSO data become available, with a particular focus on the novel approach that has been adopted to simultaneously fit the emission continuum and the absorption lines in the Lyman-alpha forest of quasar spectra. Comment: 4 pages, 1 figure; proceedings of ADASS XXVI, accepted by ASP Conference Series.
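    The simultaneous continuum-plus-absorption fit mentioned above can be illustrated with a toy model: instead of normalizing the continuum first and fitting lines afterwards, a single model with continuum and line parameters is fitted to the flux in one pass. The linear continuum, single Gaussian line, and synthetic spectrum below are illustrative assumptions, not the actual ESPRESSO DAS recipe.

```python
# Joint fit of a sloped continuum and one Gaussian absorption line
# (toy stand-in for the simultaneous continuum/line fitting idea).
import numpy as np
from scipy.optimize import curve_fit

def model(wave, c0, c1, depth, centre, sigma):
    continuum = c0 + c1 * (wave - wave.mean())                          # assumed linear continuum
    line = 1.0 - depth * np.exp(-0.5 * ((wave - centre) / sigma) ** 2)  # Gaussian absorption
    return continuum * line

# synthetic spectrum: one absorption line on a gently sloped continuum
rng = np.random.default_rng(1)
wave = np.linspace(4000.0, 4100.0, 500)                                 # Angstrom, assumed
flux = model(wave, 1.0, 2e-3, 0.6, 4050.0, 1.5) + rng.normal(scale=0.02, size=wave.size)

popt, _ = curve_fit(model, wave, flux, p0=[1.0, 0.0, 0.5, 4049.0, 1.0])
print("fitted centre %.2f A, depth %.2f" % (popt[3], popt[2]))
```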

    Making Software Cost Data Available for Meta-Analysis

    In this paper we consider the increasing need for meta-analysis within empirical software engineering. However, we also note that a necessary precondition for such forms of analysis is to have both the results in an appropriate format and sufficient contextual information to avoid misleading inferences. We consider the implications for the field of software project effort estimation and show that, for a sample of 12 seemingly similar published studies, the results are difficult to compare, let alone combine, because of differing reporting conventions. We argue that a reporting protocol is required and make some suggestions as to what it should contain.
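    One way to see why a shared reporting protocol helps is that, when studies publish raw per-project actual and estimated effort rather than only a summary statistic of their own choosing, any accuracy measure can be recomputed consistently across studies. The sketch below derives three commonly reported measures from raw data; the two small data sets are invented for illustration and are not from the 12 studies discussed in the paper.

```python
# Recomputing common effort-estimation accuracy measures from raw data,
# so that studies reporting different summaries become comparable.
import numpy as np

def accuracy_measures(actual, estimated):
    mre = np.abs(actual - estimated) / actual     # magnitude of relative error
    return {
        "MMRE": mre.mean(),                       # mean MRE
        "MdMRE": np.median(mre),                  # median MRE
        "Pred(25)": np.mean(mre <= 0.25),         # share of projects within 25%
    }

# invented per-project effort data (person-hours) for two hypothetical studies
study_a = (np.array([120.0, 80.0, 300.0]), np.array([100.0, 95.0, 250.0]))
study_b = (np.array([40.0, 60.0, 90.0, 150.0]), np.array([55.0, 48.0, 100.0, 120.0]))

for name, (act, est) in {"study A": study_a, "study B": study_b}.items():
    print(name, accuracy_measures(act, est))
```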

    Data Analysis Software for the ESPRESSO Science Machine

    ESPRESSO is an extremely stable high-resolution spectrograph currently being developed for the ESO VLT. With its groundbreaking characteristics, it is intended to be a "science machine", i.e., a fully integrated instrument to directly extract science information from the observations. In particular, ESPRESSO will be the first ESO instrument to be equipped with a dedicated tool for the analysis of data, the Data Analysis Software (DAS), consisting of a number of recipes to analyze both stellar and quasar spectra. Through the new ESO Reflex GUI, the DAS (which will implement new algorithms to analyze quasar spectra) aims to overcome the shortcomings of the existing software by providing multiple iteration modes and full interactivity with the data. Comment: 5 pages, 2 figures; proceedings of ADASS XXI.

    Software reliability experiments data analysis and investigation

    The objectives are to investigate the fundamental reasons why independently developed software programs fail dependently, and to examine fault-tolerant software structures which maximize the reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures, and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently of the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
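    The two fault-tolerant structures compared in the experiment can be sketched in a few lines: N-version programming votes over the outputs of independently developed versions, while a recovery block runs versions in order until an independent acceptance test passes. The toy sorting versions and acceptance test below are invented for illustration; the experiment itself used 20 real redundant programs.

```python
# Minimal sketches of N-version voting and a recovery block structure.
from collections import Counter

def n_version(versions, data):
    # run every version and accept the majority output, if one exists
    results = [tuple(v(data)) for v in versions]
    value, votes = Counter(results).most_common(1)[0]
    return list(value) if votes > len(versions) // 2 else None

def recovery_block(versions, acceptance_test, data):
    # try versions in order; the first output passing the acceptance test wins
    for v in versions:
        result = v(data)
        if acceptance_test(data, result):
            return result
    return None

def acceptance_test(data, result):
    # weaker, version-independent check: same length and non-decreasing order
    return len(result) == len(data) and all(a <= b for a, b in zip(result, result[1:]))

versions = [
    lambda xs: sorted(xs),                    # correct version
    lambda xs: sorted(xs, reverse=True),      # faulty version (wrong order)
    lambda xs: sorted(xs),                    # another correct version
]

print(n_version(versions, [3, 1, 2]))                        # [1, 2, 3] by majority vote
print(recovery_block(versions, acceptance_test, [3, 1, 2]))  # [1, 2, 3] via acceptance test
```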

    The Need for a Versioned Data Analysis Software Environment

    Scientific results in high-energy physics and in many other fields often rely on complex software stacks. In order to support reproducibility and scrutiny of the results, it is good practice to use open source software and to cite software packages and versions. With the ever-growing complexity of scientific software on one side and IT life cycles of only a few years on the other, however, it turns out that, despite source code availability, setting up and validating a minimal usable analysis environment can easily become prohibitively expensive. We argue that there is a substantial gap between merely having access to versioned source code and the ability to create a data analysis runtime environment. In order to preserve all the different variants of the data analysis runtime environment, we developed a snapshotting file system optimized for software distribution. We report on our experience in preserving analysis environments for high-energy physics, such as the software landscape used to discover the Higgs boson at the Large Hadron Collider.
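    The core idea behind a snapshotting file system for software distribution, as described above, is that each published software version becomes an immutable catalog of paths pointing at content-addressed objects, so files that do not change between versions are stored only once. The toy class below illustrates only this de-duplicated snapshot idea; the class name, paths, and contents are invented and do not reflect the design of the authors' production system.

```python
# Toy content-addressed snapshot store: one immutable catalog per version,
# with identical file contents de-duplicated across versions.
import hashlib

class SnapshotStore:
    def __init__(self):
        self.objects = {}    # content hash -> file bytes, stored once
        self.snapshots = {}  # version tag -> {path: content hash}

    def snapshot(self, tag, tree):
        catalog = {}
        for path, content in tree.items():
            digest = hashlib.sha1(content).hexdigest()
            self.objects.setdefault(digest, content)   # unchanged files reuse old objects
            catalog[path] = digest
        self.snapshots[tag] = catalog                   # immutable view of this version

    def read(self, tag, path):
        return self.objects[self.snapshots[tag][path]]

store = SnapshotStore()
store.snapshot("analysis-v1", {"lib/libroot.so": b"binary-v5", "env/setup.sh": b"export X=1"})
store.snapshot("analysis-v2", {"lib/libroot.so": b"binary-v5", "env/setup.sh": b"export X=2"})
print(len(store.objects))                      # 3 objects for 4 files: the library is shared
print(store.read("analysis-v1", "env/setup.sh").decode())
```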