
    SHARP: Automated monitoring of spacecraft health and status

    Briefly discussed here is the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described that seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager encounter with Neptune in August 1989.
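    As a concrete illustration of the kind of manual limit checking that SHARP was built to automate, the sketch below flags telemetry values that leave static alarm bounds. The channel names and limits are hypothetical, and this is not SHARP's actual reasoning machinery (which applies AI techniques on top of such checks):

    ```python
    # Static alarm limits per telemetry channel: (low, high).
    # All channel names and limit values here are invented for illustration.
    LIMITS = {
        "downlink_snr_db": (2.0, 30.0),
        "twta_temp_c": (-10.0, 55.0),
        "agc_level_db": (-160.0, -110.0),
    }

    def check_frame(frame):
        """Return a list of (channel, value, reason) alarms for one telemetry frame."""
        alarms = []
        for channel, value in frame.items():
            low, high = LIMITS[channel]
            if value < low:
                alarms.append((channel, value, "below low limit"))
            elif value > high:
                alarms.append((channel, value, "above high limit"))
        return alarms

    frame = {"downlink_snr_db": 1.2, "twta_temp_c": 41.0, "agc_level_db": -120.0}
    alarms = check_frame(frame)
    # downlink_snr_db is below its low limit; the other channels are nominal.
    ```

    An automated monitor of this style only detects violations; the value of a system like SHARP is in the analysis layered on top, diagnosing what a violation means for spacecraft health.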

    Data-driven detection of multi-messenger transients

    The primary challenge in the study of explosive astrophysical transients is their detection and characterisation using multiple messengers. For this purpose, we have developed a new data-driven discovery framework based on deep learning. We demonstrate its use for searches involving neutrinos, optical supernovae, and gamma rays. We show that we can match or substantially improve upon the performance of state-of-the-art techniques, while significantly minimising the dependence on modelling and on instrument characterisation. In particular, our approach is intended for near- and real-time analyses, which are essential for effective follow-up of detections. Our algorithm is designed to combine a range of instruments and types of input data, representing different messengers, physical regimes, and temporal scales. The methodology is optimised for agnostic searches of unexpected phenomena, and has the potential to substantially enhance their discovery prospects. Comment: 16 pages
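    The abstract gives no implementation details, but a classical baseline for combining evidence from independent messengers is Fisher's method for merging per-instrument p-values. The sketch below shows that baseline, not the paper's deep-learning approach; the p-values are invented:

    ```python
    import math

    def fisher_combine(p_values):
        """Combine independent per-messenger p-values with Fisher's method.
        Returns the chi-squared test statistic and its degrees of freedom."""
        stat = -2.0 * sum(math.log(p) for p in p_values)
        return stat, 2 * len(p_values)

    # Hypothetical p-values from a neutrino cluster, an optical transient,
    # and a gamma-ray excess observed in coincidence.
    stat, dof = fisher_combine([0.01, 0.05, 0.20])
    # The combined statistic is referred to a chi-squared distribution with
    # 2k degrees of freedom (k = number of messengers).
    ```

    A deep-learning framework like the one described replaces this hand-built combination with a learned one, which is how it can reduce dependence on per-instrument modelling.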

    Evolution of the Reactor Antineutrino Flux and Spectrum at Daya Bay

    The Daya Bay experiment has observed correlations between reactor core fuel evolution and changes in the reactor antineutrino flux and energy spectrum. Four antineutrino detectors in two experimental halls were used to identify 2.2 million inverse beta decays (IBDs) over 1230 days spanning multiple fuel cycles for each of six 2.9 GW$_{\textrm{th}}$ reactor cores at the Daya Bay and Ling Ao nuclear power plants. Using detector data spanning effective $^{239}$Pu fission fractions, $F_{239}$, from 0.25 to 0.35, Daya Bay measures an average IBD yield, $\bar{\sigma}_f$, of $(5.90 \pm 0.13) \times 10^{-43}$ cm$^2$/fission and a fuel-dependent variation in the IBD yield, $d\sigma_f/dF_{239}$, of $(-1.86 \pm 0.18) \times 10^{-43}$ cm$^2$/fission. This observation rejects the hypothesis of a constant antineutrino flux as a function of the $^{239}$Pu fission fraction at 10 standard deviations. The variation in IBD yield was found to be energy-dependent, rejecting the hypothesis of a constant antineutrino energy spectrum at 5.1 standard deviations. While measurements of the evolution in the IBD spectrum show general agreement with predictions from recent reactor models, the measured evolution in total IBD yield disagrees with recent predictions at $3.1\sigma$. This discrepancy indicates that an overall deficit in measured flux with respect to predictions does not result from equal fractional deficits from the primary fission isotopes $^{235}$U, $^{239}$Pu, $^{238}$U, and $^{241}$Pu. Based on measured IBD yield variations, yields of $(6.17 \pm 0.17)$ and $(4.27 \pm 0.26) \times 10^{-43}$ cm$^2$/fission have been determined for the two dominant fission parent isotopes $^{235}$U and $^{239}$Pu. A 7.8% discrepancy between the observed and predicted $^{235}$U yield suggests that this isotope may be the primary contributor to the reactor antineutrino anomaly. Comment: 7 pages, 5 figures
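    The quoted central values imply a simple linear model for the IBD yield as a function of the fission fraction. The sketch below evaluates it at the endpoints of the measured range; the pivot value `F239_bar` is an assumed illustrative mean fission fraction, not a number from the abstract:

    ```python
    # Central values quoted in the abstract.
    sigma_bar = 5.90e-43        # cm^2/fission, average IBD yield
    dsigma_dF = -1.86e-43       # cm^2/fission per unit F239, yield slope
    F239_bar = 0.2985           # assumed mean 239Pu fission fraction (illustrative)

    def ibd_yield(F239):
        """Linear evolution model: sigma_f(F) = sigma_bar + (dsigma/dF)*(F - F_bar)."""
        return sigma_bar + dsigma_dF * (F239 - F239_bar)

    # Evaluate at the endpoints of the measured fission-fraction range.
    low, high = ibd_yield(0.25), ibd_yield(0.35)
    # The yield falls as the 239Pu fraction grows across a fuel cycle,
    # which is the fuel-dependent variation the measurement rejects a
    # constant flux against.
    ```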

    Realtime market microstructure analysis: online Transaction Cost Analysis

    Motivated by the practical challenge of monitoring the performance of a large number of algorithmic trading orders, this paper provides a methodology that leads to automatic discovery of the causes behind poor trading performance. It also gives theoretical foundations to a generic framework for real-time trading analysis. The academic literature provides different ways to formalize these algorithms and to show how optimal they can be from a mean-variance, stochastic control, impulse control, or statistical learning viewpoint. This paper is agnostic about the way the algorithm has been built and provides a theoretical formalism to identify in real time the market conditions that influenced its efficiency or inefficiency. For a given set of characteristics describing the market context, selected by a practitioner, we first show how a set of additional derived explanatory factors, called anomaly detectors, can be created for each market order. We then present an online methodology to quantify how this extended set of factors, at any given time, predicts which of the orders are underperforming, while calculating the predictive power of this explanatory factor set. Armed with this information, which we call influence analysis, we intend to empower the order-monitoring user to take appropriate action on any affected orders: re-calibrating the trading algorithms working the order with new parameters, pausing their execution, or taking more direct trading control. We also intend for this method to be used in post-trade analysis of algorithms, to adjust their trading action automatically. Comment: 33 pages, 12 figures
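    A minimal sketch of the "anomaly detector" idea, assuming a detector is simply a z-score of a market-context feature against its recent history (the paper's actual construction may differ; the feature names and values are invented):

    ```python
    import statistics

    def anomaly_detectors(history, current):
        """Turn raw market-context features into anomaly detectors:
        z-scores of the current value against each feature's recent history.
        A large |z| flags an unusual market condition for that feature."""
        detectors = {}
        for name, values in history.items():
            mu = statistics.fmean(values)
            sd = statistics.pstdev(values) or 1.0   # guard against zero variance
            detectors[name] = (current[name] - mu) / sd
        return detectors

    history = {
        "spread_bps": [1.0, 1.2, 0.9, 1.1, 1.0],
        "volatility": [0.8, 0.9, 1.0, 0.9, 0.85],
    }
    current = {"spread_bps": 2.4, "volatility": 0.9}
    z = anomaly_detectors(history, current)
    # spread_bps is far from its recent mean; volatility is roughly typical.
    ```

    In the paper's influence-analysis setting, such detectors would then be scored online for how well they predict which orders underperform.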

    Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations

    We describe work in progress by a collaboration of astronomers and statisticians developing a suite of Bayesian data analysis tools for extrasolar planet (exoplanet) detection, planetary orbit estimation, and adaptive scheduling of observations. Our work addresses analysis of stellar reflex motion data, where a planet is detected by observing the "wobble" of its host star as it responds to the gravitational tug of the orbiting planet. Newtonian mechanics specifies an analytical model for the resulting time series, but it is strongly nonlinear, yielding complex, multimodal likelihood functions; it is even more complex when multiple planets are present. The parameter spaces range in size from a few dimensions to dozens of dimensions, depending on the number of planets in the system and the type of motion measured (line-of-sight velocity, or position on the sky). Since orbits are periodic, Bayesian generalizations of periodogram methods facilitate the analysis. This relies on the model being linearly separable, enabling partial analytical marginalization and reducing the dimension of the parameter space. Subsequent analysis uses adaptive Markov chain Monte Carlo methods and adaptive importance sampling to perform the integrals required for both inference (planet detection and orbit measurement) and information-maximizing sequential design (for adaptive scheduling of observations). We present an overview of our current techniques and highlight directions being explored by ongoing research. Comment: 29 pages, 11 figures. An abridged version is accepted for publication in Statistical Methodology for a special issue on astrostatistics, with selected (refereed) papers presented at the Astronomical Data Analysis Conference (ADA VI) held in Monastir, Tunisia, in May 2010. Update corrects equation (3).
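    The linear separability the authors exploit can be illustrated with a classical least-squares periodogram: at each trial period the sinusoidal amplitudes enter the model linearly, so they can be fit (or, in the Bayesian version, marginalized) analytically, leaving only the period to search over. A stdlib-only sketch on noise-free synthetic radial-velocity data (not the authors' code):

    ```python
    import math

    def periodogram_power(t, v, periods):
        """Least-squares periodogram: for each trial period P, fit the
        linearly separable model v ~ a*sin(2*pi*t/P) + b*cos(2*pi*t/P)
        (after removing the mean) and score P by the reduction in the
        residual sum of squares."""
        mean_v = sum(v) / len(v)
        y = [vi - mean_v for vi in v]
        powers = {}
        for P in periods:
            s = [math.sin(2 * math.pi * ti / P) for ti in t]
            c = [math.cos(2 * math.pi * ti / P) for ti in t]
            # 2x2 normal equations for the linear amplitudes a, b.
            Sss = sum(si * si for si in s)
            Scc = sum(ci * ci for ci in c)
            Ssc = sum(si * ci for si, ci in zip(s, c))
            Ssy = sum(si * yi for si, yi in zip(s, y))
            Scy = sum(ci * yi for ci, yi in zip(c, y))
            det = Sss * Scc - Ssc * Ssc
            a = (Ssy * Scc - Scy * Ssc) / det
            b = (Scy * Sss - Ssy * Ssc) / det
            powers[P] = a * Ssy + b * Scy   # RSS reduction from the sinusoid fit
        return powers

    # Synthetic stellar "wobble" with a 7-day period (no noise, for clarity).
    t = list(range(20))
    v = [3.0 * math.sin(2 * math.pi * ti / 7.0) for ti in t]
    powers = periodogram_power(t, v, [3, 5, 7, 11])
    best = max(powers, key=powers.get)   # the 7-day trial period wins
    ```

    The Bayesian generalizations described above replace the RSS-reduction score with an analytically marginalized likelihood, but the dimension reduction works the same way: only the nonlinear parameters (here, the period) remain to be explored by MCMC or importance sampling.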