
    Tree Buffers

    In runtime verification, the central problem is to decide whether a given program execution violates a given property. In online runtime verification, a monitor observes a program’s execution as it happens. If the program being observed has hard real-time constraints, then the monitor inherits them. In the presence of hard real-time constraints it becomes a challenge to maintain enough information to produce error traces, should a property violation be observed. In this paper we introduce a data structure, called a tree buffer, that solves this problem in the context of automata-based monitors: if the monitor itself respects hard real-time constraints, then enriching it with tree buffers makes it possible to provide error traces, which are essential for diagnosing defects. We show that tree buffers are also useful in other application domains. For example, they can be used to implement the functionality of capturing groups in regular expressions. We prove optimal asymptotic bounds for our data structure and validate them using empirical data from two sources: regular-expression searching through Wikipedia, and runtime verification of execution traces obtained from the DaCapo test suite.
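    The core idea can be sketched roughly as follows: each monitor step records an observed event as a child of the node the monitor currently points to, and an error trace is recovered by walking up the last h ancestors of a node. This is an illustrative sketch only; the class and method names are our assumptions, and the paper's actual structure additionally garbage-collects unreachable nodes to achieve its optimal asymptotic bounds, which this sketch omits.

    ```python
    class Node:
        """One observed event; the tree is held together by parent pointers."""
        __slots__ = ("label", "parent")

        def __init__(self, label, parent=None):
            self.label = label
            self.parent = parent

    class TreeBuffer:
        def add_child(self, parent, label):
            # Record an event as a child of `parent`; pass parent=None for the root.
            return Node(label, parent)

        def history(self, node, h):
            # Return the labels of `node` and up to h-1 of its ancestors,
            # oldest first -- the error trace ending at `node`.
            out = []
            while node is not None and len(out) < h:
                out.append(node.label)
                node = node.parent
            return out[::-1]
    ```

    A monitor with several live states can hold pointers into the same tree, so shared prefixes of candidate error traces are stored only once.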

    Effect of high temperature heat treatments on the quality factor of a large-grain superconducting radio-frequency niobium cavity

    Large-grain Nb has become a viable alternative to fine-grain Nb for the fabrication of superconducting radio-frequency cavities. In this contribution we report the results from a heat-treatment study of a large-grain 1.5 GHz single-cell cavity made of "medium purity" Nb. The baseline surface preparation prior to heat treatment consisted of standard buffered chemical polishing. The heat treatment in the range 800–1400 °C was done in a newly designed vacuum induction furnace. Q0 values of the order of 2×10^10 at 2.0 K and a peak surface magnetic field (Bp) of 90 mT were achieved reproducibly. A Q0 value of (5±1)×10^10 at 2.0 K and Bp = 90 mT was obtained after heat treatment at 1400 °C. This is the highest value ever reported at this temperature, frequency and field. Samples heat treated with the cavity at 1400 °C were analyzed by secondary ion mass spectrometry, secondary electron microscopy, energy-dispersive X-ray analysis, point-contact tunneling and X-ray diffraction, and revealed a complex surface composition which includes titanium oxide and increased carbon and nitrogen content but reduced hydrogen concentration compared to a non-heat-treated sample.

    Pushing the limits of the NuSTAR detectors

    NuSTAR (the Nuclear Spectroscopic Telescope ARray) is a NASA Small Explorer (SMEX) mission launched in June 2012. Since its launch, NuSTAR has been the preeminent instrument for spectroscopic analysis of the hard X-ray sky over the 3–80 keV bandpass. The low-energy side of the bandpass is limited by absorption along the photon path as well as by the ability of the pixels to trigger on incident photons. The on-board calibration source does not have a low-energy line that we can use to calibrate this part of the response, so instead we use the "nearest-neighbor" readout in the NuSTAR detector architecture to calibrate the individual pixel thresholds for all 8 flight detectors on both focal plane modules (FPMs). These threshold measurements feed back into the quantum efficiency of the detectors at low (<5 keV) energies and, once well calibrated, may allow the use of NuSTAR data below the current 3 keV limit.

    Design and analysis of randomized clinical trials requiring prolonged observation of each patient. I. Introduction and design.

    The Medical Research Council has for some years encouraged collaborative clinical trials in leukaemia and other cancers, reporting the results in the medical literature. One unreported result which deserves such publication is the development of the expertise to design and analyse such trials. This report was prepared by a group of British and American statisticians, but it is intended for people without any statistical expertise. Part I, which appears in this issue, discusses the design of such trials; Part II, which will appear separately in the January 1977 issue of the Journal, gives full instructions for the statistical analysis of such trials by means of life tables and the logrank test, including a worked example, and discusses the interpretation of trial results, including brief reports of 2 particular trials. Both parts of this report are relevant to all clinical trials which study time to death, and would be equally relevant to clinical trials which study time to other particular classes of untoward event: first stroke, perhaps, or first relapse, metastasis, disease recurrence, thrombosis, transplant rejection, or death from a particular cause. Part I, in this issue, collects together ideas that have mostly already appeared in the medical literature, but Part II, next month, is the first simple account yet published for non-statistical physicians of how to analyse efficiently data from clinical trials of survival duration. Such trials include the majority of all clinical trials of cancer therapy; in cancer trials, however, it may be preferable to use these statistical methods to study time to local recurrence of tumour, or to study time to detectable metastatic spread, in addition to studying total survival.
Solid tumours can be staged at diagnosis; if this, or any other available information in some other disease, is an important determinant of outcome, it can be used to make the overall logrank test for the whole heterogeneous trial population more sensitive, and more intuitively satisfactory, for it will then only be necessary to compare like with like, and not, by chance, Stage I with Stage III.

    Design and analysis of randomized clinical trials requiring prolonged observation of each patient. II. analysis and examples.

    Part I of this report appeared in the previous issue (Br. J. Cancer (1976) 34, 585), and discussed the design of randomized clinical trials. Part II now describes efficient methods of analysis of randomized clinical trials in which we wish to compare the duration of survival (or the time until some other untoward event first occurs) among different groups of patients. It is intended to enable physicians without statistical training either to analyse such data themselves using life tables, the logrank test and retrospective stratification, or, when such analyses are presented, to appreciate them more critically, but the discussion may also be of interest to statisticians who have not yet specialized in clinical trial analyses.
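    The two-group logrank test that Parts I and II centre on compares, at each observed death time, the deaths seen in one group against the deaths expected if both groups shared the same risk. A minimal sketch of that calculation is below; the function name and data layout are illustrative assumptions, not the paper's worked example, and real analyses would use an established survival-analysis package.

    ```python
    def logrank(times_a, events_a, times_b, events_b):
        """Two-group logrank chi-squared statistic (1 degree of freedom).

        times_*: observation times; events_*: 1 = death observed, 0 = censored.
        """
        data = [(t, e, "A") for t, e in zip(times_a, events_a)] + \
               [(t, e, "B") for t, e in zip(times_b, events_b)]
        death_times = sorted({t for t, e, _ in data if e == 1})
        obs_a = exp_a = var = 0.0
        for t in death_times:
            at_risk = [(tt, e, g) for tt, e, g in data if tt >= t]
            n = len(at_risk)                                   # patients still at risk
            n_a = sum(1 for _, _, g in at_risk if g == "A")
            d = sum(e for tt, e, _ in at_risk if tt == t)      # deaths at time t
            d_a = sum(e for tt, e, g in at_risk if tt == t and g == "A")
            obs_a += d_a
            exp_a += d * n_a / n              # expected group-A deaths at time t
            if n > 1:                         # hypergeometric variance term
                var += d * (n_a / n) * (1 - n_a / n) * (n - d) / (n - 1)
        return (obs_a - exp_a) ** 2 / var
    ```

    The running sums of observed and expected deaths are exactly the life-table bookkeeping the report teaches; the final statistic is referred to a chi-squared table with one degree of freedom.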

    Revealing the spectral state transition of the Clocked Burster, GS 1826-238 with NuSTAR StrayCats

    We present a long-term analysis of GS 1826-238, a neutron star X-ray binary known as the "Clocked Burster", using data from NuSTAR StrayCats. StrayCats, a catalogue of NuSTAR stray light data, contains data from bright, off-axis X-ray sources that have not been focused by the NuSTAR optics. We obtained stray light observations of the source from 2014-2021 and reduced and analyzed the data using the nustar-gen-utils Python tools, demonstrating the transition of the source from the "island" atoll state to a "banana" branch. We also present a lightcurve analysis of Type I X-ray bursts from the Clocked Burster and show that bursts from the banana/soft state are systematically shorter in duration than those from the island/hard state and have a higher burst fluence. From our analysis, we note an increase in the mass accretion rate of the source and a decrease in burst frequency with the transition.
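    The two burst quantities compared in the abstract, duration and fluence, can be extracted from a background-subtracted burst lightcurve along these lines. This is our illustrative sketch, not the authors' pipeline: the threshold-fraction definition of duration and the trapezoidal fluence integral are common conventions we assume here.

    ```python
    def burst_stats(times, rates, frac=0.1):
        """Duration and fluence of one Type I X-ray burst.

        times: time stamps of lightcurve bins (same units throughout);
        rates: background-subtracted count rates per bin;
        frac:  duration is measured where the rate exceeds frac * peak.
        """
        peak = max(rates)
        above = [i for i, r in enumerate(rates) if r >= frac * peak]
        i0, i1 = above[0], above[-1]
        duration = times[i1] - times[i0]
        # Trapezoidal integral of rate over the burst interval -> fluence
        # (in counts; multiply by an energy conversion for physical units).
        fluence = sum((rates[i] + rates[i + 1]) / 2 * (times[i + 1] - times[i])
                      for i in range(i0, i1))
        return duration, fluence
    ```

    With such a per-burst measure in hand, the soft-state versus hard-state comparison reduces to collecting these pairs for each burst and contrasting the two populations.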