
    Performance of the distributed central analysis in BaBar

    The total dataset produced by the BaBar experiment at the Stanford Linear Accelerator Center (SLAC) currently comprises roughly $3 \times 10^9$ data events and an equal number of simulated events, corresponding to 23 Tbytes of real data and 51 Tbytes of simulated events. Since individual analyses typically select a very small fraction of all events, it would be extremely inefficient if each analysis had to process the full dataset. A first, centrally managed analysis step is therefore a common pre-selection ('skimming') of all data according to very loose, inclusive criteria to facilitate data access for later analysis. Usually, there are common selection criteria for several analyses. However, they may change over time, e.g., when new analyses are developed. Currently, …
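
    As a rough illustration of what such centrally managed skimming amounts to, the sketch below routes each event through a set of loose, inclusive skim predicates so that later analyses read only small pre-selected streams. The event fields and predicates are hypothetical, not BaBar's actual skim definitions.

        # Minimal skimming sketch: apply every loose, inclusive predicate to
        # each event once; later analyses read only the matching streams.
        def skim(events, selections):
            streams = {name: [] for name in selections}
            for event in events:
                for name, passes in selections.items():
                    if passes(event):
                        streams[name].append(event)
            return streams

        # Two illustrative (hypothetical) skim criteria.
        selections = {
            "two_prong": lambda e: e["n_tracks"] == 2,
            "high_e_gamma": lambda e: e["max_gamma_e"] > 1.0,  # GeV
        }
        events = [{"n_tracks": 2, "max_gamma_e": 0.3},
                  {"n_tracks": 5, "max_gamma_e": 2.1}]
        print({name: len(s) for name, s in skim(events, selections).items()})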

    Neutrino Detection with Inclined Air Showers

    The possibilities of detecting high energy neutrinos through inclined showers produced in the atmosphere are addressed, with an emphasis on the detection of air showers by arrays of particle detectors. Rates of inclined showers produced both by down-going neutrino interactions and by up-coming $\tau$ decays from Earth-skimming neutrinos are calculated as a function of shower energy with analytical methods, using two sample neutrino fluxes with different spectral indices. The relative contributions from different flavors and from charged-current, neutral-current and resonant interactions are compared for down-going neutrinos interacting in the atmosphere. No detailed description of detectors is attempted, but rough energy thresholds are implemented to establish the ranges of energies most suitable for neutrino detection through inclined showers. Down-going and up-coming rates are compared. Comment: Submitted to New Journal of Physics
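
    The rate estimate described here amounts to folding a power-law neutrino flux with an energy-dependent acceptance above a rough threshold. The snippet below is a numerical stand-in for that integral (the paper's treatment is analytical); the flux normalization, spectral indices and toy aperture are placeholder assumptions, not the paper's inputs.

        # Events per unit time from a power-law flux norm * E^(-gamma) folded
        # with a toy aperture above a rough threshold (all values illustrative).
        from scipy.integrate import quad

        def rate(norm, gamma, aperture, e_min=1e9, e_max=1e12):
            """Integral of flux(E) * aperture(E) dE, with E in GeV."""
            value, _ = quad(lambda e: norm * e ** (-gamma) * aperture(e),
                            e_min, e_max)
            return value

        # Toy aperture rising with energy; two sample spectral indices.
        aperture = lambda e: 1e-2 * (e / 1e9) ** 0.5
        for gamma in (1.5, 2.0):
            print(gamma, rate(norm=1e-8, gamma=gamma, aperture=aperture))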

    How the Experts Algorithm Can Help Solve LPs Online

    We consider the problem of solving packing/covering LPs online, when the columns of the constraint matrix are presented in random order. This problem has received much attention, and the main focus is to figure out how large the right-hand sides of the LPs have to be (compared to the entries on the left-hand side of the constraints) to allow $(1+\epsilon)$-approximations online. It is known that the right-hand sides have to be $\Omega(\epsilon^{-2} \log m)$ times the left-hand sides, where $m$ is the number of constraints. In this paper we give a primal-dual algorithm that achieves this bound for mixed packing/covering LPs. Our algorithms construct dual solutions using a regret-minimizing online learning algorithm in a black-box fashion, and use them to construct primal solutions. The adversarial guarantee that holds for the constructed duals helps us to take care of most of the correlations that arise in the algorithm; the remaining correlations are handled via martingale concentration and maximal inequalities. These ideas lead to conceptually simple and modular algorithms, which we hope will be useful in other contexts. Comment: An extended abstract appears in the 22nd European Symposium on Algorithms (ESA 2014).
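
    The "experts algorithm" of the title is a regret-minimizing online learner of the kind sketched below: a multiplicative-weights (Hedge) update that maintains one weight per constraint and can be queried for a normalized dual vector. This is only a sketch of the black box under assumed names; the primal-dual LP machinery around it is omitted.

        import numpy as np

        class Hedge:
            """Multiplicative-weights 'experts' learner over m constraints."""
            def __init__(self, m, eta):
                self.w = np.ones(m)   # one expert (dual weight) per constraint
                self.eta = eta

            def duals(self):
                return self.w / self.w.sum()   # normalized dual vector

            def update(self, gains):
                # Upweight constraints with larger violation ("gain"), which
                # is what steers the primal updates toward feasibility.
                self.w *= np.exp(self.eta * gains)

        learner = Hedge(m=3, eta=0.1)
        for _ in range(100):
            y = learner.duals()                 # query current duals
            gains = np.array([0.5, 1.0, 0.1])   # toy per-constraint violations
            learner.update(gains)
        print(learner.duals())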

    Transport innovation and areal association in the Manawatu dairy industry : the role of transport from before 1880 to the present day and the impact of innovation in the areal association between supplier and factory and between factory and factory : a thesis presented in partial fulfilment of the requirements for the degree of Master of Arts in Geography at Massey University

    For the New Zealand dairy industry, "the principal - one might say the only important disadvantage - was the obstacle of distance...." (Philpott, 1937:11). Although concerned here with the difficulties of overseas transport (he suggested that time and invention had largely overcome the obstacles of distance), the comment is equally applicable to the difficulties of internal transport. Transport is an important element in dairying but appears to have attracted little attention from researchers. A review of the history of dairying reveals a series of development phases, each of which appears related to transport developments. The first part of this thesis, then, is an historical review of the period from before 1880 to the present day, with particular emphasis upon transport methods and innovations. Emphasis has been given, however, to developments at the factory rather than the farm level. From a consideration of these historical developments, it becomes increasingly evident that each phase has been associated with distinctive patterns of land use and the development of specialised dairying "regions".

    Hierarchical video summarisation in reference frame subspace

    In this paper, a hierarchical video structure summarization approach using Laplacian Eigenmap is proposed, where a small set of reference frames is selected from the video sequence to form a reference subspace in which the dissimilarity between two arbitrary frames is measured. In the proposed summarization scheme, shot-level key frames are first detected from the continuity of inter-frame dissimilarity, and sub-shot-level and scene-level representative frames are then summarized using k-means clustering. Experiments were carried out on both test videos and movies, and the results show that, in comparison with a similar approach using latent semantic analysis, the proposed approach using Laplacian Eigenmap achieves a better recall rate in key-frame detection and gives an efficient hierarchical summarization at the sub-shot, shot and scene levels.
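
    A rough sketch of that pipeline, assuming per-frame feature vectors are already available: embed the frames with a spectral (Laplacian Eigenmap) embedding, flag shot-level key frames where consecutive-frame dissimilarity jumps, and cluster the embedded frames with k-means for higher-level representatives. The features, threshold and cluster count below are placeholder assumptions, not the paper's settings.

        import numpy as np
        from sklearn.manifold import SpectralEmbedding
        from sklearn.cluster import KMeans

        frames = np.random.rand(200, 64)        # toy per-frame features

        # Low-dimensional reference subspace (Laplacian Eigenmap).
        embedded = SpectralEmbedding(n_components=3).fit_transform(frames)

        # Shot-level key frames: jumps in consecutive-frame dissimilarity.
        d = np.linalg.norm(np.diff(embedded, axis=0), axis=1)
        key_frames = np.where(d > d.mean() + 2 * d.std())[0] + 1

        # Scene-level representatives: frame nearest each k-means centroid.
        km = KMeans(n_clusters=5, n_init=10).fit(embedded)
        reps = [int(np.argmin(np.linalg.norm(embedded - c, axis=1)))
                for c in km.cluster_centers_]
        print(key_frames, sorted(reps))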

    Black Holes from Cosmic Rays: Probes of Extra Dimensions and New Limits on TeV-Scale Gravity

    If extra spacetime dimensions and low-scale gravity exist, black holes will be produced in observable collisions of elementary particles. For the next several years, ultra-high energy cosmic rays provide the most promising window on this phenomenon. In particular, cosmic neutrinos can produce black holes deep in the Earth's atmosphere, leading to quasi-horizontal giant air showers. We determine the sensitivity of cosmic ray detectors to black hole production and compare the results to other probes of extra dimensions. With n ≥ 4 extra dimensions, current bounds on deeply penetrating showers from AGASA already provide the most stringent bound on low-scale gravity, requiring a fundamental Planck scale M_D > 1.3 - 1.8 TeV. The Auger Observatory will probe M_D as large as 4 TeV and may observe on the order of a hundred black holes in 5 years. We also consider the implications of angular momentum and possible exponentially suppressed parton cross sections; including these effects, large black hole rates are still possible. Finally, we demonstrate that even if only a few black hole events are observed, a standard model interpretation may be excluded by comparison with Earth-skimming neutrino rates. Comment: 30 pages, 18 figures; v2: discussion of gravitational infall, AGASA and Fly's Eye comparison added; v3: Earth-skimming results modified and strengthened, published version.
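
    Rate estimates of this kind rest on the parton-level geometric cross section σ ≈ π r_s², with r_s the Schwarzschild radius of a (4+n)-dimensional black hole. The sketch below evaluates this under a commonly used convention for r_s; both the convention and the sample masses are assumptions for illustration, not the paper's benchmark points.

        import math

        GEV2_TO_PB = 3.894e8   # 1 GeV^-2 in picobarns

        def schwarzschild_radius(m_bh, m_d, n):
            """r_s in GeV^-1 for n extra dimensions (common convention)."""
            k = 8.0 * math.gamma((n + 3) / 2.0) / (n + 2)
            return (k * m_bh / m_d) ** (1.0 / (n + 1)) / (math.sqrt(math.pi) * m_d)

        def sigma_pb(m_bh, m_d, n):
            """Geometric parton cross section pi * r_s^2, converted to pb."""
            return math.pi * schwarzschild_radius(m_bh, m_d, n) ** 2 * GEV2_TO_PB

        for n in (4, 5, 6, 7):                  # masses in GeV, illustrative
            print(n, sigma_pb(m_bh=5000.0, m_d=1500.0, n=n))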

    Use of mathematical derivatives (time-domain differentiation) on chromatographic data to enhance the detection and quantification of an unknown 'rider' peak

    Two samples of an anticancer prodrug, AQ4N, were submitted for HPLC assay and showed an unidentified impurity that eluted as a 'rider' on the tail of the main peak. Mathematical differentiation of the chromatograms offered several advantages over conventional skimmed integration. A combination of the second-derivative amplitude and simple linear regression gave a novel method for estimating the true peak area of the impurity peak. All the calculation steps were carried out using a widely available spreadsheet program.
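
    A minimal sketch of the derivative trick on synthetic data: take a smoothed second derivative of the chromatogram, read off the rider's negative-lobe amplitude, and calibrate amplitude against known peak size by simple linear regression (here in Python rather than a spreadsheet). Peak shapes, positions and the smoothing window are made-up assumptions.

        import numpy as np
        from scipy.signal import savgol_filter

        t = np.linspace(0, 10, 2000)            # retention time, minutes
        gauss = lambda a, mu, s: a * np.exp(-0.5 * ((t - mu) / s) ** 2)

        def rider_amplitude(signal):
            """Second-derivative amplitude in the rider window (5-7 min)."""
            d2 = savgol_filter(signal, window_length=51, polyorder=3, deriv=2)
            sel = (t > 5) & (t < 7)
            return -d2[sel].min()   # depth of the negative lobe at the apex

        # Calibration: known rider sizes vs. second-derivative amplitudes.
        sizes = np.array([0.02, 0.05, 0.10, 0.20])
        amps = [rider_amplitude(gauss(5.0, 4.0, 0.5) + gauss(a, 5.8, 0.15))
                for a in sizes]
        slope, intercept = np.polyfit(amps, sizes, 1)

        unknown = gauss(5.0, 4.0, 0.5) + gauss(0.08, 5.8, 0.15)
        print(slope * rider_amplitude(unknown) + intercept)  # ~0.08 expected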

    The Performance of Performance Standards

    This paper examines the performance of the performance system of the Job Training Partnership Act (JTPA), a widely emulated model for inducing efficiency in government organizations. We present a model of how performance incentives may distort bureaucratic decisions, and define cream skimming within the model. Two major empirical findings are (a) that the short-run measures used to monitor performance are weakly, and sometimes perversely, related to long-run impacts and (b) that the efficiency gains or losses from cream skimming are small. We find evidence that centers respond to performance standards.