
    BSAURUS- A Package For Inclusive B-Reconstruction in DELPHI

    BSAURUS is a software package for the inclusive reconstruction of B-hadrons in Z-decay events taken by the DELPHI detector at LEP. The goal of BSAURUS is to reconstruct B-decays with high efficiency and good purity by making use of as many properties of b-jets as possible. This is achieved by exploiting the capabilities of the DELPHI detector to the full, applying physics knowledge about B production and decay wherever possible, and combining the different information sources with modern tools, mainly artificial neural networks. This note provides a reference for how the BSAURUS outputs are formed, how to access them within the DELPHI framework, and the physics performance one can expect. Comment: 52 pages, 24 figures, added author Z.

    A multivariate approach to heavy flavour tagging with cascade training

    This paper compares the performance of artificial neural networks and boosted decision trees, with and without cascade training, for tagging b-jets in a collider experiment. It is shown, using a Monte Carlo simulation of $WH \to l\nu q\bar{q}$ events, that for a b-tagging efficiency of 50%, the light-jet rejection power given by boosted decision trees without cascade training is about 55% higher than that given by artificial neural networks. The cascade training technique can improve the performance of boosted decision trees and artificial neural networks at this b-tagging efficiency by about 35% and 80% respectively. We conclude that cascade-trained boosted decision trees are the most promising technique for tagging heavy flavours at collider experiments. Comment: 14 pages, 12 figures, revised version
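    The cascade idea described in this abstract, feeding the output of a first-stage classifier in as an extra input feature to a second stage trained on the remaining features, can be sketched with toy data. Everything below is an illustrative assumption, not the paper's setup: plain logistic regression stands in for the BDTs and neural networks, and the Gaussian "jet" features are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "jet" data: signal (b-jets) and background (light jets),
    # with three discriminating features split into two groups.
    n = 2000
    X_sig = rng.normal(loc=[1.0, 0.5, 1.0], scale=1.0, size=(n, 3))
    X_bkg = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(n, 3))
    X = np.vstack([X_sig, X_bkg])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    def train_logreg(X, y, lr=0.1, steps=500):
        """Logistic regression by gradient descent (stand-in for a BDT/NN)."""
        Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-Xb @ w))
            w -= lr * Xb.T @ (p - y) / len(y)
        return w

    def predict(w, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return 1.0 / (1.0 + np.exp(-Xb @ w))

    # Stage 1: train on the first feature group only.
    w1 = train_logreg(X[:, :2], y)
    s1 = predict(w1, X[:, :2])

    # Stage 2 (the cascade step): the stage-1 output becomes an
    # extra input alongside the remaining features.
    X2 = np.hstack([X[:, 2:], s1[:, None]])
    w2 = train_logreg(X2, y)
    s2 = predict(w2, X2)

    acc = ((s2 > 0.5) == y).mean()
    ```

    The point of the cascade is that stage 2 need only learn a correction on top of the stage-1 discriminant, rather than the full separation from scratch.
    
    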

    Highlights of the SLD Physics Program at the SLAC Linear Collider

    Starting in 1989, and continuing through the 1990s, high-energy physics witnessed a flowering of precision measurements in general and tests of the standard model in particular, led by e+e- collider experiments operating at the Z0 resonance. Key contributions to this work came from the SLD collaboration at the SLAC Linear Collider. By exploiting the unique capabilities of this pioneering accelerator and the SLD detector, including a polarized electron beam, exceptionally small beam dimensions, and a CCD pixel vertex detector, SLD produced a broad array of electroweak, heavy-flavor, and QCD measurements. Many of these results are one of a kind or represent the world's standard in precision. This article reviews the highlights of the SLD physics program, with an eye toward associated advances in experimental technique, and the contribution of these measurements to our dramatically improved present understanding of the standard model and its possible extensions. Comment: To appear in 2001 Annual Review of Nuclear and Particle Science; 78 pages, 31 figures; A version with higher resolution figures can be seen at http://www.slac.stanford.edu/pubs/slacpubs/8000/slac-pub-8985.html; Second version incorporates minor changes to the text

    Totem: a case study in HEP

    It is shown that the neurochip \Totem{} is a viable solution for high-quality, real-time computational tasks in HEP, including event classification, triggering, and signal processing. The architecture of the chip is based on a "derivative-free" algorithm called Reactive Tabu Search (RTS), which performs well even with low-precision weights. ISA, VME, or PCI boards integrate the chip as a coprocessor in a host computer. This paper presents: 1) the state of the art and the next evolution of the \Totem{} design; 2) its application to the Higgs search at the LHC as an example. Comment: Latex, elsart.sty, 5 pages, talk presented by I. Lazzizzera at CHEP97 (Berlin, April 1997)
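    To illustrate the derivative-free training idea, here is a minimal tabu-search sketch that fits a tiny perceptron with low-precision (±1) weights on synthetic data. This is a toy: RTS proper additionally adapts the tabu-list length reactively, and the data, dimensions, and names below are invented, not the \Totem{} implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy task: recover a perceptron with +/-1 weights from labelled data,
    # training by local search instead of gradients.
    n, d = 200, 5
    true_w = np.array([1, -1, 1, 1, -1])
    X = rng.normal(size=(n, d))
    y = (X @ true_w > 0).astype(int)

    def error(w):
        """Fraction of misclassified points for weight vector w."""
        return float(((X @ w > 0).astype(int) != y).mean())

    w = rng.choice([-1, 1], size=d)
    tabu = []       # indices of recently flipped weights are forbidden moves
    tabu_len = 2
    best_w, best_err = w.copy(), error(w)

    for _ in range(50):
        # Evaluate every single-weight flip that is not on the tabu list.
        moves = [i for i in range(d) if i not in tabu]
        errs = []
        for i in moves:
            w_try = w.copy()
            w_try[i] = -w_try[i]
            errs.append(error(w_try))
        i_best = moves[int(np.argmin(errs))]
        w[i_best] = -w[i_best]      # take the best non-tabu move, even uphill
        tabu.append(i_best)
        tabu = tabu[-tabu_len:]     # short memory forces exploration
        if error(w) < best_err:
            best_w, best_err = w.copy(), error(w)
    ```

    Because only function evaluations are needed, the same loop works unchanged for the low-precision weights a hardware implementation favours, which is the property the abstract highlights.
    
    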

    Parameterized Machine Learning for High-Energy Physics

    We investigate a new structure for machine learning classifiers applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters. The physics parameters represent a smoothly varying learning task, and the resulting parameterized classifier can smoothly interpolate between them and replace sets of classifiers trained at individual values. This simplifies the training process and gives improved performance at intermediate values, even for complex problems requiring deep learning. Applications include tools parameterized in terms of theoretical model parameters, such as the mass of a particle, which allow for a single network to provide improved discrimination across a range of masses. This concept is simple to implement and allows for optimized interpolatable results. Comment: For submission to PR
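    A minimal sketch of the parameterization idea, under toy assumptions: a hypothesized "mass" parameter theta is appended to the input, the model is trained only at theta = 1 and theta = 3, and is then evaluated at the intermediate value theta = 2 that it never saw. The hand-crafted x·theta feature and plain logistic regression below stand in for the deep networks of the paper; all names and numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def make_data(theta, n=2000):
        """Toy signal peaked at 'mass' theta over a fixed background."""
        x_sig = rng.normal(theta, 1.0, n)
        x_bkg = rng.normal(0.0, 1.0, n)
        x = np.concatenate([x_sig, x_bkg])
        t = np.full(2 * n, float(theta))   # the parameter joins the inputs
        y = np.concatenate([np.ones(n), np.zeros(n)])
        return x, t, y

    def features(x, t):
        # The x*t interaction term lets a linear model shift its decision
        # boundary with theta, mimicking what a deep net learns on its own.
        return np.stack([x, t, x * t, np.ones_like(x)], axis=1)

    def train(X, y, lr=0.05, steps=2000):
        """Logistic regression by gradient descent."""
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X @ w))
            w -= lr * X.T @ (p - y) / len(y)
        return w

    # Train a single classifier at two parameter points only.
    xs, ts, ys = zip(*(make_data(th) for th in (1.0, 3.0)))
    X = features(np.concatenate(xs), np.concatenate(ts))
    y = np.concatenate(ys)
    w = train(X, y)

    # Evaluate at an intermediate, never-trained parameter value.
    x2, t2, y2 = make_data(2.0)
    p2 = 1.0 / (1.0 + np.exp(-features(x2, t2) @ w))
    acc = ((p2 > 0.5) == y2).mean()
    ```

    The single parameterized model replaces a set of per-mass classifiers: at test time one simply feeds in the theta of interest rather than loading a different network.
    
    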