
    Probabilistic Particle Flow Algorithm for High Occupancy Environment

    Algorithms based on the particle flow approach are becoming increasingly used in collider experiments because of their superior jet energy and missing energy resolution compared to traditional calorimeter-based measurements. Such methods have been shown to work well in environments with a low occupancy of particles per unit of calorimeter granularity. However, at higher instantaneous luminosity, or in detectors with coarse calorimeter segmentation, the overlap of calorimeter energy deposits from charged and neutral particles significantly complicates particle energy reconstruction, reducing the overall energy resolution of the method. We present a technique designed to resolve overlapping energy depositions of spatially close particles using a statistically consistent probabilistic procedure. The technique is nearly free of ad hoc corrections, improves energy resolution, and provides important new handles that can improve the sensitivity of physics analyses: the uncertainty of the jet energy on an event-by-event basis and an estimate of the probability of a given particle hypothesis for a given detector response. When applied to the reconstruction of hadronic jets produced in the decays of tau leptons using the CDF-II detector at Fermilab, the method demonstrated reliable and robust performance.
    Comment: Accepted by Nuclear Instruments and Methods.
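
    The "probability of a given particle hypothesis for a given detector response" mentioned above can be illustrated with a minimal Bayesian sketch. All response models and numbers below are hypothetical stand-ins, not taken from the paper:

```python
import math

def gaussian(x, mu, sigma):
    """Normal density, standing in for a single-particle detector response model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def hypothesis_probabilities(measured_e, hypotheses):
    """Posterior probability of each particle hypothesis for a measured
    calorimeter energy, assuming equal priors and Gaussian response models.
    `hypotheses` maps a label to (expected energy, resolution) in GeV."""
    likelihoods = {name: gaussian(measured_e, mu, sigma)
                   for name, (mu, sigma) in hypotheses.items()}
    total = sum(likelihoods.values())
    return {name: like / total for name, like in likelihoods.items()}

# Toy comparison: a 5 GeV deposit tested against two made-up hypotheses.
probs = hypothesis_probabilities(5.0, {"pi+": (6.0, 1.5), "photon": (4.8, 0.8)})
```

    A real implementation would use measured, non-Gaussian response templates and physics-motivated priors; the normalized-likelihood structure is the same.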

    Standard Model Higgs boson searches with the ATLAS detector at the Large Hadron Collider

    The investigation of the mechanism responsible for electroweak symmetry breaking is one of the most important tasks of the scientific program of the Large Hadron Collider. The experimental results of the search for the Standard Model Higgs boson with 1 to 2 fb^-1 of proton-proton collision data at sqrt(s) = 7 TeV recorded by the ATLAS detector are presented and discussed. No significant excess of events is found with respect to the expectations from Standard Model processes, and the production of a Higgs boson is excluded at 95% confidence level for the mass regions 144-232, 256-282 and 296-466 GeV.
    Comment: Proceedings of the Lepton Photon 2011 Conference, to appear in "Pramana - journal of physics". 11 pages, 13 figures.

    A data-driven method of pile-up correction for the substructure of massive jets

    We describe a method to measure and subtract the incoherent component of energy flow arising from multiple interactions from jet shape/substructure observables of ultra-massive jets. The amount subtracted is a function of the jet shape variable of interest and is not a universal property. Such a correction is expected to significantly reduce any bias in the corresponding distributions generated by the presence of multiple interactions, and to improve measurement resolution. Since in our method the correction is obtained from the data, it is not subject to uncertainties coming from the use of theoretical calculations and/or Monte Carlo event generators. We derive our correction method for the jet mass, angularity, and planar flow. We find these corrections to be in good agreement with data on massive jets observed by the CDF collaboration. Finally, we comment on the connection with the concepts of jet area and jet mass area.
    Comment: 7 pages and 3 figures, minor corrections.
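
    The idea of a data-driven pile-up subtraction can be sketched in a toy form: regress a jet-shape observable against the number of reconstructed vertices in data, then extrapolate each jet back to a reference multiplicity. This is only an illustration of the data-driven principle, not the paper's shape-dependent correction:

```python
def fit_pileup_slope(observable, n_vertices):
    """Least-squares slope of a jet-shape observable vs. the number of
    reconstructed vertices: a data-driven estimate of the contamination
    added per extra interaction."""
    n = len(observable)
    mx = sum(n_vertices) / n
    my = sum(observable) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(n_vertices, observable))
    var = sum((x - mx) ** 2 for x in n_vertices)
    return cov / var

def correct(observable, n_vertices, slope, n_ref=1):
    """Extrapolate each measurement back to a reference vertex multiplicity."""
    return [y - slope * (x - n_ref) for y, x in zip(observable, n_vertices)]

# Toy data where the observable grows linearly with vertex count (y = 10 + 2x).
slope = fit_pileup_slope([12.0, 16.0, 20.0], [1, 3, 5])
corrected = correct([12.0, 16.0, 20.0], [1, 3, 5], slope)
```

    Because both the slope and the reference point come from the data, no generator-level model enters the correction, which is the key property the abstract highlights.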

    Observation of Exclusive Gamma Gamma Production in p pbar Collisions at sqrt{s}=1.96 TeV

    We have observed exclusive \gamma\gamma production in proton-antiproton collisions at \sqrt{s} = 1.96 TeV, using data from 1.11 \pm 0.07 fb^{-1} of integrated luminosity taken by the Run II Collider Detector at Fermilab. We selected events with two electromagnetic showers, each with transverse energy E_T > 2.5 GeV and pseudorapidity |\eta| < 1.0, with no other particles detected in -7.4 < \eta < +7.4. The two showers have similar E_T and azimuthal angle separation \Delta\phi \sim \pi; 34 events have two charged-particle tracks, consistent with the QED process p \bar{p} \to p + e^+e^- + \bar{p} by two-photon exchange, while 43 events have no charged tracks. The number of these events that are exclusive \pi^0\pi^0 is consistent with zero and is < 15 at 95% C.L. The cross section for p \bar{p} \to p + \gamma\gamma + \bar{p} with |\eta(\gamma)| < 1.0 and E_T(\gamma) > 2.5 GeV is 2.48^{+0.40}_{-0.35} (stat) ^{+0.40}_{-0.51} (syst) pb.
    Comment: 7 pages, 4 figures.
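
    The event selection described above (two central EM showers, nothing else in the forward region, track count distinguishing the e+e- and \gamma\gamma samples) can be written as a toy filter. The function name and data layout are illustrative, not from the analysis code:

```python
def classify_exclusive_event(showers, other_etas, n_tracks):
    """Toy version of the selection above: exactly two EM showers with
    E_T > 2.5 GeV and |eta| < 1.0, nothing else detected in |eta| < 7.4.
    `showers` is a list of (et_gev, eta) pairs; `other_etas` lists the
    pseudorapidities of any other detected particles. Returns
    "gammagamma" for 0 tracks, "ee" for 2 tracks, else None."""
    if len(showers) != 2:
        return None
    if not all(et > 2.5 and abs(eta) < 1.0 for et, eta in showers):
        return None
    if any(abs(eta) < 7.4 for eta in other_etas):  # exclusivity veto
        return None
    if n_tracks == 0:
        return "gammagamma"
    if n_tracks == 2:
        return "ee"
    return None
```

    The two-track class serves as a control sample: the QED e+e- rate is calculable, which anchors the exclusive \gamma\gamma measurement.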

    Combined search for the standard model Higgs boson decaying to a bb pair using the full CDF data set

    We combine the results of searches for the standard model Higgs boson based on the full CDF Run II data set, obtained from sqrt(s) = 1.96 TeV p-pbar collisions at the Fermilab Tevatron and corresponding to an integrated luminosity of 9.45 fb^-1. The searches are conducted for Higgs bosons that are produced in association with a W or Z boson, have masses in the range 90-150 GeV/c^2, and decay into bb pairs. An excess of data is present that is inconsistent with the background prediction at the level of 2.5 standard deviations (the most significant local excess is 2.7 standard deviations).
    Comment: To be published in Phys. Rev. Lett. (v2 contains minor updates based on comments from PRL).
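
    The quoted "standard deviations" are the one-sided Gaussian significance of a p-value, related by p = (1/2) erfc(z / sqrt(2)). A small stdlib-only sketch of the inverse conversion (bisection is used here just to avoid a SciPy dependency):

```python
import math

def significance(p_value):
    """One-sided Gaussian significance z for a given p-value, inverting
    p = 0.5 * erfc(z / sqrt(2)) by bisection on z in [0, 10]."""
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2)) > p_value:
            lo = mid  # p at mid is still too large; need bigger z
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    For example, a p-value of about 0.006 corresponds to roughly the 2.5 standard deviations quoted above.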

    Search for Neutral Higgs Bosons in Events with Multiple Bottom Quarks at the Tevatron

    The combination of searches performed by the CDF and D0 collaborations at the Fermilab Tevatron Collider for neutral Higgs bosons produced in association with b quarks is reported. The data, corresponding to 2.6 fb^-1 of integrated luminosity at CDF and 5.2 fb^-1 at D0, were collected in final states containing three or more b jets. Upper limits, varying between 44 pb and 0.7 pb, are set on the cross section multiplied by the branching ratio in the Higgs boson mass range 90 to 300 GeV, assuming production of a narrow scalar boson. Significant enhancements to the production of Higgs bosons can be found in theories beyond the standard model, for example in supersymmetry. The results are interpreted as upper limits in the parameter space of the minimal supersymmetric standard model in a benchmark scenario favoring this decay mode.
    Comment: 10 pages, 2 figures.

    Shrinking a large dataset to identify variables associated with increased risk of Plasmodium falciparum infection in Western Kenya

    Large datasets are often not amenable to analysis using traditional single-step approaches. Here, our general objective was to apply imputation techniques, principal component analysis (PCA), elastic net regression, and generalized linear models to a large dataset in a systematic approach to extract the most meaningful predictors for a health outcome. We extracted predictors of Plasmodium falciparum infection from a large covariate dataset with a limited number of observations, using data from the People, Animals, and their Zoonoses (PAZ) project to demonstrate these techniques: data collected from 415 homesteads in western Kenya contained over 1500 variables describing the health, environment, and social factors of the humans, the livestock, and the homesteads in which they reside. The wide, sparse dataset was simplified to 42 predictors of P. falciparum malaria infection, and wealth rankings were produced for all homesteads. The 42 predictors make biological sense and are supported by previous studies. This systematic data-mining approach would make many large datasets more manageable and informative for decision-making processes and health policy prioritization.
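
    The first stages of a pipeline like the one described (fill missing values, then scale before PCA and elastic net) can be sketched in plain Python. The data values here are made up; a real analysis would typically use a statistics package such as scikit-learn for the full imputation/PCA/elastic-net chain:

```python
import math

def impute_and_standardize(rows):
    """Column-wise mean imputation (None -> observed column mean) followed
    by z-scoring, mirroring the pre-processing steps that precede PCA and
    elastic net in the workflow described above."""
    n_cols = len(rows[0])
    out = [list(r) for r in rows]
    for j in range(n_cols):
        observed = [r[j] for r in rows if r[j] is not None]
        col_mean = sum(observed) / len(observed)
        for r in out:
            if r[j] is None:
                r[j] = col_mean          # impute missing entry
        col = [r[j] for r in out]
        mu = sum(col) / len(col)
        sd = math.sqrt(sum((v - mu) ** 2 for v in col) / len(col))
        for r in out:
            r[j] = (r[j] - mu) / sd if sd else 0.0   # z-score
    return out

# Toy sparse table: three observations, two covariates, two gaps.
scaled = impute_and_standardize([[1.0, None], [3.0, 4.0], [None, 8.0]])
```

    Mean imputation is the simplest choice; the paper's approach uses more careful imputation, but the shape of the step (fill, then scale, then reduce and select) is the same.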