
    STV-based Video Feature Processing for Action Recognition

    In comparison to still image-based processing, video features can provide rich and intuitive information about dynamic events occurring over a period of time, such as human actions, crowd behaviours, and other changes in subject patterns. Although substantial progress has been made in image processing over the last decade, with successful applications in face matching and object recognition, video-based event detection remains one of the most difficult challenges in computer vision research due to its complex continuous or discrete input signals, arbitrary dynamic feature definitions, and often ambiguous analytical methods. In this paper, a Spatio-Temporal Volume (STV) and region intersection (RI) based 3D shape-matching method is proposed to facilitate the definition and recognition of human actions recorded in videos. The distinctive characteristics and the performance gain of the devised approach stem from a coefficient factor-boosted 3D region intersection and matching mechanism developed in this research. This paper also reports an investigation into techniques for efficient STV data filtering to reduce the number of voxels (volumetric pixels) that must be processed in each operational cycle of the implemented system. The encouraging features and the improvements in operational performance registered in the experiments are discussed at the end.
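
    A minimal sketch of the kind of region-intersection matching the paper describes, assuming boolean voxel volumes and a single stand-in boosting coefficient (the abstract does not specify its form); this is illustrative Python, not the authors' implementation:

        # Toy STV region-intersection matching on boolean (x, y, t) volumes.
        import numpy as np

        def region_intersection_score(stv_a, stv_b, coeff=1.0):
            # weighted overlap of two equal-shape boolean STVs; `coeff`
            # stands in for the paper's boosting coefficient (assumption)
            inter = np.logical_and(stv_a, stv_b).sum()
            union = np.logical_or(stv_a, stv_b).sum()
            return coeff * inter / union if union else 0.0

        def filter_voxels(stv, motion_mask):
            # voxel filtering: keep only voxels inside a coarse motion mask,
            # reducing the number of voxels processed per cycle
            return np.logical_and(stv, motion_mask)

        rng = np.random.default_rng(0)
        template = rng.random((32, 32, 16)) > 0.7   # stored action volume
        query = rng.random((32, 32, 16)) > 0.7      # volume from a new video
        print(region_intersection_score(template, query))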

    On Statistical Aspects of Qjets

    The process by which jet algorithms construct jets and subjets is inherently ambiguous, and equally well-motivated algorithms often return very different answers. The Qjets procedure was introduced by the authors to account for this ambiguity by considering many reconstructions of a jet at once, allowing one to assign a weight to each interpretation of the jet. Employing these weighted interpretations leads to an improvement in the statistical stability of many measurements. Here we explore in detail the statistical properties of these sets of weighted measurements and demonstrate how they can be used to improve the reach of jet-based studies. Comment: 29 pages, 6 figures. References added, minor modification of the text. This version to appear in JHEP.
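
    As a rough illustration of the weighted-measurement idea, assuming each reclustering of a jet returns an observable value with an associated weight (the values and weight form below are made up):

        # Qjets-style weighted statistics over many interpretations of one jet.
        import numpy as np

        def weighted_stats(values, weights):
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()                    # normalise interpretation weights
            mean = np.sum(w * values)
            var = np.sum(w * (values - mean) ** 2)
            return mean, var

        rng = np.random.default_rng(1)
        masses = rng.normal(80.0, 5.0, size=50)   # hypothetical jet-mass values (GeV)
        weights = np.exp(-0.1 * np.abs(masses - masses.mean()))  # made-up weights
        print(weighted_stats(masses, weights))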

    Jet Substructure at the Tevatron and LHC: New results, new tools, new benchmarks

    In this report we review recent theoretical progress and the latest experimental results in jet substructure from the Tevatron and the LHC. We review the status of and outlook for calculation and simulation tools for studying jet substructure. Following up on the report of the Boost 2010 workshop, we present a new set of benchmark comparisons of substructure techniques, focusing on the set of variables and grooming methods that are collectively known as "top taggers". To facilitate further exploration, we have attempted to collect, harmonise, and publish software implementations of these techniques. Comment: 53 pages, 17 figures. L. Asquith, S. Rappoccio, C. K. Vermilion, editors; v2: minor edits from journal revision.

    Structure of Fat Jets at the Tevatron and Beyond

    Boosted resonances are a highly probable and exciting scenario in any process probing the electroweak scale. When such objects decay into jets, they can easily blend into the cornucopia of jets from hard, relatively light QCD states. We review jet observables and algorithms that can contribute to the identification of highly boosted heavy jets, and the searches that can make use of such substructure information. We also review previous studies by CDF of boosted jets and its measurements of specific jet shapes. Comment: invited review for a special "Top and flavour physics in the LHC era" issue of The European Physical Journal C; we invite comments regarding the contents of the review; v2: added references and institutional preprint number.

    Blending bias impacts the host halo masses derived from a cross-correlation analysis of bright sub-millimetre galaxies

    Placing bright sub-millimetre galaxies (SMGs) within the broader context of galaxy formation and evolution requires accurate measurements of their clustering, which can constrain the masses of their host dark matter halos. Recent work has shown that the clustering measurements of these galaxies may be affected by a 'blending bias', which results in the angular correlation function of the sources extracted from single-dish imaging surveys being boosted relative to that of the underlying galaxies. This is due to confusion introduced by the coarse angular resolution of the single-dish telescope and could lead to the inferred halo masses being significantly overestimated. We investigate the extent to which this bias affects the measurement of the correlation function of SMGs when it is derived via a cross-correlation with a more abundant galaxy population. We find that the blending bias is essentially the same as in the auto-correlation case and conclude that the best way to reduce its effects is to calculate the angular correlation function using SMGs in narrow redshift bins. Blending bias causes the inferred host halo masses of the SMGs to be overestimated by a factor of ~6 when a redshift interval of δz = 3 is used. However, this reduces to a factor of ~2 for δz = 0.5. The broadening of photometric redshift probability distributions with increasing redshift can therefore impart a mild halo 'downsizing' effect onto the inferred host halo masses, though this trend is not as strong as that seen in recent observational studies. Comment: 10 pages, 9 figures, 1 table. Accepted to MNRAS.
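
    For concreteness, a simple angular cross-correlation estimator of the kind such an analysis builds on, w(theta) = D1D2/D1R2 - 1 between the sparse SMG sample and a dense tracer sample (the paper's lightcone simulation and full estimator are not reproduced; the coordinates and sample sizes below are made up):

        # Brute-force angular cross-correlation, w(theta) = D1D2/D1R2 - 1.
        import numpy as np

        def pair_counts(ra1, dec1, ra2, dec2, bins):
            d2r = np.pi / 180.0
            cos_sep = (np.sin(dec1[:, None] * d2r) * np.sin(dec2 * d2r)
                       + np.cos(dec1[:, None] * d2r) * np.cos(dec2 * d2r)
                       * np.cos((ra1[:, None] - ra2) * d2r))
            sep = np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))
            return np.histogram(sep.ravel(), bins=bins)[0].astype(float)

        def cross_w_theta(smg, tracer, randoms, bins):
            dd = pair_counts(*smg, *tracer, bins)
            dr = pair_counts(*smg, *randoms, bins)
            dr *= len(tracer[0]) / len(randoms[0])   # rescale random counts
            return dd / dr - 1.0

        rng = np.random.default_rng(2)
        smg = (rng.uniform(0, 1, 200), rng.uniform(0, 1, 200))        # RA, Dec (deg)
        tracer = (rng.uniform(0, 1, 2000), rng.uniform(0, 1, 2000))
        randoms = (rng.uniform(0, 1, 20000), rng.uniform(0, 1, 20000))
        print(cross_w_theta(smg, tracer, randoms, np.logspace(-2, 0, 8)))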

    Resolving Boosted Jets with XCone

    We show how the recently proposed XCone jet algorithm smoothly interpolates between resolved and boosted kinematics. When using standard jet algorithms to reconstruct the decays of hadronic resonances like top quarks and Higgs bosons, one typically needs separate analysis strategies to handle the resolved regime of well-separated jets and the boosted regime of fat jets with substructure. XCone, by contrast, is an exclusive cone jet algorithm that always returns a fixed number of jets, so jet regions remain resolved even when (sub)jets are overlapping in the boosted regime. In this paper, we perform three LHC case studies (dijet resonances, Higgs decays to bottom quarks, and all-hadronic top pairs) that demonstrate the physics applications of XCone over a wide kinematic range. Comment: 36 pages, 25 figures, 1 table; v2: references added; v3: discussion added and new appendix B to match JHEP version.
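
    A toy analogue of the fixed-N behaviour, assuming XCone's N-jettiness minimisation can be caricatured as pT-weighted k-means on the (rapidity, phi) plane; the real algorithm uses a specific jet measure and cone radius R, and handles phi periodicity, all of which are simplified away here:

        # Exclusive fixed-N clustering sketch: always returns n_jets axes.
        import numpy as np

        def toy_xcone(y, phi, pt, n_jets, n_iter=50, seed=0):
            pts = np.column_stack([y, phi])
            rng = np.random.default_rng(seed)
            axes = pts[rng.choice(len(pts), n_jets, replace=False)]
            for _ in range(n_iter):
                d = ((pts[:, None, :] - axes[None, :, :]) ** 2).sum(axis=2)
                label = d.argmin(axis=1)      # assign particles to nearest axis
                for j in range(n_jets):       # pT-weighted axis update
                    sel = label == j
                    if sel.any():
                        axes[j] = np.average(pts[sel], axis=0, weights=pt[sel])
            return axes, label

        rng = np.random.default_rng(4)
        y, phi = rng.normal(size=100), rng.normal(size=100)
        pt = rng.exponential(10.0, 100)
        axes, label = toy_xcone(y, phi, pt, n_jets=4)
        print(axes)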

    ArborZ: Photometric Redshifts Using Boosted Decision Trees

    Precision photometric redshifts will be essential for extracting cosmological parameters from the next generation of wide-area imaging surveys. In this paper we introduce a photometric redshift algorithm, ArborZ, based on the machine-learning technique of Boosted Decision Trees. We study the algorithm using galaxies from the Sloan Digital Sky Survey and from mock catalogs intended to simulate both the SDSS and the upcoming Dark Energy Survey. We show that it improves upon the performance of existing algorithms. Moreover, the method naturally leads to the reconstruction of a full probability density function (PDF) for the photometric redshift of each galaxy, not merely a single "best estimate" and error, and also provides a photo-z quality figure-of-merit for each galaxy that can be used to reject outliers. We show that the stacked PDFs yield a more accurate reconstruction of the redshift distribution N(z). We discuss limitations of the current algorithm and ideas for future work. Comment: 10 pages, 13 figures, submitted to ApJ.
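
    One common way to get a full photo-z PDF out of boosted trees, in the spirit described here, is to bin the training redshifts and read a boosted classifier's class probabilities as a discretised PDF; the sketch below uses scikit-learn and mock colours, not the authors' ArborZ code:

        # BDT photo-z sketch: class probabilities over redshift bins as PDFs.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier

        rng = np.random.default_rng(3)
        z = rng.uniform(0.0, 1.2, 3000)                      # mock redshifts
        colours = np.column_stack([z + rng.normal(0, 0.1, 3000)
                                   for _ in range(4)])       # mock colours
        bins = np.linspace(0.0, 1.2, 13)
        labels = np.digitize(z, bins[1:-1])                  # redshift-bin class

        clf = GradientBoostingClassifier(n_estimators=100, max_depth=3)
        clf.fit(colours[:2000], labels[:2000])

        pdfs = clf.predict_proba(colours[2000:])             # one PDF per galaxy
        n_of_z = pdfs.sum(axis=0)                            # stacked PDFs ~ N(z)
        print(n_of_z.round(1))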

    Convolved Substructure: Analytically Decorrelating Jet Substructure Observables

    A number of recent applications of jet substructure, in particular searches for light new particles, require substructure observables that are decorrelated with the jet mass. In this paper we introduce the Convolved SubStructure (CSS) approach, which uses a theoretical understanding of the observable to decorrelate the complete shape of its distribution. This decorrelation is performed by convolution with a shape function whose parameters and mass dependence are derived analytically. We consider in detail the case of the D_2 observable and perform an illustrative case study using a search for a light hadronically decaying Z'. We find that the CSS approach completely decorrelates the D_2 observable over a wide range of masses. Our approach highlights the importance of improving the theoretical understanding of jet substructure observables to exploit increasingly subtle features for performance. Comment: 20 pages, 11 figures. v2: Corrected typo in legend in Figure 5; updated Figure 11; minor modification to conclusions on discrimination power. v3: Updated to published version; minor typos corrected.
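
    The core operation is a convolution of a baseline distribution with a mass-dependent shape function; a numerical sketch, where the shape-function form and its mass scaling are placeholders rather than the paper's analytic results:

        # CSS-style smearing: convolve a toy D_2 spectrum with a shape function.
        import numpy as np

        def shape_function(x, omega):
            f = (x / omega ** 2) * np.exp(-x / omega)   # one-parameter ansatz
            return f / f.sum()                          # normalise on the grid

        d2 = np.linspace(0.0, 5.0, 500)
        baseline = np.exp(-((d2 - 1.0) ** 2) / 0.3)     # toy D_2 distribution
        baseline /= baseline.sum()

        for mass in (50.0, 100.0, 200.0):               # jet masses in GeV
            omega = 20.0 / mass                         # assumed mass scaling
            smeared = np.convolve(baseline, shape_function(d2, omega))[: len(d2)]
            print(mass, d2[smeared.argmax()])           # peak of smeared spectrum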

    Luminosity Functions of Lyman Alpha Emitting Galaxies and Cosmic Reionization of Hydrogen

    Recent observations imply that the observed number counts of Lya Emitters (LAEs) evolved significantly between z=5.7 and z=6.5. It has been suggested that this was due to a rapid evolution in the ionisation state, and hence the transmission, of the IGM, which caused Lya flux from z=6.5 galaxies to be more strongly suppressed. In this paper we consider the joint evolution of the Lya and UV luminosity functions (LFs) and show that the IGM transmission evolved between z=6.5 and z=5.7 by a factor 1.1 < R < 1.8 (95% CL). This result is insensitive to the underlying model of the Lya LF (as well as to cosmic variance). Using a model for IGM transmission, we find that the evolution of the mean IGM density through cosmic expansion alone may result in a value for the ratio of transmissions as high as R = 1.3. Thus, the existing LFs do not provide evidence for overlap. Furthermore, the constraint R < 1.8 suggests that the Universe at z=6.5 was more than half ionised by volume, i.e. x_i,V > 0.5. Comment: MNRAS in press. Constraints from the rest-frame UV LF added. Discussion added on cosmic variance. Lower limit on x_i,V lowered to 0.5 (from 0.8).
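
    A worked toy of the transmission argument: if the IGM transmits a fraction T of the Lya flux, the observed LF is the intrinsic one shifted in luminosity, phi_obs(L) = phi(L/T)/T, so the ratio R = T(z=5.7)/T(z=6.5) maps directly onto a drop in number counts above a survey limit. The Schechter parameters below are illustrative, not the paper's fits:

        # Count suppression above a flux limit for a given transmission ratio R.
        import numpy as np

        def schechter(L, phi_star=1e-3, L_star=1e43, alpha=-1.5):
            x = L / L_star
            return (phi_star / L_star) * x ** alpha * np.exp(-x)

        L = np.logspace(42, 45, 400)      # erg/s, illustrative luminosity range

        def counts(T):
            # observed number density above the limit for transmission T
            f = schechter(L / T) / T
            return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(L))

        for R in (1.1, 1.4, 1.8):
            print(f"R = {R:.1f}: z=6.5 counts drop to {counts(1/R)/counts(1.0):.2f}")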

    LEAD Program Evaluation: Recidivism Report

    The LEAD program was established in 2011 as a means of diverting those suspected of low-level drug and prostitution criminal activity to case management and other supportive services instead of jail and prosecution. The primary aim of the LEAD program is to reduce criminal recidivism. Secondary aims include reductions in criminal justice service utilization and associated costs, as well as improvements in psychosocial, housing, and quality-of-life outcomes. Because LEAD is the first known pre-booking diversion program of its kind in the United States, an evaluation is critically needed to inform key stakeholders, policy makers, and other interested parties of its impact. The evaluation of the LEAD program described in this report represents a response to this need.
    Background: This report was written by the University of Washington LEAD Evaluation Team at the request of the LEAD Policy Coordinating Group and fulfills the first of three LEAD evaluation aims.
    Purpose: This report describes findings from a quantitative analysis comparing outcomes for LEAD participants versus "system-as-usual" control participants on shorter- and longer-term changes in recidivism outcomes, including arrests (i.e., being taken into custody by legal authority) and criminal charges (i.e., the filing of a criminal case in court). Arrests and criminal charges were chosen as the recidivism outcomes because they likely reflect individual behavior more than convictions, which are more heavily influenced by criminal justice system variables external to the individual.
    Findings: Analyses indicated statistically significant improvements in recidivism for the LEAD group compared to the control group on some shorter- and longer-term outcomes.