    Compositional Uncertainty in Models of Alignment

    Event generation with SHERPA 1.1

    In this paper the current release of the Monte Carlo event generator Sherpa, version 1.1, is presented. Sherpa is a general-purpose tool for the simulation of particle collisions at high-energy colliders. It contains a very flexible tree-level matrix-element generator for the calculation of hard scattering processes within the Standard Model and various new physics models. The emission of additional QCD partons off the initial and final states is described through a parton-shower model. To consistently combine multi-parton matrix elements with the QCD parton cascades, the approach of Catani, Krauss, Kuhn and Webber is employed. A simple model of multiple interactions is used to account for underlying events in hadron-hadron collisions. The fragmentation of partons into primary hadrons is described using a phenomenological cluster-hadronisation model. A comprehensive library for simulating tau-lepton and hadron decays is provided. Where available, form-factor models and matrix elements are used, allowing for the inclusion of spin correlations; effects of virtual and real QED corrections are included using the approach of Yennie, Frautschi and Suura. Comment: 47 pages, 21 figures
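
    The merging step mentioned above (the Catani-Krauss-Kuhn-Webber, or CKKW, approach) can be summarised as follows: each multi-parton matrix-element configuration is clustered back to a core process, the strong coupling is re-evaluated at the clustering scales, Sudakov form factors supply the probability of no further hard emission, and the parton shower is restricted to emissions below the merging scale. The Python sketch below illustrates only the reweighting part of this idea; all names (ckkw_weight, alpha_s, sudakov) are placeholders of ours, not Sherpa's API, and the per-line Sudakov bookkeeping of the full procedure is deliberately simplified.

        # Schematic CKKW-style reweighting of one matrix-element configuration.
        # alpha_s and sudakov stand in for the generator's internal routines.
        def ckkw_weight(clustering_scales, merging_scale, mu_me, alpha_s, sudakov):
            """Correction factor built from the kT scales found when clustering the event.

            clustering_scales : kT scales of the clustering nodes (hardest first)
            merging_scale     : Q_cut separating matrix-element and shower emissions
            mu_me             : fixed scale used for alpha_s in the matrix element
            """
            weight = 1.0
            for q in clustering_scales:
                # re-evaluate the strong coupling at the local clustering scale
                weight *= alpha_s(q) / alpha_s(mu_me)
                # no-emission probability between q and the merging scale
                # (a simplified stand-in for the per-line Sudakov ratios)
                weight *= sudakov(q, merging_scale)
            return weight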

    Variational inference for latent variables and uncertain inputs in Gaussian processes

    The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dimensionality reduction that has been widely applied. However, the current approach for training GP-LVMs is based on maximum likelihood, where the latent projection variables are maximised over rather than integrated out. In this paper we present a Bayesian method for training GP-LVMs by introducing a non-standard variational inference framework that allows us to approximately integrate out the latent variables and subsequently train a GP-LVM by maximising an analytic lower bound on the exact marginal likelihood. We apply this method for learning a GP-LVM from i.i.d. observations and for learning non-linear dynamical systems where the observations are temporally correlated. We show that a benefit of the variational Bayesian procedure is its robustness to overfitting and its ability to automatically select the dimensionality of the non-linear latent space. The resulting framework is generic, flexible and easy to extend for other purposes, such as Gaussian process regression with uncertain or partially missing inputs. We demonstrate our method on synthetic data and standard machine learning benchmarks, as well as challenging real-world datasets, including high-resolution video data. This research was partially funded by the European research project EU FP7-ICT (Project Ref 612139 "WYSIWYD"), the Greek State Scholarships Foundation (IKY) and the University of Sheffield Moody endowment fund. We also thank Colin Litster and "Fit Fur Life" for allowing us to use their video files as datasets.
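
    Variational training of this kind is available in open-source Gaussian process libraries; the minimal sketch below assumes GPy's BayesianGPLVM interface and an RBF kernel with ARD (automatic relevance determination) lengthscales, which is one way to expose the automatic selection of latent dimensionality that the abstract refers to. The interface details are our assumption, not taken from the paper.

        # Minimal sketch: fit a Bayesian GP-LVM and inspect the ARD lengthscales.
        import numpy as np
        import GPy

        Y = np.random.randn(100, 12)        # stand-in for an (N x D) observation matrix
        Q = 5                               # dimensionality of the latent space

        kernel = GPy.kern.RBF(Q, ARD=True)  # one lengthscale per latent dimension
        m = GPy.models.BayesianGPLVM(Y, input_dim=Q, kernel=kernel, num_inducing=20)
        m.optimize(messages=False)          # maximise the variational lower bound

        # Latent dimensions with very large lengthscales are effectively switched off,
        # which is how the model "selects" the dimensionality of the latent space.
        print(kernel.lengthscale)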

    Data driving the top quark forward--backward asymmetry with a lepton-based handle

    We propose that, within the standard model, the correlation between the $t\bar{t}$ forward--backward asymmetry $A_{t\bar{t}}$ and the corresponding lepton-based asymmetry $A_l$ -- at the differential level -- is strong and rather clean both theoretically and experimentally. Hence a combined measurement of the two distributions as a function of the lepton $p_T$, a direct and experimentally clean observable, would lead to a potentially unbiased and normalization-free test of the standard model prediction. To check the robustness of our proposal we study how the correlation is affected by mis-measurement of the $t\bar{t}$ system transverse momenta, acceptance cuts and scale dependence, and we compare the results of MCFM, POWHEG (with & without PYTHIA showering), and SHERPA's CSSHOWER in first-emission mode. We find that the shape of the relative differential distribution $A_{l}(p^{l}_{T})$ [$A_{t\bar{t}}(p^{l}_{T})$] is only moderately distorted, hence supporting the usefulness of our proposal. Beyond the first emission, we find that the correlation is not accurately captured by the lowest-order treatment. We also briefly consider other differential variables such as the system transverse mass and the canonical $t\bar{t}$ invariant mass. Finally, we study new physics scenarios where the correlation is significantly distorted and can therefore be more readily constrained or discovered using our method. Comment: 27 pages, 12 figures
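
    For reference, the two asymmetries discussed above are conventionally defined as counting asymmetries; written differentially in the lepton transverse momentum (the notation below is ours, not the paper's), they read:

        A_{t\bar{t}}(p_T^l) = \frac{N(\Delta y > 0;\, p_T^l) - N(\Delta y < 0;\, p_T^l)}
                                   {N(\Delta y > 0;\, p_T^l) + N(\Delta y < 0;\, p_T^l)},
        \qquad \Delta y = y_t - y_{\bar{t}},

        A_{l}(p_T^l) = \frac{N(q_l \eta_l > 0;\, p_T^l) - N(q_l \eta_l < 0;\, p_T^l)}
                            {N(q_l \eta_l > 0;\, p_T^l) + N(q_l \eta_l < 0;\, p_T^l)},

    where $q_l$ and $\eta_l$ are the charge and pseudorapidity of the lepton and $y_t$, $y_{\bar{t}}$ are the top and antitop rapidities.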

    Incremental volume rendering using hierarchical compression

    The research is based on the thesis that efficient volume rendering of datasets held on the Internet can be achieved on average personal workstations. We present a new algorithm for efficient incremental rendering of volumetric datasets. The primary goal of this algorithm is to give average workstations the ability to efficiently render volume data received over relatively low-bandwidth network links in such a way that rapid user feedback is maintained. Common limitations of workstation rendering of volume data include large memory overheads and the need for expensive rendering hardware and high-speed processing. The rendering algorithm presented here overcomes these problems by making use of the efficient Shear-Warp Factorisation method, which does not require specialised graphics hardware. However, the original Shear-Warp algorithm suffers from a high memory overhead and does not provide for incremental rendering, which is required if rapid user feedback is to be maintained. Our algorithm represents the volumetric data using a hierarchical data structure which provides for the incremental classification and rendering of volume data, exploiting the multiscale nature of the octree data structure. The algorithm reduces the memory footprint of the original Shear-Warp Factorisation algorithm by a factor of more than two, while maintaining good rendering performance. These factors make our octree algorithm more suitable for implementation on average desktop workstations for the purposes of interactive exploration of volume models over a network. This dissertation covers the theory and practice of developing the octree-based Shear-Warp algorithms, and then presents the results of extensive empirical testing. The results, using typical volume datasets, demonstrate the ability of the algorithm to achieve high rendering rates for both incremental and standard rendering while reducing the runtime memory requirements.
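
    The hierarchical representation described above can be illustrated with a small octree sketch. The code below is ours and only schematic (the dissertation's actual data structure and classification pipeline are more involved): each node caches the mean of its block, so a renderer can composite a coarse approximation immediately and refine it incrementally as deeper levels of the tree arrive over the network.

        # Illustrative octree over a cubic volume, supporting coarse-to-fine refinement.
        import numpy as np

        class OctreeNode:
            def __init__(self, volume, origin, size):
                self.origin, self.size = origin, size
                x, y, z = origin
                block = volume[x:x+size, y:y+size, z:z+size]
                self.value = float(block.mean())   # coarse proxy used before refinement
                self.children = None

            def refine(self, volume):
                """Split this node into eight children (one extra level of detail)."""
                if self.size == 1 or self.children is not None:
                    return
                half = self.size // 2
                x, y, z = self.origin
                self.children = [OctreeNode(volume, (x + dx, y + dy, z + dz), half)
                                 for dx in (0, half) for dy in (0, half) for dz in (0, half)]

            def leaves(self):
                """Yield the current leaf nodes; a renderer composites these."""
                if self.children is None:
                    yield self
                else:
                    for child in self.children:
                        yield from child.leaves()

        # Coarse-to-fine usage: render root.leaves() straight away, then refine()
        # selected nodes as more data arrives and re-render for a sharper image.
        volume = np.random.rand(64, 64, 64).astype(np.float32)
        root = OctreeNode(volume, (0, 0, 0), 64)
        root.refine(volume)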