
    Coordinated Caching for High Performance Calibration using Z -> µµ Events of the CMS Experiment

    Calibration of the detectors is a prerequisite for almost all physics analyses conducted at the LHC experiments. As such, both speed and precision are critical. As part of this thesis, a high-performance analysis infrastructure using coordinated caching has been developed. It has been used to perform the first calibration of jets using Z -> µµ events recorded during the second LHC run at the CMS experiment.
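
    A minimal sketch of the pT-balance idea commonly used for such a calibration, i.e. comparing the jet momentum to that of a well-measured recoiling Z -> µµ candidate; the event model, attribute names and threshold below are illustrative assumptions, not the selection used in the thesis:

        # Hedged sketch, not the thesis' implementation: estimate the jet response in
        # Z -> mumu + jet events via pT balance. Event/muon/jet attributes are assumed.
        import math

        def dimuon_pt(mu1, mu2):
            """Transverse momentum of the dimuon (Z candidate) system."""
            px = mu1.pt * math.cos(mu1.phi) + mu2.pt * math.cos(mu2.phi)
            py = mu1.pt * math.sin(mu1.phi) + mu2.pt * math.sin(mu2.phi)
            return math.hypot(px, py)

        def jet_responses(events, min_z_pt=30.0):
            """Per-event response R = pT(leading jet) / pT(Z); R far from 1 signals miscalibration."""
            responses = []
            for ev in events:
                z_pt = dimuon_pt(ev.muons[0], ev.muons[1])
                if z_pt < min_z_pt or not ev.jets:
                    continue
                responses.append(ev.jets[0].pt / z_pt)
            return responses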

    Analysis of Standard Model Higgs Boson Decays to Tau Pairs with the CMS Detector at the LHC

    The search for the SM Higgs boson decaying to tau leptons, performed on the complete CMS Run I data set, is presented with a particular focus on the analysis of the di-muon final state. The published CMS H->tautau analysis is summarised, and an outlook on future measurements in this channel is given by extrapolating the H->tautau analysis to the expected luminosity of the present 13 TeV data-taking period. Here, only the most sensitive channels mu-tau, e-tau and e-mu were used.
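
    For the statistics-dominated part of such a luminosity extrapolation, a common rule of thumb is that the expected significance grows roughly with the square root of the integrated luminosity; a worked one-liner under that assumption (the numbers are placeholders, not results from the thesis):

        # Rough extrapolation of a statistics-dominated expected significance with
        # luminosity; the inputs below are illustrative, not values from the analysis.
        def scaled_significance(sig_ref, lumi_ref, lumi_new):
            """Scale an expected significance as sqrt(L_new / L_ref)."""
            return sig_ref * (lumi_new / lumi_ref) ** 0.5

        # e.g. a hypothetical 3.0 sigma expectation at 20 fb^-1 extrapolated to 100 fb^-1:
        print(scaled_significance(3.0, 20.0, 100.0))   # ~6.7 sigma, neglecting systematics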

    Simulating the Mammalian Blastocyst - Molecular and Mechanical Interactions Pattern the Embryo

    Mammalian embryogenesis is a dynamic process involving gene expression and mechanical forces between proliferating cells. The exact nature of these interactions, which determine the lineage patterning of the trophectoderm and endoderm tissues occurring in a highly regulated manner at precise periods during embryonic development, is an area of debate. We have developed a computational modeling framework for studying this process, by which the combined effects of mechanical and genetic interactions are analyzed within the context of proliferating cells. At a purely mechanical level, we demonstrate that the perpendicular alignment of the animal-vegetal (a-v) and embryonic-abembryonic (eb-ab) axes is a result of minimizing the total elastic conformational energy of the entire collection of cells, which are constrained by the zona pellucida. The coupling of gene expression with the mechanics of cell movement is important for the formation of both the trophectoderm and the endoderm. In studying the formation of the trophectoderm, we contrast and compare quantitatively two hypotheses: (1) the position determines gene expression, and (2) the gene expression determines the position. Our model, which couples gene expression with mechanics, suggests that differential adhesion between different cell types is a critical determinant of robust endoderm formation. In addition to differential adhesion, two different testable hypotheses emerge when considering endoderm formation: (1) a directional force acts on certain cells and moves them to form the endoderm layer, which separates the blastocoel and the cells of the inner cell mass (ICM); in this case the blastocoel simply acts as a static boundary. (2) The blastocoel dynamically applies pressure upon the cells in contact with it, such that cell segregation in the presence of differential adhesion leads to endoderm formation. To our knowledge, this is the first attempt to combine cell-based spatial mechanical simulations with genetic networks to explain mammalian embryogenesis. Such a framework provides the means to test hypotheses in a controlled in silico environment.
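
    A toy sketch of the kind of elastic-energy bookkeeping such a cell-based model minimizes; the spherical-cell geometry, the quadratic overlap penalty and the zona pellucida confinement term are simplifying assumptions for illustration, not the framework's actual energy terms:

        # Toy elastic energy for a collection of spherical cells confined by a rigid
        # zona pellucida; all functional forms and constants are illustrative assumptions.
        import math

        def total_elastic_energy(cells, zona_radius, k_cell=1.0, k_zona=10.0):
            """Sum of pairwise overlap penalties plus a confinement penalty."""
            energy = 0.0
            for i, a in enumerate(cells):
                # confinement: penalize cells protruding beyond the zona pellucida
                overshoot = math.dist(a["pos"], (0.0, 0.0, 0.0)) + a["radius"] - zona_radius
                if overshoot > 0.0:
                    energy += 0.5 * k_zona * overshoot ** 2
                # pairwise elastic overlap between cells (quadratic, Hertz-like penalty)
                for b in cells[i + 1:]:
                    overlap = a["radius"] + b["radius"] - math.dist(a["pos"], b["pos"])
                    if overlap > 0.0:
                        energy += 0.5 * k_cell * overlap ** 2
            return energy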

    Jet Momentum Resolution for the CMS Experiment and Distributed Data Caching Strategies

    Accurately measured jets are mandatory for precision measurements of the Standard Model of particle physics as well as for searches for new physics. The increased instantaneous luminosity and center-of-mass energy at LHC Run 2 pose challenges for pileup mitigation and the measurement of jet characteristics. This thesis concentrates on using Z + jets events to calibrate the energy scale of jets recorded by the CMS detector in 2018. Furthermore, it proposes a new procedure for determining the jet momentum resolution using Z + jets events. This procedure is expected to allow cross-checking complementary measurement approaches and to increase the accuracy of the jet momentum resolution at the CMS experiment. Data-intensive end-user analyses in High Energy Physics, such as the presented calibration of jets, place enormous demands on the computing infrastructure since they require high data throughput. Besides the particle physics analysis, this thesis therefore also focuses on accelerating data processing within a distributed computing infrastructure via a coordinated distributed caching approach. Coordinated placement of critical data within distributed caches and matching of workflows to the most suitable host in terms of cached data allow the processing efficiency to be optimized. Improving the processing of data-intensive workflows aims at shortening turnaround cycles and thus at delivering physics results, e.g. the jet calibration results, faster.
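
    A minimal sketch of the matchmaking idea described above, i.e. scoring each host by how much of a workflow's input data it already caches and dispatching to the best match; the host and workflow structures are assumptions for illustration, not the thesis' actual scheduler:

        # Illustrative data-locality-aware matchmaking: prefer the host that already
        # caches the largest fraction of a workflow's input files. Structures are assumed.
        def cached_fraction(workflow_inputs, host_cache):
            """Fraction of the workflow's input files already present in the host cache."""
            if not workflow_inputs:
                return 0.0
            return len(set(workflow_inputs) & host_cache) / len(workflow_inputs)

        def best_host(workflow_inputs, hosts):
            """Pick the host with the highest cached fraction; ties broken by free slots."""
            return max(hosts, key=lambda h: (cached_fraction(workflow_inputs, h["cache"]),
                                             h["free_slots"]))

        # usage sketch
        hosts = [{"name": "wn01", "cache": {"a.root", "b.root"}, "free_slots": 4},
                 {"name": "wn02", "cache": {"c.root"}, "free_slots": 8}]
        print(best_host(["a.root", "b.root", "c.root"], hosts)["name"])   # -> wn01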

    Receding horizon control for oil reservoir waterflooding process

    Waterflooding is a recovery technique in which water is pumped into an oil reservoir to increase production. Changing reservoir states require different injection and production settings for optimal operation, which can be formulated as a dynamic optimization problem. This problem can be solved with optimal control techniques, which traditionally provide only an open-loop solution. However, such a solution is sensitive to uncertainties, which are inevitable in reservoirs. Direct feedback control has recently been proposed for optimal waterflooding operations with the aim of counteracting the effects of reservoir uncertainties. In this work, a feedback approach based on the principle of receding horizon control (RHC) was developed for waterflooding process optimization. Applying the RHC strategy to counteract the effect of uncertainties yielded gains ranging from 0.14% to 19.22% over the traditional open-loop approach. The gain increases as more uncertainties are introduced into the configuration. The losses incurred as a result of feedback are in the range of 0.25%–15.21%, compared to 0.39%–31.51% for the traditional open-loop control approach.
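
    A schematic of the generic receding-horizon loop underlying such an approach: re-optimize the injection/production settings over a finite horizon from the current state, apply only the first control move, then re-measure and re-plan. The optimizer, simulator and measurement functions below are hypothetical stand-ins, not the paper's implementation:

        # Generic receding horizon control loop; `optimize_horizon`, `apply_controls`
        # and `measure_state` are placeholders for the reservoir-specific parts.
        def receding_horizon_control(initial_state, n_steps, horizon,
                                     optimize_horizon, apply_controls, measure_state):
            """At each step, re-solve a finite-horizon problem and apply only the first move."""
            state = initial_state
            applied = []
            for _ in range(n_steps):
                # open-loop optimal plan over the next `horizon` steps from the current state
                plan = optimize_horizon(state, horizon)
                # feedback: commit only the first control, then re-measure and re-plan
                apply_controls(plan[0])
                applied.append(plan[0])
                state = measure_state()
            return applied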

    A data-driven method for Higgs boson analyses in di-τ final states for the LHC Run II and beyond

    The τ-embedding is a data-driven method for estimating the contribution of processes with two τ leptons in the event. The method uses an event-based approach in which two reconstructed muons are selected in the data and replaced by two simulated τ-lepton decays. The resulting event combines the simulated τ-lepton decays with an otherwise unchanged event. The τ-embedding leads to an improved description of the properties of jets and of pile-up collisions. It is the most important estimation method for backgrounds with two τ leptons in the final state within the CMS Collaboration and has been applied in numerous Higgs boson analyses in ττ final states in recent years. This thesis describes the latest implementation of the method. In a comprehensive analysis example, the method is compared with a model based on fully simulated processes. More than 8 million CPU hours were spent to produce τ-embedding results with the new implementation for the LHC Run II analyses. The presented studies lay the foundation for the use of τ-embedding in several planned Higgs boson analyses in ττ final states on the combined Run II and Run III data sets, which will represent one of the most important results of the LHC Phase-1 physics programme.
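
    A schematic of the embedding steps described above, as a hedged sketch: select a di-muon event in data, remove the muons, simulate two τ decays with the muons' kinematics, and merge them back into the otherwise unchanged event. All helper names and the event model are placeholders, not interfaces of the CMS software:

        # Schematic tau-embedding workflow; every helper below is a hypothetical
        # placeholder standing in for the corresponding step of the real method.
        def embed_event(data_event, select_dimuon, remove_muons, simulate_tau_decay, merge):
            """Build a hybrid event: real underlying event plus simulated tau decays."""
            muons = select_dimuon(data_event)                # two well-reconstructed muons, or None
            if muons is None:
                return None
            cleaned = remove_muons(data_event, muons)        # data event with the muon signals removed
            taus = [simulate_tau_decay(mu) for mu in muons]  # simulated taus inherit the muons' kinematics
            return merge(cleaned, taus)                      # simulated tau decays + otherwise unchanged event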

    Supporting group maintenance through prognostics-enhanced dynamic dependability prediction

    Condition-based maintenance strategies adapt maintenance planning through the integration of online condition monitoring of assets. The accuracy and cost-effectiveness of these strategies can be improved by integrating prognostics predictions and by grouping maintenance actions, respectively. In complex industrial systems, however, effective condition-based maintenance is intricate. Such systems are composed of repairable assets which can fail in different ways, with various effects, and which are typically governed by dynamics that include time-dependent and conditional events. In this context, system reliability prediction is complex, and effective maintenance planning is virtually impossible prior to system deployment and hard even in the case of condition-based maintenance. Addressing these issues, this paper presents an online system maintenance method that takes the system dynamics into account. The method employs an online predictive diagnosis algorithm to distinguish between critical and non-critical assets. A prognostics-updated method for predicting the system health is then employed to yield well-informed, more accurate, condition-based suggestions for the maintenance of critical assets and for the group-based reactive repair of non-critical assets. The cost-effectiveness of the approach is discussed in a case study from the power industry.
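
    A simplified sketch of the triage step described above: split assets into critical and non-critical, schedule individual condition-based maintenance for the former and grouped reactive repair for the latter. The remaining-useful-life threshold used as the criterion here, and the field names, are illustrative assumptions standing in for the paper's predictive diagnosis algorithm:

        # Illustrative triage of assets by a prognostics output; the RUL threshold is an
        # assumed stand-in for the paper's predictive diagnosis of criticality.
        def plan_maintenance(assets, rul_threshold_hours=500.0):
            """Return (condition-based actions for critical assets, grouped reactive repairs)."""
            critical = [a for a in assets if a["predicted_rul_hours"] < rul_threshold_hours]
            non_critical = [a for a in assets if a["predicted_rul_hours"] >= rul_threshold_hours]
            # critical assets get individually scheduled, condition-based maintenance
            condition_based = [{"asset": a["id"], "due_in_hours": a["predicted_rul_hours"]}
                               for a in sorted(critical, key=lambda a: a["predicted_rul_hours"])]
            # non-critical assets are repaired reactively, grouped to share downtime
            grouped_repair = [a["id"] for a in non_critical]
            return condition_based, grouped_repair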

    Data locality via coordinated caching for distributed processing
