
    Federated Heterogeneous Compute and Storage Infrastructure for the PUNCH4NFDI Consortium

    PUNCH4NFDI, funded by the German Research Foundation initially for five years, is a diverse consortium of particle, astro-, astroparticle, hadron and nuclear physics embedded in the National Research Data Infrastructure initiative. In order to provide seamless and federated access to the huge variety of compute and storage systems provided by the participating communities, covering their very diverse needs, the Compute4PUNCH and Storage4PUNCH concepts have been developed. Both concepts comprise state-of-the-art technologies such as a token-based AAI for standardized access to compute and storage resources. The community-supplied heterogeneous HPC, HTC and Cloud compute resources are dynamically and transparently integrated into one federated HTCondor-based overlay batch system using the COBalD/TARDIS resource meta-scheduler. Traditional login nodes and a JupyterHub provide entry points into the entire landscape of available compute resources, while container technologies and the CERN Virtual Machine File System (CVMFS) ensure scalable provisioning of community-specific software environments. In Storage4PUNCH, community-supplied storage systems, mainly based on dCache or XRootD technology, are being federated in a common infrastructure employing methods that are well established in the wider HEP community. Furthermore, existing technologies for caching as well as metadata handling are being evaluated with the aim of deeper integration. The combined Compute4PUNCH and Storage4PUNCH environment will allow a large variety of researchers to carry out resource-demanding analysis tasks. In this contribution we present the Compute4PUNCH and Storage4PUNCH concepts, the current status of the developments, as well as first experiences with scientific applications being executed on the available prototypes.
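
    As a concrete illustration of how a user job might enter such a federated HTCondor overlay pool, the sketch below uses the HTCondor Python bindings to submit a containerized job. It is a minimal sketch only: the executable name, CVMFS image path and resource requests are hypothetical placeholders, not the actual Compute4PUNCH configuration.

    # Minimal sketch: submit a containerized job to an HTCondor overlay pool.
    # All concrete values (script name, CVMFS image path, resource requests)
    # are hypothetical placeholders, not the real Compute4PUNCH setup.
    import htcondor

    submit = htcondor.Submit({
        "executable": "analysis.sh",     # hypothetical user payload
        "universe": "vanilla",
        # Community software environment shipped as a container via CVMFS
        # (hypothetical image path):
        "+SingularityImage": '"/cvmfs/unpacked.example.org/punch/analysis:latest"',
        "request_cpus": "4",
        "request_memory": "8GB",
        "output": "job.out",
        "error": "job.err",
        "log": "job.log",
    })

    schedd = htcondor.Schedd()      # e.g. reachable from a login node or JupyterHub
    result = schedd.submit(submit)  # queue the job in the overlay batch system
    print(f"submitted cluster {result.cluster()}")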

    Repurposing of the Run 2 CMS High Level Trigger Infrastructure as a Cloud Resource for Offline Computing

    The former CMS Run 2 High Level Trigger (HLT) farm is one of the largest contributors to CMS compute resources, providing about 25k job slots for offline computing. This CPU farm was initially employed as an opportunistic resource, exploited during inter-fill periods, in the LHC Run 2. Since then, it has become a nearly transparent extension of the CMS capacity at CERN, being located on-site at the LHC interaction point 5 (P5), where the CMS detector is installed. This resource has been configured to support the execution of critical CMS tasks, such as prompt detector data reconstruction. It can therefore be used in combination with the dedicated Tier 0 capacity at CERN in order to process and absorb peaks in the stream of data coming from the CMS detector. The initial configuration for this resource, based on statically configured VMs, provided the required level of functionality. However, regular operations of this cluster revealed certain limitations compared to the resource provisioning and use model employed at WLCG sites. A new configuration, based on a vacuum-like model, has been implemented for this resource in order to address the detected shortcomings. This paper reports on this redeployment work on the permanent cloud for enhanced support of CMS offline computing, comparing the functionalities of the former and new models, along with the commissioning effort for the new setup.
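
    For intuition on the vacuum-like model mentioned above: instead of an operator statically assigning VMs, the resource itself spawns workers that join the central pool when there is demand and evaporate when idle. The toy loop below is purely illustrative; the function names and scaling policy are hypothetical, not the CMS implementation.

    # Toy illustration of a vacuum-like provisioning loop: the resource decides
    # autonomously when to boot worker VMs that self-register with the central
    # batch pool and shut themselves down when they run out of work.
    # All names and the policy below are hypothetical, not the CMS code.
    import time

    def idle_jobs_in_pool() -> int:
        """Placeholder: would query the central pool for queued, idle jobs."""
        return 0

    def boot_worker_vm() -> None:
        """Placeholder: would start a VM image that joins the pool on boot."""
        pass

    def vacuum_step() -> None:
        if idle_jobs_in_pool() > 0:
            boot_worker_vm()        # workers later stop themselves when idle

    if __name__ == "__main__":
        while True:
            vacuum_step()
            time.sleep(60)          # re-evaluate demand once per minute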

    A historically controlled, single-arm, multi-centre, prospective trial to evaluate the safety and efficacy of MonoMax® suture material for abdominal wall closure after primary midline laparotomy. ISSAAC-Trial [NCT005725079]

    Background: Several randomized controlled trials have compared different suture materials and techniques for abdominal wall closure with respect to the incidence of incisional hernias after midline laparotomy and shown that it remains, irrespective of the methods used, considerably high, ranging from 9% to 20%. The development of improved suture materials which would reduce postoperative complications may help to lower its frequency.

    Design: This is a historically controlled, single-arm, multi-centre, prospective trial to evaluate the safety of MonoMax® suture material for abdominal wall closure in 150 patients with primary elective midline incisions. INSECT patients who underwent abdominal closure using Monoplus® and PDS® will serve as the historical control group. The incidences of wound infections and of burst abdomen are defined as composite primary endpoints. Secondary endpoints are the frequency of incisional hernias within one year after operation and safety. To ensure adequate comparability in surgical performance and recruitment, the 4 largest centres of the INSECT-Trial will participate. After hospital discharge, the investigators will examine the enrolled patients again at 30 days and at 12 ± 1 months after surgery.

    Conclusion: This historically controlled, single-arm, multi-centre, prospective ISSAAC trial aims to assess whether the use of an ultra-long-lasting absorbable monofilament suture material is safe and efficient.

    Trial registration: NCT005725079

    Searches at HERA for Squarks in R-Parity Violating Supersymmetry

    A search for squarks in R-parity violating supersymmetry is performed in e^+p collisions at HERA at a centre-of-mass energy of 300 GeV, using H1 data corresponding to an integrated luminosity of 37 pb^(-1). The direct production of single squarks of any generation in positron-quark fusion via a Yukawa coupling lambda' is considered, taking into account R-parity violating and conserving decays of the squarks. No significant deviation from the Standard Model expectation is found. The results are interpreted in terms of constraints within the Minimal Supersymmetric Standard Model (MSSM), the constrained MSSM and the minimal Supergravity model, and their sensitivity to the model parameters is studied in detail. For a Yukawa coupling of electromagnetic strength, squark masses below 260 GeV are excluded at the 95% confidence level in a large part of the parameter space. For a 100 times smaller coupling strength, masses up to 182 GeV are excluded.
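
    For orientation, two standard relations summarize the kinematics behind these limits: resonant s-channel squark production in positron-quark fusion fixes the squark mass through the parton momentum fraction x, and "a Yukawa coupling of electromagnetic strength" conventionally means lambda'^2 = 4 pi alpha_em. In LaTeX:

    % Resonant single-squark production in positron-quark fusion:
    e^{+} + q \;\to\; \tilde{q},
    \qquad M_{\tilde{q}} = \sqrt{x\,s},
    \qquad \sqrt{s} \approx 300~\mathrm{GeV}
    % "Coupling of electromagnetic strength" (standard convention):
    \lambda'^{2} = 4\pi\alpha_{\mathrm{em}}
    \;\Rightarrow\; \lambda' \approx 0.3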

    Measurements of Transverse Energy Flow in Deep-Inelastic Scattering at HERA

    Measurements of transverse energy flow are presented for neutral current deep-inelastic scattering events produced in positron-proton collisions at HERA. The kinematic range covers squared momentum transfers Q^2 from 3.2 to 2,200 GeV^2, the Bjorken scaling variable x from 8 × 10^{-5} to 0.11 and the hadronic mass W from 66 to 233 GeV. The transverse energy flow is measured in the hadronic centre-of-mass frame and is studied as a function of Q^2, x, W and pseudorapidity. A comparison is made with QCD-based models. The behaviour of the mean transverse energy in the central pseudorapidity region and in an interval corresponding to the photon fragmentation region is analysed as a function of Q^2 and W.
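
    The quoted ranges of Q^2, x and W are related rather than independent: in standard DIS kinematics the invariant hadronic mass follows directly from Q^2 and x, as the relation below (stated here for the reader's convenience) shows.

    % Invariant mass of the hadronic final state in DIS:
    W^{2} = m_{p}^{2} + Q^{2}\,\frac{1-x}{x} \;\approx\; Q^{2}\,\frac{1-x}{x}
    % e.g. Q^{2} = 3.2~\mathrm{GeV}^{2},\; x = 8\times10^{-5}
    % gives W \approx 200~\mathrm{GeV}, inside the quoted 66--233 GeV range.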

    The Warburg Effect Suppresses Oxidative Stress Induced Apoptosis in a Yeast Model for Cancer

    BACKGROUND: Otto Warburg observed that cancer cells are often characterized by intense glycolysis in the presence of oxygen and a concomitant decrease in mitochondrial respiration. Research has mainly focused on a possible connection between increased glycolysis and tumor development, whereas decreased respiration has largely been left unattended. Therefore, a causal relation between decreased respiration and tumorigenesis has not been demonstrated. METHODOLOGY/PRINCIPAL FINDINGS: For this purpose, colonies of Saccharomyces cerevisiae, which is suitable for manipulation of mitochondrial respiration and shows mitochondria-mediated cell death, were used as a model. Repression of respiration as well as ROS scavenging via glutathione inhibited apoptosis and conferred a survival advantage during seeding and early development of this fast-proliferating solid cell population. In contrast, enhancement of respiration triggered cell death. CONCLUSION/SIGNIFICANCE: Thus, the Warburg effect might directly contribute to the initiation of cancer formation, not only through enhanced glycolysis but also via decreased respiration in the presence of oxygen, which suppresses apoptosis.

    Guidelines and Recommendations on Yeast Cell Death Nomenclature

    Elucidating the biology of yeast in its full complexity has major implications for science, medicine and industry. One of the most critical processes determining yeast life and physiology is cellular demise. However, the investigation of yeast cell death is a relatively young field, and a widely accepted set of concepts and terms is still missing. Here, we propose unified criteria for the definition of accidental, regulated, and programmed forms of cell death in yeast based on a series of morphological and biochemical criteria. Specifically, we provide consensus guidelines on the differential definition of terms including apoptosis, regulated necrosis, and autophagic cell death, and we refer to additional cell death routines that are relevant for the biology of (at least some species of) yeast. As this area of investigation advances rapidly, changes and extensions to this set of recommendations will be implemented in the years to come. Nonetheless, we strongly encourage the authors, reviewers and editors of scientific articles to adopt these collective standards in order to establish an accurate framework for yeast cell death research and, ultimately, to accelerate the progress of this vibrant field of research.

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.