Dynamic simulations in SixTrack
The DYNK module allows element settings in SixTrack to be changed on a
turn-by-turn basis. This document contains a technical description of the DYNK
module in SixTrack. It is mainly intended for a developer or advanced user who
wants to modify the DYNK module, for example by adding more functions that can
be used to calculate new element settings, or to add support for new elements
that can be used with DYNK.
Comment: Submission to CERN Yellow Report / conference proceedings, the 2015
collimation tracking code workshop.
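The idea of a DYNK-style function can be illustrated with a short sketch. This is not SixTrack's actual Fortran implementation; the function name, the settings table, and the ramp parameters are all hypothetical, chosen only to show how a per-turn function computes a new element setting that is applied before tracking each turn.

```python
def linear_ramp(turn, t0, t1, v0, v1):
    """Linearly ramp a setting from v0 to v1 between turns t0 and t1,
    holding v0 before t0 and v1 after t1 (a common DYNK-like function shape)."""
    if turn <= t0:
        return v0
    if turn >= t1:
        return v1
    return v0 + (v1 - v0) * (turn - t0) / (t1 - t0)

# Hypothetical element-settings table keyed by element name.
settings = {"kicker.1": 0.0}

for turn in range(1, 11):
    # Evaluate the function and apply it turn-by-turn, then track.
    settings["kicker.1"] = linear_ramp(turn, t0=2, t1=8, v0=0.0, v1=1.2)
    # ... track the beam for one turn with the updated setting ...
```

Adding a new function in this picture means adding another callable with the same signature; adding support for a new element means teaching the apply step how to write into that element's setting.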
90 m optics commissioning
http://accelconf.web.cern.ch/AccelConf/IPAC2011/papers/tupz001.pdf
Special β∗ = 90 m optics have been developed for the two very high luminosity insertions of the LHC [1] [2], as a first step to allow for very low angle precision measurements of the proton-proton collisions in the LHC. These optics were developed to be compatible with the standard LHC injection and ramp optics. The target value of β∗ = 90 m is reached by an un-squeeze from the injection β∗ = 11 m. We report on the implementation of this optics and the first experience gained in commissioning with beam during two machine studies.
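Why a large β∗ suits low-angle precision measurements can be seen from the standard optics relation for the RMS angular divergence at the interaction point, σ′ = √(ε/β∗): un-squeezing from β∗ = 11 m to 90 m shrinks the angular spread of the beam. The sketch below uses this textbook relation with an assumed, purely illustrative emittance value; it is not a calculation from the paper.

```python
import math

def divergence(emittance, beta_star):
    """RMS angular divergence at the IP: sigma' = sqrt(epsilon / beta*)."""
    return math.sqrt(emittance / beta_star)

eps = 5e-10  # illustrative geometric emittance in m*rad (assumed value)
print(divergence(eps, 11.0))  # injection-like optics
print(divergence(eps, 90.0))  # high-beta optics: smaller angular spread
```

A smaller beam divergence means scattering angles closer to the beam axis can be resolved, which is the point of the high-β∗ un-squeeze.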
A Large Hadron Electron Collider at CERN
This document provides a brief overview of the recently published report on
the design of the Large Hadron Electron Collider (LHeC), which comprises its
physics programme, accelerator physics, technology and main detector concepts.
The LHeC exploits and develops challenging, though principally existing,
accelerator and detector technologies. This summary is complemented by brief
illustrations of some of the highlights of the physics programme, which relies
on a vastly extended kinematic range, luminosity and unprecedented precision in
deep inelastic scattering. Illustrations are provided regarding high precision
QCD, new physics (Higgs, SUSY) and electron-ion physics. The LHeC is designed
to run synchronously with the LHC in the 2020s and to achieve an integrated
luminosity of O(100) fb⁻¹. It will become the cleanest high-resolution
microscope of mankind and will substantially extend as well as complement the
investigation of the physics of the TeV energy scale, which has been enabled by
the LHC.
Machine layout and performance
The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new
energy frontier for exploration in 2010, it has gathered a global user community of about 7,000 scientists working
in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain
and extend its discovery potential, the LHC will need a major upgrade in the 2020s. This will increase its luminosity
(rate of collisions) by a factor of five beyond the original design value and the integrated luminosity (total
collisions created) by a factor of ten. The LHC is already a highly complex and exquisitely optimised machine, so this
upgrade must be carefully conceived and will require about ten years to implement. The new configuration, known
as High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology
beyond its present limits. Among these are cutting-edge 11-12 tesla superconducting magnets, compact superconducting
cavities for beam rotation with ultra-precise phase control, new technology and physical processes
for beam collimation and 300 metre-long high-power superconducting links with negligible energy dissipation.
The present document describes the technologies and components that will be used to realise the project and is
intended to serve as the basis for the detailed engineering design of HL-LHC.
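The role of the "compact superconducting cavities for beam rotation" (crab cavities) can be illustrated with the standard geometric luminosity formula, which is textbook accelerator physics rather than a result from this document: with a crossing angle the luminosity is reduced by R = 1/√(1 + φ²), where φ = (θc/2)·σz/σx is the Piwinski angle. Crab cavities tilt the bunches so the collisions are effectively head-on, recovering this factor. The numbers below are assumed, illustrative values, not official HL-LHC parameters.

```python
import math

def reduction_factor(theta_c, sigma_z, sigma_x):
    """Geometric luminosity reduction factor for a full crossing angle
    theta_c (rad), bunch length sigma_z (m), and transverse size sigma_x (m)."""
    phi = (theta_c / 2.0) * sigma_z / sigma_x  # Piwinski angle
    return 1.0 / math.sqrt(1.0 + phi ** 2)

# Illustrative, assumed numbers: 500 urad crossing, 7.5 cm bunches, 10 um beams.
print(reduction_factor(theta_c=500e-6, sigma_z=0.075, sigma_x=10e-6))
```

With these assumed numbers the factor is well below one, which is why recovering it with crab-cavity rotation is worth the technological effort.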
Policing, crime and ‘big data’; towards a critique of the moral economy of stochastic governance
Process management in hospitals: an empirically grounded maturity model
In order to improve transparency and stabilise health care costs, several countries have decided to reform their healthcare systems on the basis of diagnosis-related groups (DRGs). DRGs are not only used for classifying medical treatments but also for case-based reimbursement; they therefore induce active competition among hospitals, forcing them to become more efficient and effective. In consequence, hospitals are investing considerably in process orientation and management. However, to date there is neither a consensus on what capabilities hospitals need to acquire to become process-oriented, nor a general agreement on the sequence of development stages they have to traverse. To this end, this study proposes an empirically grounded conceptualisation of process management capabilities and presents a staged capability maturity model algorithmically derived from empirical data from 129 acute somatic hospitals in Switzerland. The five capability maturity levels start with 'encouragement of process orientation' (level 1), 'case-by-case handling' (level 2), and 'defined processes' (level 3). Ultimately, hospitals can reach the levels 'occasional corrective action' (level 4) and 'closed loop improvement' (level 5). The empirically derived model reveals why existing, generic capability maturity models for process management are not applicable in the hospital context: their comparatively high complexity on the one hand, and their strong focus on topics such as IT integration and process automation on the other, make them inadequate for solving the problems felt in the hospital sector, which are primarily of a cultural and structural nature. We deem the proposed capability maturity model capable of overcoming these shortcomings.
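The "staged" character of such a model can be made concrete with a minimal sketch. This is assumed logic for illustration, not the paper's derivation algorithm: in a staged maturity model, an organisation's level is the highest stage for which it has satisfied every stage up to and including it, so a gap at a lower stage caps the level regardless of higher-stage capabilities.

```python
# The five levels, in order, as named in the abstract.
LEVELS = [
    "encouragement of process orientation",  # level 1
    "case-by-case handling",                 # level 2
    "defined processes",                     # level 3
    "occasional corrective action",          # level 4
    "closed loop improvement",               # level 5
]

def maturity_level(achieved):
    """Return the maturity level as the count of consecutively satisfied
    stages, starting from level 1; a missing stage stops the count."""
    level = 0
    for name in LEVELS:
        if name not in achieved:
            break
        level += 1
    return level
```

For example, a hospital with 'defined processes' but without 'case-by-case handling' would still sit at level 1 under this staged reading.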
