5 research outputs found

    Accelerator Physics Issues of the LHC

    In order to compensate for the scarcity of events at very high energy, the LHC has to provide a luminosity of 10^34 cm^-2 s^-1. This is obtained with a large beam current distributed over 2835 particle bunches, and a large transverse bunch density, so as to operate close to the beam-beam limit. The beam-beam interaction has two components: the head-on interaction, as in previous colliders with few bunches, and the long-range interaction due to multiple unwanted crossings. The latter effect is controlled by letting the beams collide at a small angle. The single-bunch and multibunch collective instabilities are kept under control by a proper design of the beam enclosure and by feedback systems. The unavoidable imperfections of the high-field superconducting magnets create non-linear field errors which limit the useful range of particle betatron amplitudes where the motion is stable, the so-called dynamic aperture. An extended set of corrector magnets is foreseen to compensate for the effects of the strongest multipoles of low order. The machine lattice is designed with the aim of leaving sufficient freedom in the choice of the operating conditions to optimize performance.
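    The design luminosity quoted above follows from the standard round-beam luminosity formula. Below is a minimal sketch that checks the number; apart from the 2835-bunch figure taken from the abstract, all parameter values (protons per bunch, emittance, beta function at the interaction point, crossing-angle factor) are assumed nominal design values, not figures from this text.

```python
# Back-of-the-envelope check of the quoted design luminosity.
# All values except n_b are assumptions (standard nominal design values).
import math

f_rev   = 11245.0   # revolution frequency [Hz] (assumed)
n_b     = 2835      # number of bunches (as quoted in the abstract)
N       = 1.15e11   # protons per bunch (assumed)
gamma   = 7460.0    # relativistic gamma at 7 TeV (assumed)
eps_n   = 3.75e-6   # normalised transverse emittance [m rad] (assumed)
beta_st = 0.55      # beta function at the interaction point [m] (assumed)
F       = 0.84      # geometric reduction factor from the crossing angle (assumed)

# Round-beam luminosity formula; the factor 1e-4 converts m^-2 to cm^-2.
L = F * f_rev * n_b * N**2 * gamma / (4 * math.pi * eps_n * beta_st) * 1e-4
print(f"L = {L:.2e} cm^-2 s^-1")  # ~1e34, matching the quoted design value
```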

    Testing the Universality of the Fundamental Metallicity Relation at High Redshift Using Low-mass Gravitationally Lensed Galaxies

    We present rest-frame optical spectra for a sample of nine low-mass star-forming galaxies at redshifts z > 1.5 and measure their offset from the locally defined fundamental metallicity relation, finding a mean offset of 0.01 ± 0.08 dex and suggesting a universal relationship. Remarkably, the scatter around the fundamental metallicity relation is only 0.24 dex, smaller than that observed locally at the same stellar masses, which may provide an important additional constraint for galaxy evolution models.
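    As a rough illustration of how such an offset and scatter are measured, the sketch below projects a set of galaxies onto a local FMR parametrisation and computes the residuals. The FMR form used (a linearised version of the mu_0.32 = log M* - 0.32 log SFR projection of Mannucci et al. 2010) and all sample values are illustrative assumptions, not data from this paper.

```python
# Minimal sketch: offset and scatter of a galaxy sample about a local FMR.
# The FMR coefficients and the sample values below are illustrative only.
import numpy as np

def local_fmr(log_mstar, log_sfr, alpha=0.32):
    """Predicted 12 + log(O/H) from the mu_alpha projection.
    Linear form and coefficients are assumptions for illustration."""
    mu = log_mstar - alpha * log_sfr
    return 8.90 + 0.37 * (mu - 10.0)

# Hypothetical measurements: stellar mass, star-formation rate, metallicity.
log_mstar   = np.array([8.2, 8.6, 9.0, 9.3])
log_sfr     = np.array([0.3, 0.5, 0.8, 1.0])
metallicity = np.array([8.15, 8.30, 8.42, 8.55])

offsets = metallicity - local_fmr(log_mstar, log_sfr)
print("mean offset:", offsets.mean())  # analogue of the quoted 0.01 +/- 0.08
print("scatter:", offsets.std())       # analogue of the quoted 0.24 dex
```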

    Machine layout and performance

    The Large Hadron Collider (LHC) is one of the largest scientific instruments ever built. Since opening up a new energy frontier for exploration in 2010, it has gathered a global user community of about 7,000 scientists working in fundamental particle physics and the physics of hadronic matter at extreme temperature and density. To sustain and extend its discovery potential, the LHC will need a major upgrade in the 2020s. This will increase its luminosity (rate of collisions) by a factor of five beyond the original design value and the integrated luminosity (total number of collisions created) by a factor of ten. The LHC is already a highly complex and exquisitely optimised machine, so this upgrade must be carefully conceived and will require about ten years to implement. The new configuration, known as the High Luminosity LHC (HL-LHC), will rely on a number of key innovations that push accelerator technology beyond its present limits. Among these are cutting-edge 11-12 tesla superconducting magnets, compact superconducting cavities for beam rotation with ultra-precise phase control, new technology and physical processes for beam collimation, and 300-metre-long high-power superconducting links with negligible energy dissipation. The present document describes the technologies and components that will be used to realise the project and is intended to serve as the basis for the detailed engineering design of the HL-LHC.
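    To see how the peak and integrated luminosity figures relate, the sketch below converts a sustained luminosity into an annual integrated luminosity. Every input (levelled luminosity, physics days per year, collision efficiency) is an assumed illustrative value, not a number from this document.

```python
# Minimal sketch: from a levelled peak luminosity to integrated luminosity.
# All inputs are illustrative assumptions.
L_level    = 5.0e34  # levelled luminosity, ~5x the 1e34 design [cm^-2 s^-1]
days       = 160     # physics days per year (assumed)
efficiency = 0.5     # fraction of that time spent colliding (assumed)

seconds = days * 86400 * efficiency
# 1 fb^-1 = 1e39 cm^-2, so divide by 1e39 to convert.
int_lumi_fb = L_level * seconds / 1e39
print(f"~{int_lumi_fb:.0f} fb^-1 per year")  # of order a few hundred fb^-1
```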

    Thesaurus-Based Methodologies and Tools for Maintaining Persistent Application Systems

    The research presented in this thesis establishes thesauri as a viable foundation for models, methodologies and tools for change management. Most of the research has been undertaken in a persistent programming environment. Persistent language technology has enabled the construction of sophisticated and well-integrated change management tools; tools and applications reside in the same store. At the same time, the research has enhanced persistent programming environments with models, methodologies and tools that are crucial to the exploitation of persistent programming in the construction and maintenance of long-lived, data-intensive application systems.

    Multicommodity Network Flows with Probabilistic Losses

    This paper considers the problem of maximizing the expected value of multicommodity flows in a network in which the arcs experience probabilistic loss rates. Consideration of probabilistic losses is particularly relevant in communication and transportation networks. An arc-chain formulation of the problem and an efficient algorithm for computing an optimal solution are provided. The algorithm involves a modified column generation technique for identifying a constrained chain. Computational experience with the algorithm is included.
    Keywords: networks: flow; algorithms: stochastic
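    A minimal sketch of the arc-chain formulation is given below for a toy single-commodity network: each chain's objective coefficient is the probability that injected flow survives the whole chain, and the flow occupying each arc is the injected flow decayed by the upstream survival probabilities. The network data are invented, losses are assumed independent across arcs, and the chains are enumerated by hand rather than priced out by the paper's column generation technique.

```python
# Toy arc-chain LP for expected-value flow with probabilistic arc losses.
# Network data are invented; chains are enumerated explicitly instead of
# being generated by column generation as in the paper.
import numpy as np
from scipy.optimize import linprog

# arc (tail, head) -> (capacity, survival probability)
arcs = {
    ("s", "a"): (10.0, 0.90),
    ("s", "b"): (8.0, 0.80),
    ("a", "t"): (6.0, 0.95),
    ("b", "t"): (9.0, 0.85),
    ("a", "b"): (4.0, 0.90),
}

# All s -> t chains (paths) for one commodity, listed by hand.
chains = [
    [("s", "a"), ("a", "t")],
    [("s", "b"), ("b", "t")],
    [("s", "a"), ("a", "b"), ("b", "t")],
]

arc_list = list(arcs)
arc_idx = {a: i for i, a in enumerate(arc_list)}

# Objective: expected flow delivered = injected flow times the product of
# survival probabilities along the chain (losses assumed independent).
c = [-np.prod([arcs[a][1] for a in ch]) for ch in chains]  # negate: linprog minimizes

# Capacity constraints: flow actually present on an arc is the injected
# flow decayed by the survival probabilities of the arcs preceding it.
A = np.zeros((len(arc_list), len(chains)))
for j, ch in enumerate(chains):
    upstream = 1.0
    for a in ch:
        A[arc_idx[a], j] = upstream
        upstream *= arcs[a][1]
b = np.array([arcs[a][0] for a in arc_list])

res = linprog(c, A_ub=A, b_ub=b, bounds=(0, None))
print("expected delivered flow:", -res.fun)
for ch, x in zip(chains, res.x):
    print([f"{t}->{h}" for t, h in ch], round(x, 3))
```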