
    Predictions for the heavy-ion programme at the Large Hadron Collider

    I review the main predictions for the heavy-ion programme at the Large Hadron Collider (LHC) at CERN, as available in early April 2009. I begin by recalling the standard claims made in view of the experimental data measured at the Super Proton Synchrotron (SPS) at CERN and at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL); these claims are used in the later discussion of the new opportunities at the LHC. Next I review the generic, qualitative expectations for the LHC. Then I turn to quantitative predictions. First, I analyze observables that directly characterize the medium produced in the collisions (bulk observables or soft probes): multiplicities, collective flow, hadrochemistry at low transverse momentum, correlations and fluctuations. Second, I move to calibrated probes of the medium, i.e. typically those whose expectation in the absence of any medium can be described in Quantum Chromodynamics (QCD) using perturbative techniques (pQCD), usually called hard probes. I discuss particle production at large transverse momentum and jets, heavy-quark and quarkonium production, and photons and dileptons. Finally, after a brief review of pA collisions, I end with a summary and a discussion of the potential of the LHC measurements, particularly those made during the first run, to further substantiate or, on the contrary, disprove the picture of the medium that has arisen from the confrontation between SPS and RHIC data and theoretical models.
    Comment: 64 pages, 40 figures, 7 tables; invited review for "Quark-Gluon Plasma 4"; v2: small changes, some predictions and references added, final version.
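    As a hedged illustration of what "calibrated probe" means here (this is the standard definition used in the field, not a formula quoted from the abstract): the medium modification of a hard probe is usually quantified by the nuclear modification factor

        R_{AA}(p_T) = \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\langle N_{\mathrm{coll}} \rangle \, \mathrm{d}N_{pp}/\mathrm{d}p_T},

    where \langle N_{\mathrm{coll}} \rangle is the average number of binary nucleon-nucleon collisions in the nucleus-nucleus event; R_{AA} = 1 corresponds to the absence of nuclear effects, while R_{AA} < 1 at large p_T signals suppression relative to the pQCD-calculable proton-proton baseline.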

    Digital image correlation (DIC) analysis of the 3 December 2013 Montescaglioso landslide (Basilicata, Southern Italy). Results from a multi-dataset investigation

    Image correlation remote sensing monitoring techniques are becoming key tools for providing effective qualitative and quantitative information for natural hazard assessment, specifically for landslide investigation and monitoring. In recent years, these techniques have been successfully integrated with, and shown to be complementary and competitive to, more standard remote sensing techniques such as satellite or terrestrial Synthetic Aperture Radar interferometry. The objective of this article is to apply the Digital Image Correlation technique, supported by an in-depth calibration and validation analysis, to measure landslide displacement. The availability of a multi-dataset for the 3 December 2013 Montescaglioso landslide, comprising different types of imagery such as LANDSAT 8 OLI (Operational Land Imager) and TIRS (Thermal Infrared Sensor) scenes, high-resolution airborne optical orthophotos, Digital Terrain Models and COSMO-SkyMed Synthetic Aperture Radar data, allows for the retrieval of the actual landslide displacement field, with values ranging from a few meters (2–3 m in the north-eastern sector of the landslide) to 20–21 m (local peaks on the central body of the landslide). Furthermore, comprehensive sensitivity analyses and statistics-based processing approaches are used to identify the role of the background noise that affects the whole dataset; this noise is directly proportional to the geometric and temporal resolution of the processed imagery. Moreover, an accurate evaluation of the environmental-instrumental background noise allowed the displacement measurements to be correctly calibrated and validated, thereby leading to a better definition of the threshold values of the maximum Digital Image Correlation sub-pixel accuracy and reliability (ranging from 1/10 to 8/10 of a pixel) for each processed dataset.
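    A minimal sketch of the core Digital Image Correlation idea is given below, assuming two co-registered grey-scale image arrays: a template patch from the earlier image is matched against a search window in the later image using zero-normalised cross-correlation, and the correlation peak is refined to sub-pixel precision with a parabolic fit. This is a generic illustration written for this summary, not the processing chain used in the article, and the function and parameter names (zncc, dic_displacement, patch, search) are hypothetical.

    # Generic DIC sketch (illustrative only, not the article's workflow).
    import numpy as np

    def zncc(a, b):
        """Zero-normalised cross-correlation between two equal-sized patches."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def subpixel_offset(scores, i, j):
        """Parabolic refinement of the correlation peak along each axis."""
        def refine(s_minus, s0, s_plus):
            denom = s_minus - 2.0 * s0 + s_plus
            return 0.5 * (s_minus - s_plus) / denom if denom != 0 else 0.0
        di = refine(scores[i - 1, j], scores[i, j], scores[i + 1, j]) if 0 < i < scores.shape[0] - 1 else 0.0
        dj = refine(scores[i, j - 1], scores[i, j], scores[i, j + 1]) if 0 < j < scores.shape[1] - 1 else 0.0
        return di, dj

    def dic_displacement(ref, tgt, row, col, patch=16, search=10):
        """Estimate the (dy, dx) displacement, in pixels, of the patch centred at (row, col)."""
        half = patch // 2
        template = ref[row - half:row + half, col - half:col + half]
        scores = np.full((2 * search + 1, 2 * search + 1), -1.0)  # ZNCC lies in [-1, 1]
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                r, c = row + dy, col + dx
                window = tgt[r - half:r + half, c - half:c + half]
                if window.shape == template.shape:
                    scores[dy + search, dx + search] = zncc(template, window)
        i, j = np.unravel_index(np.argmax(scores), scores.shape)
        di, dj = subpixel_offset(scores, i, j)
        return (i - search) + di, (j - search) + dj

    # Toy check: shift a random texture by (3, -2) pixels and recover the offset.
    rng = np.random.default_rng(0)
    ref = rng.random((128, 128))
    tgt = np.roll(ref, shift=(3, -2), axis=(0, 1))
    print(dic_displacement(ref, tgt, row=64, col=64))  # approximately (3.0, -2.0)

    In practice, running such a matcher on a grid of patch centres yields a displacement field, and the scatter of matches over stable (non-moving) areas gives the kind of background-noise estimate that the article uses for calibration and for setting sub-pixel accuracy thresholds.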