91 research outputs found

    2D characterization of near-surface Vp/Vs: surface-wave dispersion inversion versus refraction tomography

    No full text
    The joint study of pressure (P-) and shear (S-) wave velocities (Vp and Vs), as well as their ratio (Vp/Vs), has been used for many years at large scales but remains marginal in near-surface applications. For these applications, Vp and Vs are generally retrieved with seismic refraction tomography combining P and SH (shear-horizontal) waves, thus requiring two separate acquisitions. Surface-wave prospecting methods are proposed here as an alternative to SH-wave tomography in order to retrieve pseudo-2D Vs sections from typical P-wave shot gathers and to assess the applicability of combined P-wave refraction tomography and surface-wave dispersion analysis for estimating the Vp/Vs ratio. We carried out a simultaneous P- and surface-wave survey on a well-characterized granite-micaschists contact at the Ploemeur hydrological observatory (France), supplemented with an SH-wave acquisition along the same line in order to compare Vs results obtained from SH-wave refraction tomography and surface-wave profiling. Travel-time tomography was performed with P- and SH-wave first arrivals observed along the line to retrieve Vp^tomo and Vs^tomo models. Windowing and stacking techniques were then used to extract evenly spaced dispersion data from P-wave shot gathers along the line. Successive 1D Monte Carlo inversions of these dispersion data were performed using fixed Vp values extracted from the Vp^tomo model and no lateral constraints between two adjacent 1D inversions. The resulting 1D Vs^sw models were then assembled to create a pseudo-2D Vs^sw section, which matches the general features observed on the Vs^tomo section. While the pseudo-section is characterized by strong velocity uncertainties in the deepest layers, it provides a more detailed description of the lateral variations in the shallow layers. Theoretical dispersion curves were also computed along the line with both the Vs^tomo and Vs^sw models. 
While the dispersion curves computed from the Vs^sw models are consistent with the coherent maxima observed on dispersion images, the dispersion curves computed from the Vs^tomo models generally do not fit the observed propagation modes at low frequency. Surface-wave analysis could therefore improve Vs models both in terms of reliability and of ability to describe lateral variations. Finally, we computed Vp/Vs sections from both the Vs^tomo and Vs^sw models. The two sections present similar features, but the section obtained from Vs^sw shows a higher lateral resolution and is consistent with the features observed on electrical resistivity tomography, thus validating our approach for retrieving the Vp/Vs ratio from combined P-wave tomography and surface-wave profiling.
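The final step described above, dividing a gridded Vp model by a co-located Vs model to obtain a Vp/Vs section, can be sketched as follows. This is a minimal illustration, not the authors' code; the grid layout (rows = depth, columns = distance along the line) is an assumption.

```python
import math

def vp_vs_section(vp, vs):
    """Element-wise Vp/Vs ratio for co-located 2D velocity grids.

    vp, vs: nested lists (rows = depth, columns = distance) sampled
    on the same grid, e.g. Vp from refraction tomography and Vs
    assembled from successive 1D surface-wave inversions.
    Non-physical cells are returned as NaN.
    """
    if len(vp) != len(vs) or any(len(a) != len(b) for a, b in zip(vp, vs)):
        raise ValueError("Vp and Vs grids must be co-located")
    ratio = []
    for row_p, row_s in zip(vp, vs):
        row = []
        for p, s in zip(row_p, row_s):
            # Vs must be positive and below Vp for a physical ratio.
            row.append(p / s if 0 < s < p else math.nan)
        ratio.append(row)
    return ratio
```

Masking cells where Vs is zero or exceeds Vp keeps artefacts from the deep, poorly constrained layers from contaminating the ratio section.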

    Helium identification with LHCb

    Get PDF
    The identification of helium nuclei at LHCb is achieved using a method based on measurements of ionisation losses in the silicon sensors and timing measurements in the Outer Tracker drift tubes. The background from photon conversions is reduced using the RICH detectors and an isolation requirement. The method is developed using pp collision data at √(s) = 13 TeV recorded by the LHCb experiment in the years 2016 to 2018, corresponding to an integrated luminosity of 5.5 fb^-1. A total of around 10^5 helium and antihelium candidates are identified with negligible background contamination. The helium identification efficiency is estimated to be approximately 50% with a corresponding background rejection rate of up to O(10^12). These results demonstrate the feasibility of a rich programme of measurements of QCD and astrophysics interest involving light nuclei.
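The ionisation-loss part of such a selection rests on the fact that a doubly charged (Z = 2) nucleus deposits roughly four times the charge of a minimum-ionising Z = 1 particle in each silicon sensor. A toy version of that cut, with entirely hypothetical thresholds and sample counts (the real LHCb calibration and cuts differ), might look like this:

```python
import statistics

def is_helium_candidate(si_dedx_samples, mip_charge=1.0,
                        min_samples=5, low=3.0, high=5.5):
    """Toy Z=2 selection on silicon ionisation samples (hypothetical cuts).

    si_dedx_samples: per-sensor charge measurements along the track,
    in units of the most probable Z=1 (MIP) deposit. A helium nucleus
    should cluster around ~4 MIP, so we require the truncated mean to
    lie in a band around that value.
    """
    if len(si_dedx_samples) < min_samples:
        return False
    # Truncated mean: drop the highest 30% of samples to tame Landau tails.
    s = sorted(si_dedx_samples)
    kept = s[: max(1, int(0.7 * len(s)))]
    tmean = statistics.fmean(kept) / mip_charge
    return low < tmean < high
```

The truncated mean is a standard dE/dx estimator: the Landau distribution of energy loss has a long upper tail, so discarding the highest samples stabilises the per-track average.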

    Momentum scale calibration of the LHCb spectrometer

    Get PDF
    Accurate knowledge of the momentum scale of the detectors is crucial for the precise determination of particle masses. The procedure used to calibrate the momentum scale of the LHCb spectrometer is described and illustrated using the performance obtained with an integrated luminosity of 1.6 fb^-1 collected during 2016 in pp running. The procedure uses large samples of J/ψ → μ+μ− and B+ → J/ψ K+ decays and leads to a relative accuracy of 3 × 10^-4 on the momentum scale.
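The core idea can be sketched in a few lines: reconstruct the dimuon invariant mass from the muon momenta, then choose a global scale α such that p → (1 + α) p moves the reconstructed J/ψ peak onto the known mass. This is a first-order sketch under simplifying assumptions (a single global scale, the mean in place of a fitted peak position), not the LHCb procedure itself:

```python
import math

M_MU = 0.1056583755   # muon mass [GeV], PDG value
M_JPSI = 3.0969       # J/psi mass [GeV], PDG value

def inv_mass(p1, p2):
    """Invariant mass (GeV) of a dimuon from the muons' 3-momenta (GeV)."""
    e1 = math.sqrt(M_MU**2 + sum(c * c for c in p1))
    e2 = math.sqrt(M_MU**2 + sum(c * c for c in p2))
    psum_sq = sum((a + b) ** 2 for a, b in zip(p1, p2))
    return math.sqrt(max(0.0, (e1 + e2) ** 2 - psum_sq))

def momentum_scale(dimuon_pairs):
    """Scale alpha such that p -> (1 + alpha) * p aligns the mean
    reconstructed dimuon mass with the PDG J/psi mass (sketch)."""
    m_reco = sum(inv_mass(p1, p2) for p1, p2 in dimuon_pairs) / len(dimuon_pairs)
    return M_JPSI / m_reco - 1.0
```

Because the muon mass is small compared with the J/ψ mass, the invariant mass scales almost linearly with the momenta, so a single multiplicative correction is a good first approximation.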

    Curvature-bias corrections using a pseudomass method

    Get PDF
    Momentum measurements for very high momentum charged particles, such as muons from electroweak vector boson decays, are particularly susceptible to charge-dependent curvature biases that arise from misalignments of tracking detectors. Low momentum charged particles used in alignment procedures have limited sensitivity to coherent displacements of such detectors, and therefore are unable to fully constrain these misalignments to the precision necessary for studies of electroweak physics. Additional approaches are therefore required to understand and correct for these effects. In this paper the curvature biases present at the LHCb detector are studied using the pseudomass method in proton-proton collision data recorded at centre-of-mass energy √(s) = 13 TeV during 2016, 2017 and 2018. The biases are determined using Z → μ+μ− decays in intervals defined by the data-taking period, magnet polarity and muon direction. Correcting for these biases, which are typically at the 10^-4 GeV^-1 level, improves the Z → μ+μ− mass resolution by roughly 18% and eliminates several pathological trends in the kinematic dependence of the mean dimuon invariant mass.
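A curvature bias of this kind is an additive shift δ on the measured curvature q/p, which affects positive and negative tracks in opposite directions. Once δ has been determined (by the pseudomass method or otherwise), applying the correction is a simple inversion; this is a minimal sketch with an assumed sign convention, not the paper's implementation:

```python
def correct_curvature(p, q, delta):
    """Apply a charge-dependent curvature-bias correction (sketch).

    Tracking misalignments shift the measured curvature q/p by a
    constant delta (units 1/GeV), biasing positively and negatively
    charged tracks in opposite directions. Inverting
    q/p -> q/p + delta gives the corrected momentum in GeV.
    """
    if p <= 0 or q not in (-1, 1):
        raise ValueError("p must be positive and q must be +-1")
    corrected = q / p + delta
    if corrected * q <= 0:
        raise ValueError("correction flips the track charge; delta too large")
    return q / corrected
```

At p = 50 GeV a bias of 10^-4 GeV^-1 shifts the momentum by about 0.5%, and in opposite directions for the two charges, which is why the bias shows up so clearly in charge-separated Z → μ+μ− mass distributions.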

    Study of the doubly charmed tetraquark T+cc

    Get PDF
    Quantum chromodynamics, the theory of the strong force, describes interactions of coloured quarks and gluons and the formation of hadronic matter. Conventional hadronic matter consists of baryons and mesons made of three quarks and quark-antiquark pairs, respectively. Particles with an alternative quark content are known as exotic states. Here a study is reported of an exotic narrow state in the D0D0π+ mass spectrum just below the D*+D0 mass threshold produced in proton-proton collisions collected with the LHCb detector at the Large Hadron Collider. The state is consistent with the ground isoscalar T+cc tetraquark with a quark content of ccūd̄ and spin-parity quantum numbers JP = 1+. Study of the DD mass spectra disfavours interpretation of the resonance as the isovector state. The decay structure via intermediate off-shell D*+ mesons is consistent with the observed D0π+ mass distribution. To analyse the mass of the resonance and its coupling to the D*D system, a dedicated model is developed under the assumption of an isoscalar axial-vector T+cc state decaying to the D*D channel. Using this model, resonance parameters including the pole position, scattering length, effective range and compositeness are determined to reveal important information about the nature of the T+cc state. In addition, an unexpected dependence of the production rate on track multiplicity is observed.
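The statement that the state sits "just below the D*+D0 mass threshold" refers to the minimum invariant mass at which a D*+ D0 pair can be produced at rest; the threshold follows directly from the individual meson masses. A small worked computation, using approximate PDG mass values (assumed here, not taken from the paper):

```python
# Approximate PDG masses in MeV (assumed values, not from the paper).
M_DSTAR_PLUS = 2010.26   # D*+ meson
M_D0 = 1864.84           # D0 meson

def offset_from_threshold(candidate_mass_mev):
    """Signed offset (MeV) of a candidate mass from the D*+ D0 threshold.

    A negative value means the candidate sits below threshold, as
    reported for the T+cc state, so it cannot decay to an on-shell
    D*+ D0 pair and must proceed via an off-shell D*+.
    """
    return candidate_mass_mev - (M_DSTAR_PLUS + M_D0)
```

With these inputs the threshold is about 3875.1 MeV, which makes the sub-MeV proximity of the T+cc pole to threshold, and hence the relevance of the scattering length and effective range quoted above, concrete.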

    The LHCb upgrade I

    Get PDF
    The LHCb upgrade represents a major change of the experiment. The detectors have been almost completely renewed to allow running at an instantaneous luminosity five times larger than that of the previous running periods. Readout of all detectors into an all-software trigger is central to the new design, facilitating the reconstruction of events at the maximum LHC interaction rate, and their selection in real time. The experiment's tracking system has been completely upgraded with a new pixel vertex detector, a silicon tracker upstream of the dipole magnet and three scintillating-fibre tracking stations downstream of the magnet. The whole photon detection system of the RICH detectors has been renewed and the readout electronics of the calorimeter and muon systems have been fully overhauled. The first stage of the all-software trigger is implemented on a GPU farm. The output of the trigger provides a combination of fully reconstructed physics objects, such as tracks and vertices, ready for final analysis, and of entire events which need further offline reprocessing. This scheme required a complete revision of the computing model and a rewriting of the experiment's software.