94 research outputs found

    Tracing uncertainty contributors in the multi-hazard risk analysis for compound extremes

    In this study, an iterative factorial multimodel Bayesian copula (IFMBC) framework was developed to reveal uncertainties in risk inferences of compound extremes under diverse model structures and parameter sets. In particular, an iterative factorial analysis (IFA) method is advanced within IFMBC to track the dominant contributors to imprecise predictions of multi-hazard risks. The developed IFMBC framework was applied to the risk assessment of compound floods at two estuarine systems in the US (Washington and Philadelphia). The results indicate that the most likely compound events, under predefined return periods, exhibit noticeable uncertainties. These uncertainties also present multiple hotspots, which may be attributed to the differing impacts of different factors. The IFA results suggest that the copula structure would likely rank among the top two impact factors for predictions of failure probabilities (FPs) in the AND and Kendall scenarios, with contributions higher than 30% for FP in AND (more than 40% at Washington) and more than 25% for FP in Kendall (larger than 40% at Philadelphia). In comparison, the copula structure may not pose a visible effect on the predictive uncertainty for FP in OR, with its contribution possibly less than 5% under long-term service periods. However, the marginal distributions have higher effects on FP in OR than on the other two FPs. In particular, the marginal distribution of the extreme variable with high skewness and kurtosis tends to rank as one of the most significant impact factors for FP in OR. Moreover, the overall impacts of parametric uncertainties in both marginal and dependence models cannot be neglected for predictions of all three FPs, with contributions probably larger than 20% under a short service period. Compared with traditional multilevel factorial analysis, the IFA method provides a more reliable characterization of uncertainty contributors in multi-hazard risk analyses, since the traditional method appears to significantly overestimate the contributions from parameter uncertainties.
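    The AND and OR failure probabilities named above follow standard copula definitions: with annual non-exceedance probabilities u and v for the two hazards and copula C, the annual OR-exceedance probability is 1 − C(u, v), the AND probability is 1 − u − v + C(u, v), and the failure probability over a service period of T years is 1 − (1 − p)^T. Below is a minimal sketch of those formulas, assuming a Gumbel copula and illustrative design levels; it is not the paper's IFMBC implementation, and the Kendall-scenario FP (which requires the copula's Kendall distribution function) is omitted.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Bivariate Gumbel copula C(u, v) with dependence parameter theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def failure_probabilities(u, v, theta, service_years):
    """Failure probabilities over a service period for the OR and AND scenarios."""
    c = gumbel_copula(u, v, theta)
    p_or = 1.0 - c                       # annual P(X > x or Y > y)
    p_and = 1.0 - u - v + c              # annual P(X > x and Y > y)
    fp = lambda p: 1.0 - (1.0 - p) ** service_years
    return fp(p_or), fp(p_and)

# Hypothetical 50-year design levels (u = v = 1 - 1/50) over a 30-year service period;
# varying theta mimics the copula-structure uncertainty discussed in the abstract.
u = v = 1.0 - 1.0 / 50.0
for theta in (1.5, 2.0, 3.0):
    fp_or, fp_and = failure_probabilities(u, v, theta, 30)
    print(f"theta={theta}: FP_OR={fp_or:.3f}, FP_AND={fp_and:.3f}")
```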

    Characteristics of damaged asphalt mixtures in tension and compression

    This paper addresses the measurement and modelling of the damaged properties of asphalt mixtures, including fracture, healing and viscoplastic deformation under both tensile and compressive loading, as affected by mixture composition and by conditioning through ageing and exposure to temperature and moisture. An energy-based mechanics approach is applied to obtain fundamental material properties, such as surface energies, bond energies, anisotropy, and yield and plastic potential functions valid for actual asphalt mixtures; viscoelastic crack growth criteria under both tensile and compressive loading; a simple mechanics-based method of determining the fatigue endurance limit; and the measurement and prediction of healing in restoring the damage done by fracture. Healing counteracts fracture, and cracking is the net result of the interplay of these two complementary mechanisms. Because fracture in asphalt mixtures is not the growth of a single crack but the simultaneous growth of multiple cracks that start out as air voids, the growth of damage density is used to characterise fracture in an asphalt mixture. It was discovered that the form of Paris' law applies to the growth of damage density of asphalt mixtures in both tensile and compressive loading. The importance of this fact lies in the many developments it enables; for example, monotonic compressive loading of cylindrical test samples permits a direct determination of the Paris' law coefficient and exponent. In all cases, the measured material properties are presented as they vary with mixture composition and with conditioning such as moisture and ageing, both in the laboratory and in the field. These properties are measured simply, quickly and accurately through the use of mechanics, so that a complete characterisation of an asphalt mixture in tension and compression can be carried out within one day. The net effect is to reduce both laboratory effort and the systematic error introduced by the assumptions of existing models, while increasing the efficiency and cost-effectiveness of materials testing and raising the reliability of mixture designs, pavement structures, specifications and life-cycle predictions for as-built pavements.
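    As an illustration of the Paris'-law form mentioned above, the sketch below fits a coefficient A and exponent n of dφ/dN = A·(J_R)^n, where φ is damage density and J_R stands in for an energy-based driving force, by a log-log linear regression. The data, variable names and units are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical measurements: driving force J_R vs. damage-density growth rate.
j_r = np.array([0.5, 1.0, 2.0, 4.0, 8.0])            # illustrative units
dphi_dN = np.array([2e-6, 1.6e-5, 1.3e-4, 1.0e-3, 8.1e-3])

# The Paris'-law form is linear in log-log space: log(dphi/dN) = log(A) + n*log(J_R),
# so a degree-1 polynomial fit recovers the exponent n (slope) and coefficient A.
n, logA = np.polyfit(np.log(j_r), np.log(dphi_dN), 1)
A = np.exp(logA)
print(f"Paris' law fit: A = {A:.3e}, n = {n:.2f}")
```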

    Observation of quantum entanglement with top quarks at the ATLAS detector

    Entanglement is a key feature of quantum mechanics, with applications in fields such as metrology, cryptography, quantum information and quantum computation. It has been observed in a wide variety of systems and length scales, ranging from the microscopic to the macroscopic. However, entanglement remains largely unexplored at the highest accessible energy scales. Here we report the highest-energy observation of entanglement, in top–antitop quark events produced at the Large Hadron Collider, using a proton–proton collision dataset with a centre-of-mass energy of √s = 13 TeV and an integrated luminosity of 140 fb−1 recorded with the ATLAS experiment. Spin entanglement is detected from the measurement of a single observable D, inferred from the angle between the charged leptons in their parent top- and antitop-quark rest frames. The observable is measured in a narrow interval around the top–antitop quark production threshold, where the entanglement detection is expected to be significant. It is reported in a fiducial phase space defined with stable particles, to minimize the uncertainties that stem from the limitations of the Monte Carlo event generators and the parton shower model in modelling top-quark pair production. The entanglement marker is measured to be D = −0.537 ± 0.002 (stat.) ± 0.019 (syst.) for 340 GeV < mtt < 380 GeV. The observed result is more than five standard deviations from a scenario without entanglement and hence constitutes the first observation of entanglement in a pair of quarks and the highest-energy observation of entanglement to date.
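    The marker D used above is defined as D = −3⟨cos φ⟩, where φ is the angle between the two charged leptons, each evaluated in its parent (anti)top rest frame, and D < −1/3 signals spin entanglement. The sketch below computes D and its statistical uncertainty from a hypothetical sample of per-event cos φ values tuned to land near the measured result; it is an illustration only, not the ATLAS analysis.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical per-event cos(phi) values drawn so that <cos(phi)> ~ 0.18,
# i.e. D ~ -0.54, close to the measured value (illustration only).
cos_phi = np.clip(rng.normal(loc=0.18, scale=0.55, size=100_000), -1.0, 1.0)

# Entanglement marker and its statistical uncertainty from the sample mean.
D = -3.0 * cos_phi.mean()
stat_err = 3.0 * cos_phi.std(ddof=1) / np.sqrt(cos_phi.size)
print(f"D = {D:.3f} +/- {stat_err:.3f} (stat.); entangled if D < -1/3")
```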

    Precise measurements of W- and Z-boson transverse momentum spectra with the ATLAS detector using pp collisions at √s = 5.02 TeV and 13 TeV

    Measurements of the production cross-section for a Z boson in association with b- or c-jets in proton–proton collisions at √s = 13 TeV with the ATLAS detector

    This paper presents a measurement of the production cross-section of a Z boson in association with b- or c-jets, in proton–proton collisions at √s = 13 TeV with the ATLAS experiment at the Large Hadron Collider, using data corresponding to an integrated luminosity of 140 fb−1. Inclusive and differential cross-sections are measured for events containing a Z boson decaying into electrons or muons and produced in association with at least one b-jet, at least one c-jet, or at least two b-jets with transverse momentum pT > 20 GeV and rapidity |y| < 2.5. Predictions from several Monte Carlo generators based on next-to-leading-order matrix elements interfaced with a parton-shower simulation, with different choices of flavour schemes for initial-state partons, are compared with the measured cross-sections. The results are also compared with novel predictions based on infrared- and collinear-safe jet flavour dressing algorithms. Selected Z + ≥ 1 c-jet observables, optimized for sensitivity to intrinsic charm, are compared with benchmark models with different intrinsic-charm fractions.
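    A minimal sketch of the fiducial classification described above: jets are kept if pT > 20 GeV and |y| < 2.5, and events are then labelled Z + ≥ 1 b-jet, Z + ≥ 1 c-jet or Z + ≥ 2 b-jets. The event record layout and the flavour labels are hypothetical stand-ins for a real analysis data format.

```python
def classify_event(jets):
    """jets: list of dicts with 'pt' (GeV), 'y' (rapidity), 'flavour' ('b'/'c'/'l')."""
    # Apply the fiducial jet selection, then count heavy-flavour jets.
    fiducial = [j for j in jets if j["pt"] > 20.0 and abs(j["y"]) < 2.5]
    n_b = sum(j["flavour"] == "b" for j in fiducial)
    n_c = sum(j["flavour"] == "c" for j in fiducial)
    labels = []
    if n_b >= 1:
        labels.append("Z + >=1 b-jet")
    if n_b >= 2:
        labels.append("Z + >=2 b-jets")
    if n_c >= 1:
        labels.append("Z + >=1 c-jet")
    return labels

# Example event with one b-jet and one c-jet inside the fiducial region.
print(classify_event([{"pt": 35.0, "y": 0.4, "flavour": "b"},
                      {"pt": 22.0, "y": -1.8, "flavour": "c"}]))
```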

    Software performance of the ATLAS track reconstruction for LHC Run 3

    Charged-particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions at the LHC is a challenging task for the ATLAS experiment's reconstruction software owing to its combinatorial complexity. This paper describes the major changes made to adapt the software to promptly reconstruct high-activity collisions, with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up), using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved relative to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant loss of reconstruction efficiency, while reducing the rate of combinatorial fake tracks by more than a factor of two.
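    The two figures of merit quoted above, tracking efficiency and combinatorial fake rate, can be illustrated with a toy truth-matching computation. The matching convention and the counts below are hypothetical, not the ATLAS definitions.

```python
def tracking_metrics(truth_ids, reco_match_ids):
    """truth_ids: set of truth-particle ids; reco_match_ids: one entry per
    reconstructed track, the matched truth id or None for a fake track."""
    # Efficiency: fraction of truth particles with at least one matched track.
    matched = {m for m in reco_match_ids if m is not None}
    efficiency = len(matched & set(truth_ids)) / len(truth_ids)
    # Fake rate: fraction of reconstructed tracks with no truth match.
    fake_rate = reco_match_ids.count(None) / len(reco_match_ids)
    return efficiency, fake_rate

# Toy event: 5 truth particles, 6 reconstructed tracks of which 2 are fakes.
eff, fake = tracking_metrics({1, 2, 3, 4, 5}, [1, 2, 3, 4, None, None])
print(f"efficiency = {eff:.2f}, fake rate = {fake:.2f}")
```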