
    Age Constraints on Brane Models of Dark Energy

    Inspired by recent developments in particle physics, the so-called brane-world cosmology seems to provide an alternative explanation for the present dark energy problem. In this paper, we use the estimated ages of high-z objects to constrain the values of the cosmological parameters in some particular scenarios based on this large-scale modification of gravity. We show that such models are compatible with these observations for values of the crossover distance between the 4-dimensional and 5-dimensional regimes of the order of r_c ≤ 1.67 H_0^{-1}.
    Comment: 4 pages, 2 figures, 1 table, to appear in Phys. Rev.
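    In such braneworld (DGP-type) scenarios, the crossover scale enters the flat Friedmann equation as H(z) = H_0 [√(Ω_m(1+z)³ + Ω_rc) + √Ω_rc] with Ω_rc = 1/(4 r_c² H_0²), so flatness ties r_c to Ω_m. A minimal numerical sketch of how an age constraint follows (these are the standard DGP relations, not code from the paper, and Ω_m = 0.3 is an illustrative assumption):

    ```python
    import math

    def E(z, om, orc):
        # Dimensionless Hubble rate H(z)/H0 in the flat DGP braneworld model:
        # E(z) = sqrt(om*(1+z)^3 + orc) + sqrt(orc)
        return math.sqrt(om * (1.0 + z) ** 3 + orc) + math.sqrt(orc)

    def age_h0(om):
        # Flatness fixes the crossover density parameter: orc = ((1 - om)/2)^2.
        orc = ((1.0 - om) / 2.0) ** 2
        # Dimensionless age H0*t0 = integral_0^inf dz / [(1+z) E(z)],
        # approximated here by the trapezoid rule up to z = 100.
        n, zmax = 20000, 100.0
        dz = zmax / n
        total = 0.0
        for i in range(n):
            z0, z1 = i * dz, (i + 1) * dz
            f0 = 1.0 / ((1.0 + z0) * E(z0, om, orc))
            f1 = 1.0 / ((1.0 + z1) * E(z1, om, orc))
            total += 0.5 * (f0 + f1) * dz
        return total

    def crossover_h0(om):
        # Crossover scale in Hubble-radius units: r_c * H0 = 1 / (2 * sqrt(orc)).
        orc = ((1.0 - om) / 2.0) ** 2
        return 1.0 / (2.0 * math.sqrt(orc))
    ```

    For Ω_m = 0.3 this gives r_c H_0 ≈ 1.43, inside the r_c ≤ 1.67 H_0^{-1} bound quoted above, and a dimensionless age H_0 t_0 slightly below the ΛCDM value, which is why old high-z objects constrain the model.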

    Some Observational Consequences of Brane World Cosmologies

    The presence of dark energy in the Universe is inferred directly and indirectly from a large body of observational evidence. The simplest and most theoretically appealing candidate is the vacuum energy density (cosmological constant). However, although in agreement with current observations, this possibility exacerbates the well-known cosmological constant problem, requiring a natural explanation for its small but nonzero value. In this paper we focus our attention on another dark energy candidate, one arising from gravitational leakage into extra dimensions. We investigate observational constraints from current measurements of the angular size of high-z compact radio sources on accelerated models based on this large-scale modification of gravity. The predicted age of the Universe in the context of these models is briefly discussed. We argue that future observations will enable a more accurate test of these cosmologies and, possibly, show that such models constitute a viable solution to the dark energy problem.
    Comment: 6 pages, 4 figures, to appear in Phys. Rev. D (minor revisions)

    Interacting models may be key to solve the cosmic coincidence problem

    It is argued that cosmological models featuring a flow of energy from dark energy to dark matter may solve the coincidence problem of late acceleration (i.e., "why are the energy densities of both components of the same order precisely today?"). However, more refined and abundant observational data on the redshift evolution of the Hubble factor are needed to ascertain whether they can do the job.
    Comment: 25 pages, 11 figures; accepted for publication in JCA
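    The mechanism can be illustrated with a generic coupling Q = 3Hδρ_de (the specific form and the values w = −1, δ = 0.1 are illustrative assumptions, not taken from the paper): the ratio r = ρ_dm/ρ_de then obeys dr/dN = 3[rw + δ(1+r)] in e-folds N = ln a, and instead of growing without bound it is driven to the finite attractor r* = −δ/(w+δ), which is how such models soften the coincidence problem:

    ```python
    def dr_dN(r, w=-1.0, delta=0.1):
        # Evolution of the dark-matter/dark-energy ratio r = rho_dm / rho_de
        # for an interaction Q = 3*H*delta*rho_de (energy flows DE -> DM):
        #   dr/dN = 3 * [ r*w + delta*(1 + r) ],   with N = ln a
        return 3.0 * (r * w + delta * (1.0 + r))

    def evolve(r0, n_steps=10000, dN=0.001):
        # Integrate forward by 10 e-folds with simple Euler steps.
        r = r0
        for _ in range(n_steps):
            r += dr_dN(r) * dN
        return r
    ```

    Starting from a matter-dominated ratio such as r = 5, the evolution settles onto r* = 0.1/0.9 ≈ 0.11 for these toy parameters, i.e. the two densities remain of the same order at late times rather than crossing only briefly.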

    Study of Z → llγ decays at √s = 8 TeV with the ATLAS detector

    This paper presents a study of Z → llγ decays with the ATLAS detector at the Large Hadron Collider. The analysis uses a proton–proton data sample corresponding to an integrated luminosity of 20.2 fb−1 collected at a centre-of-mass energy √s = 8 TeV. Integrated fiducial cross-sections together with normalised differential fiducial cross-sections, sensitive to the kinematics of final-state QED radiation, are obtained. The results are found to be in agreement with state-of-the-art predictions for final-state QED radiation. First measurements of Z → llγγ decays are also reported.

    Constraints on spin-0 dark matter mediators and invisible Higgs decays using ATLAS 13 TeV pp collision data with two top quarks and missing transverse momentum in the final state

    This paper presents a statistical combination of searches targeting final states with two top quarks and invisible particles, characterised by the presence of zero, one or two leptons, at least one jet originating from a b-quark and missing transverse momentum. The analyses are searches for phenomena beyond the Standard Model consistent with the direct production of dark matter in pp collisions at the LHC, using 139 fb−1 of data collected with the ATLAS detector at a centre-of-mass energy of 13 TeV. The results are interpreted in terms of simplified dark matter models with a spin-0 scalar or pseudoscalar mediator particle. In addition, the results are interpreted in terms of upper limits on the Higgs boson invisible branching ratio, where the Higgs boson is produced according to the Standard Model in association with a pair of top quarks. For scalar (pseudoscalar) dark matter models, with all couplings set to unity, the statistical combination extends the mass range excluded by the best of the individual channels by 50 (25) GeV, excluding mediator masses up to 370 GeV. In addition, the statistical combination improves the expected coupling exclusion reach by 14% (24%), assuming a scalar (pseudoscalar) mediator mass of 10 GeV. An upper limit on the Higgs boson invisible branching ratio of 0.38 (0.30^{+0.13}_{−0.09}) is observed (expected) at 95% confidence level.

    Deep generative models for fast photon shower simulation in ATLAS

    The need for large-scale production of highly accurate simulated event samples for the extensive physics programme of the ATLAS experiment at the Large Hadron Collider motivates the development of new simulation techniques. Building on the recent success of deep learning algorithms, variational autoencoders and generative adversarial networks are investigated for modelling the response of the central region of the ATLAS electromagnetic calorimeter to photons of various energies. The properties of synthesised showers are compared with showers from a full detector simulation using Geant4. Both variational autoencoders and generative adversarial networks are capable of quickly simulating electromagnetic showers with correct total energies and stochasticity, though the modelling of some shower shape distributions requires more refinement. This feasibility study demonstrates the potential of using such algorithms for ATLAS fast calorimeter simulation in the future and shows a possible way to complement current simulation techniques.

    Search for doubly charged Higgs boson production in multi-lepton final states using 139 fb−1 of proton–proton collisions at √s = 13 TeV with the ATLAS detector

    A search for pair production of doubly charged Higgs bosons (H±±), each decaying into a pair of prompt, isolated, and highly energetic leptons with the same electric charge, is presented. The search uses a proton–proton collision data sample at a centre-of-mass energy of 13 TeV corresponding to an integrated luminosity of 139 fb−1 recorded by the ATLAS detector during Run 2 of the Large Hadron Collider (LHC). This analysis focuses on same-charge leptonic decays, H±± → ℓ±ℓ′± where ℓ,ℓ′ = e,μ,τ, in two-, three-, and four-lepton channels, but only considers final states which include electrons or muons. No evidence of a signal is observed. Corresponding upper limits on the production cross-section of a doubly charged Higgs boson are derived, as a function of its mass m(H±±), at 95% confidence level. Assuming that the branching ratios to each of the possible leptonic final states are equal, B(H±±→e±e±) = B(H±±→e±μ±) = B(H±±→μ±μ±) = B(H±±→e±τ±) = B(H±±→μ±τ±) = B(H±±→τ±τ±) = 1/6, the observed (expected) lower limit on the mass of a doubly charged Higgs boson is 1080 GeV (1065 GeV) within the left-right symmetric type-II seesaw model, which is the strongest limit to date produced by the ATLAS Collaboration. Additionally, this paper provides the first direct test of the Zee–Babu neutrino mass model at the LHC, yielding an observed (expected) lower limit of m(H±±) = 900 GeV (880 GeV).

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres.
    Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries.
    Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable by more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of 'single-use' consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia.
    Conclusion: This study is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    Software performance of the ATLAS track reconstruction for LHC run 3

    Charged-particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions in the LHC is a challenging task for the ATLAS experiment's reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) promptly using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.