Open Access Repository
    6,048 research outputs found

    Measurement of multidifferential cross sections for dijet production in proton–proton collisions at √s = 13 TeV

    No full text
    A measurement of the dijet production cross section is reported based on proton–proton collision data collected in 2016 at √s = 13 TeV by the CMS experiment at the CERN LHC, corresponding to an integrated luminosity of up to 36.3 fb⁻¹. Jets are reconstructed with the anti-kT algorithm for distance parameters of R = 0.4 and 0.8. Cross sections are measured double-differentially (2D) as a function of the largest absolute rapidity |y|max of the two jets with the highest transverse momenta pT and their invariant mass m1,2, and triple-differentially (3D) as a function of the rapidity separation y*, the total boost yb, and either m1,2 or the average pT of the two jets. The cross sections are unfolded to correct for detector effects and are compared with fixed-order calculations derived at next-to-next-to-leading order in perturbative quantum chromodynamics. The impact of the measurements on the parton distribution functions and the strong coupling constant at the mass of the Z boson is investigated, yielding a value of αS(mZ) = 0.1179 ± 0.0019.
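
    The 2D and 3D variables are not defined in the abstract; a sketch of the conventional definitions in terms of the rapidities y1, y2 and transverse momenta pT,1, pT,2 of the two leading jets (assumed standard usage, not spelled out above):

        % Conventional dijet kinematic variables (assumed standard definitions)
        \begin{align*}
          y^{*} &= \tfrac{1}{2}\,\lvert y_1 - y_2\rvert && \text{rapidity separation}\\
          y_{\mathrm{b}} &= \tfrac{1}{2}\,\lvert y_1 + y_2\rvert && \text{total boost of the dijet system}\\
          \lvert y\rvert_{\max} &= \max(\lvert y_1\rvert, \lvert y_2\rvert) && \text{variable of the 2D measurement}\\
          \langle p_{\mathrm{T}}\rangle &= \tfrac{1}{2}\,(p_{\mathrm{T},1} + p_{\mathrm{T},2}) && \text{average transverse momentum}
        \end{align*}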

    The Principle of Maximum Conformality Correctly Resolves the Renormalization-Scheme-Dependence Problem

    No full text
    In this paper, we clarify a serious misinterpretation and consequent misuse of the Principle of Maximum Conformality (PMC), which can also serve as a mini-review of the PMC. In a recently published article, P. M. Stevenson has claimed that "the PMC is ineffective and does nothing to resolve the renormalization-scheme-dependence problem", concluding incorrectly that the success of PMC predictions is due to the PMC being a "laborious, ad hoc, and back-door" version of the Principle of Minimal Sensitivity (PMS). We show that such conclusions are incorrect, deriving from a misinterpretation of the PMC and an overestimation of the applicability of the PMS. The purpose of the PMC is to achieve precise fixed-order pQCD predictions, free from conventional renormalization-scheme and scale ambiguities. We demonstrate that the PMC predictions satisfy all the self-consistency conditions of the renormalization group and standard renormalization-group invariance; the PMC predictions are thus independent of any initial choice of renormalization scheme and scale. The scheme independence of the PMC is also ensured by commensurate scale relations, which relate different observables to each other. Moreover, in the Abelian limit, the PMC dovetails into the well-known Gell-Mann–Low framework, a method universally revered for its precision in QED calculations. Due to the elimination of factorially divergent renormalon terms, the PMC series not only attains a convergence behavior far superior to that of its conventional counterparts but also deftly curtails any residual scale dependence caused by the unknown higher-order terms. This refined convergence, coupled with its robust suppression of residual uncertainties, furnishes a sound and reliable foundation for estimating the contributions from unknown higher-order terms. Anchored in the bedrock of standard renormalization-group invariance, the PMC simultaneously eradicates the factorial divergences and eliminates superfluous systematic errors, which in turn provides a good foundation for achieving high-precision pQCD predictions. Consequently, owing to its rigorous theoretical underpinnings, the PMC is eminently applicable to virtually all high-energy hadronic processes.
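
    A schematic of the scale ambiguity at issue and of the PMC-style scale shift (the series split and the one-step shift below are illustrative assumptions, not formulas from the paper):

        % Schematic NLO series under conventional scale setting; the split of
        % the NLO coefficient into a conformal part and a \beta_0-dependent
        % part is illustrative.
        \begin{align*}
          \rho(\mu_r) &= c_1\,\alpha_s(\mu_r)
             + \bigl[c_{2,\mathrm{conf}} + \beta_0\, c_{2,\beta}(\mu_r)\bigr]\,\alpha_s^2(\mu_r)
             + \mathcal{O}(\alpha_s^3)
          \\[4pt]
          % PMC-style step: absorb the \beta_0 term into the running coupling
          % by choosing a scale Q^\ast with
          \alpha_s(Q^\ast) &= \alpha_s(\mu_r)
             + \frac{\beta_0\, c_{2,\beta}(\mu_r)}{c_1}\,\alpha_s^2(\mu_r) + \dots
          \\[4pt]
          % leaving a conformal series, independent of the initial \mu_r:
          \rho &= c_1\,\alpha_s(Q^\ast) + c_{2,\mathrm{conf}}\,\alpha_s^2(Q^\ast) + \dots
        \end{align*}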

    Burnt-Areas-Italian-Terrestrial-Ecosystem (2018-2024)

    No full text
    The BA-ITE mapping product describes the areas burnt by wildfire identified by EFFIS during the calendar year (1 January – 31 December) across the Italian national territory. The reference forest covers derive from the ECM-F4_2020 model (https://groupware.sinanet.isprambiente.it/prodotti-operativi-di-sorveglianza-ambientale/library/ecosystems-classification-model), classified according to the EUNIS (2021) nomenclature into the main forest types (T1 deciduous broadleaved, T2 evergreen, T3 coniferous, T34 temperate sub-alpine, TNC unclassified). The spatial data are provided in vector format and reprojected into a uniform decimal metric coordinate system. The intersection between the EFFIS polygons and the ECM-F4 forest classes was analysed with GIS tools, while the geostatistical processing was carried out in Python using GeoPandas and Rasterstats.
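
    A minimal sketch of the polygon intersection step described above (file names, the EPSG:3035 choice, and the eunis_type column are assumptions; only the GeoPandas calls themselves are real API):

        # Sketch of the EFFIS x ECM-F4 intersection; paths and columns assumed.
        import geopandas as gpd

        # Load burnt-area polygons (EFFIS) and forest-type polygons (ECM-F4).
        effis = gpd.read_file("effis_burnt_areas_2018_2024.gpkg")
        forest = gpd.read_file("ecm_f4_2020_forest_types.gpkg")

        # Reproject both layers to a common metric CRS so areas come out in
        # square metres (EPSG:3035 is an assumption; the record only says
        # "uniform decimal metric coordinate system").
        effis = effis.to_crs(epsg=3035)
        forest = forest.to_crs(epsg=3035)

        # Intersect: each output polygon carries attributes from both inputs.
        burnt_by_type = gpd.overlay(effis, forest, how="intersection")

        # Burnt surface per EUNIS forest type (column name assumed), in hectares.
        burnt_by_type["area_ha"] = burnt_by_type.geometry.area / 10_000
        print(burnt_by_type.groupby("eunis_type")["area_ha"].sum())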

    Note Illustrative della Carta geologica d'Italia alla scala 1:50.000, F. 493 Taranto (Explanatory notes to the Geological Map of Italy at 1:50,000 scale, Sheet 493 Taranto)

    No full text
    Explanatory notes prepared for geological sheet no. 493 Taranto of the Geological Map of Italy at 1:50,000 scale. 196 pp.

    Determining probability density functions with adiabatic quantum computing

    No full text
    The two main approaches to quantum computing are gate-based computation and analog computation, which are polynomially equivalent in terms of complexity and are often seen as alternatives to each other. In this work, we present a method for fitting one-dimensional probability distributions as a practical example of how analog and gate-based computation can be used together to perform different tasks within a single algorithm. In particular, we propose a strategy for encoding data within an adiabatic evolution model, which accommodates the fitting of strictly monotonic functions, such as the cumulative distribution function of a dataset. Subsequently, we use a Trotter-bounded procedure to translate the adiabatic evolution into a quantum circuit in which the evolution time t is identified with the parameters of the circuit. This facilitates computing the probability density as the derivative of the cumulative function using parameter shift rules.
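
    A toy numeric illustration of the parameter-shift rule invoked above (a generic sketch, not the paper's circuit: for a gate generated by a Pauli operator, the derivative of an expectation value follows from two shifted evaluations):

        # Parameter-shift rule: d<O>/dθ = ( <O>(θ+π/2) - <O>(θ-π/2) ) / 2
        # for gates generated by Pauli operators.
        import numpy as np

        def expectation(theta: float) -> float:
            # <Z> after RY(theta) acting on |0> equals cos(theta); it stands
            # in for the circuit's estimate of the cumulative function.
            return np.cos(theta)

        def parameter_shift(f, theta: float) -> float:
            # Exact gradient from two evaluations of the same circuit.
            return 0.5 * (f(theta + np.pi / 2) - f(theta - np.pi / 2))

        theta = 0.7
        print(parameter_shift(expectation, theta))  # -sin(0.7), the exact derivative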

    Il MUR per la Scienza Aperta (The MUR for Open Science)

    Get PDF

    CoARA: a che punto siamo? (CoARA: where do we stand?)

    Get PDF

    Supporting the development of Machine Learning for fundamental science in a federated Cloud with the AI_INFN platform

    No full text
    Machine Learning (ML) is driving a revolution in the way scientists design, develop, and deploy data-intensive software. However, the adoption of ML presents new challenges for the computing infrastructure, particularly in terms of provisioning and orchestrating access to hardware accelerators for development, testing, and production. The INFN-funded project AI_INFN ("Artificial Intelligence at INFN") aims at fostering the adoption of ML techniques within INFN use cases by providing support on multiple aspects, including the provision of AI-tailored computing resources. It leverages cloud-native solutions in the context of INFN Cloud to share hardware accelerators as effectively as possible, ensuring that the diversity of the Institute's research activities is not compromised. In this contribution, we provide an update on the commissioning of a Kubernetes platform designed to ease the development of GPU-powered data analysis workflows and their scalability on heterogeneous, distributed computing resources, possibly federated as Virtual Kubelets with the interLink provider.
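
    As a generic illustration of how a workload on such a Kubernetes platform can request a shared GPU (a sketch using the official kubernetes Python client; the namespace, image, and pod names are assumptions, not AI_INFN specifics):

        # Illustrative only: a pod requesting one GPU through the extended
        # resource "nvidia.com/gpu". Names below are assumptions.
        from kubernetes import client, config

        config.load_kube_config()  # use the current kubectl context

        pod = client.V1Pod(
            metadata=client.V1ObjectMeta(name="ml-dev-gpu"),
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="trainer",
                        image="tensorflow/tensorflow:latest-gpu",  # assumed image
                        command=["python", "train.py"],
                        # One accelerator; the scheduler places the pod on a
                        # node that exposes this resource.
                        resources=client.V1ResourceRequirements(
                            limits={"nvidia.com/gpu": "1"}
                        ),
                    )
                ],
            ),
        )

        client.CoreV1Api().create_namespaced_pod(namespace="ai-infn-dev", body=pod)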

    Proprietà dei layer (vettoriali e raster) - Esercitazione (Layer properties, vector and raster - Exercise)

    Get PDF
    The present exercises (in Italian) accompany the lesson on layer properties, both raster and vector, in QGIS, as part of the annual base-level QGIS course held by ISPRA within its SNPA activities.

    Extending Cosmic Ray Background in Space Experiments using Generative Adversarial Networks

    Get PDF
    Cosmic rays (CR) reaching telescope detectors in outer space are known to induce glitches and background noise. The presence of CR noise significantly influenced Cosmic Microwave Background (CMB) experiments, like Planck and LiteBIRD, which have long exposures and hard shielding or filtering. In order to address this challenge, it is imperative to accurately simulate the CR background throughout the duration of LiteBIRD's three-year mission. However, state-of-the-art Monte Carlo (MC) simulations are extremely computationally expensive, typically requiring 30 times the simulated period. We present the Cosmic Rays Artificial Background (CRAB) code, extending MC simulations with Generative Adversarial Networks (GAN). By leveraging GANs, we can efficiently generate a sufficient number of genuine, statistically independent images, unlike traditional noise analysis techniques combined with template expansion methods.
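
    A minimal sketch of the adversarial setup named above (generic GAN skeleton in PyTorch, not the actual CRAB code; image size and hyperparameters are assumptions):

        # Generic GAN for small detector-noise images; sizes are assumptions.
        import torch
        import torch.nn as nn

        LATENT, IMG = 64, 16 * 16  # latent dim and flattened 16x16 image (assumed)

        generator = nn.Sequential(
            nn.Linear(LATENT, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, IMG), nn.Tanh(),        # fake CR-background image
        )
        discriminator = nn.Sequential(
            nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1), nn.Sigmoid(),       # P(image came from the MC set)
        )

        bce = nn.BCELoss()
        opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

        def train_step(mc_batch: torch.Tensor) -> None:
            """One adversarial update; mc_batch holds flattened MC images."""
            n = mc_batch.size(0)
            real, fake = torch.ones(n, 1), torch.zeros(n, 1)

            # Discriminator: separate MC images from generated ones.
            opt_d.zero_grad()
            gen_imgs = generator(torch.randn(n, LATENT)).detach()
            loss_d = bce(discriminator(mc_batch), real) + \
                     bce(discriminator(gen_imgs), fake)
            loss_d.backward()
            opt_d.step()

            # Generator: fool the discriminator.
            opt_g.zero_grad()
            gen_imgs = generator(torch.randn(n, LATENT))
            loss_g = bce(discriminator(gen_imgs), real)
            loss_g.backward()
            opt_g.step()

    Once trained, arbitrarily many statistically independent samples come from generator(torch.randn(k, LATENT)), which is what makes this cheaper than rerunning the MC simulation.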

    719 full texts · 6,048 metadata records. Updated in the last 30 days.
    Open Access Repository is based in Italy.