    Neurocognitive Mechanisms of Statistical-Sequential Learning: What Do Event-Related Potentials Tell Us?

    Statistical-sequential learning (SL) is the ability to process patterns of environmental stimuli, such as spoken language, music, or one's motor actions, that unfold in time. The underlying neurocognitive mechanisms of SL and the associated cognitive representations are still not well understood, as reflected by the heterogeneity of the reviewed cognitive models. The purpose of this review is: (1) to provide a general overview of the primary models and theories of SL, (2) to describe the empirical research – with a focus on the event-related potential (ERP) literature – in support of these models while also highlighting the current limitations of this research, and (3) to present a set of new lines of ERP research to overcome these limitations. The review is articulated around three descriptive dimensions in relation to SL: the level of abstractness of the representations learned through SL, the effect of the level of attention and consciousness on SL, and the developmental trajectory of SL across the life-span. We conclude with a new tentative model that takes these three dimensions into account and also point to several promising new lines of SL research.
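
    The review itself proposes no algorithm, but the regularities at issue in SL research are commonly operationalized as transitional probabilities between adjacent stimuli, P(B|A) = count(A,B) / count(A). A minimal, hypothetical Python sketch of that computation, using a made-up syllable stream in the style of artificial-language SL experiments (the stream and all names are illustrative, not taken from the review):

        import random
        from collections import Counter

        def transitional_probabilities(stream):
            """Estimate P(next | current) for each adjacent pair in a stimulus stream."""
            pair_counts = Counter(zip(stream, stream[1:]))   # counts of adjacent pairs (A, B)
            first_counts = Counter(stream[:-1])              # how often each element starts a pair
            return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

        # Hypothetical stream built from two made-up "words" presented in random
        # order: within-word transitions are fully predictable, while transitions
        # across word boundaries are not, which is the regularity SL can exploit.
        random.seed(0)
        words = [["tu", "pi", "ro"], ["go", "la", "bu"]]
        stream = [syll for _ in range(200) for syll in random.choice(words)]
        tp = transitional_probabilities(stream)
        print(tp[("tu", "pi")])   # within-word transition: 1.0
        print(tp[("ro", "go")])   # word-boundary transition: roughly 0.5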

    Measurement of Triple-Differential Z+Jet Cross Sections with the CMS Detector at 13 TeV and Modelling of Large-Scale Distributed Computing Systems

    The achievable precision in calculations of predictions for observables measured at the LHC experiments depends on the amount of invested computing power and on the precision of the input parameters that enter the calculation. Currently, no theory exists that can derive the input parameter values for perturbative calculations from first principles. Instead, they have to be derived from dedicated analyses that measure observables sensitive to the input parameters with high precision. Such an analysis is presented: it measures the production cross section of oppositely charged muon pairs with an invariant mass close to the mass of the $\mathrm{Z}$ boson, produced in association with jets, in a phase space divided into bins of the transverse momentum of the dimuon system $p_T^\mathrm{Z}$ and of two observables $y^*$ and $y_b$ constructed from the rapidities of the dimuon system and of the jet with the highest momentum. To achieve the highest statistical precision in this triple-differential measurement, the full data set recorded by the CMS experiment at a center-of-mass energy of $\sqrt{s} = 13\,\mathrm{TeV}$ in the years 2016 to 2018 is combined. The measured cross sections are compared to theoretical predictions approximating full NNLO accuracy in perturbative QCD. Deviations from these predictions are observed, rendering further studies at full NNLO accuracy necessary.

    To obtain the measured results, large amounts of data are processed and analysed on distributed computing infrastructures. Theoretical calculations pose similar computing demands. Consequently, substantial amounts of storage and processing resources are required by the LHC collaborations. These requirements are met in large part by the resources of the WLCG, a complex federation of globally distributed computer centres. With the upgrade of the LHC and its experiments for the HL-LHC era, the computing demands are expected to increase substantially, so the prevailing computing models need to be updated to cope with these unprecedented demands. For the design of future adaptations of HEP workflow executions on such infrastructures, a simulation model is developed and an implementation is tested on infrastructure design candidates inspired by a proposal of the German HEP computing community. The presented study of these infrastructure candidates showcases the applicability of the simulation tool in the strategic development of a future computing infrastructure for HEP in the HL-LHC context.
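
    The abstract leaves $y^*$ and $y_b$ undefined. In comparable triple-differential Z+jet measurements they are conventionally built as the half-sum and half-difference of the boson and leading-jet rapidities; assuming that convention holds here:

        % Conventional definitions assumed from comparable Z+jet analyses;
        % the abstract itself does not define the two observables.
        $$ y_b = \tfrac{1}{2}\,\lvert y^{\mathrm{Z}} + y^{\mathrm{jet}} \rvert,
           \qquad
           y^* = \tfrac{1}{2}\,\lvert y^{\mathrm{Z}} - y^{\mathrm{jet}} \rvert $$

    Under that convention, $y_b$ measures the boost of the Z+jet system along the beam axis and $y^*$ the rapidity separation of the two objects in their centre-of-mass frame.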
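
    The abstract characterizes the simulation model only at a high level. As a rough illustration of the underlying technique (discrete-event simulation of jobs contending for distributed compute and network resources), and not of the thesis's actual tool, a minimal Python sketch using the SimPy library could look as follows; every name and parameter in it (Site, run_job, the core counts and link speeds, the time units) is hypothetical:

        import random
        import simpy  # discrete-event simulation library; assumed available

        class Site:
            """Hypothetical model of one computing site: CPU slots plus a network link."""
            def __init__(self, env, name, cores, link_mb_per_h):
                self.name = name
                self.cores = simpy.Resource(env, capacity=cores)  # CPU slots
                self.link_mb_per_h = link_mb_per_h                # crude bandwidth model

        def run_job(env, job_id, site, input_mb, cpu_hours, log):
            """One HEP-like job: stage the input over the site link, then occupy a core."""
            with site.cores.request() as slot:
                yield slot                                        # wait for a free core
                yield env.timeout(input_mb / site.link_mb_per_h)  # input transfer time
                yield env.timeout(cpu_hours)                      # processing time
                log.append((job_id, site.name, env.now))

        def workload(env, sites, n_jobs, log):
            """Submit jobs at random intervals to the site with the shortest queue."""
            for job_id in range(n_jobs):
                site = min(sites, key=lambda s: len(s.cores.queue))
                env.process(run_job(env, job_id, site,
                                    input_mb=random.uniform(500, 2000),
                                    cpu_hours=random.uniform(1, 4), log=log))
                yield env.timeout(random.expovariate(2.0))        # mean 0.5 h between jobs

        random.seed(42)
        env = simpy.Environment()
        sites = [Site(env, "T1", cores=8, link_mb_per_h=1000),
                 Site(env, "T2", cores=4, link_mb_per_h=100)]
        log = []
        env.process(workload(env, sites, n_jobs=50, log=log))
        env.run()
        print(f"makespan: {max(t for _, _, t in log):.1f} h, jobs completed: {len(log)}")

    In a model of this kind, site capacities, link speeds, and scheduling policies become tunable parameters, which is what makes such simulations useful for comparing candidate infrastructure designs before committing hardware.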