    Glueball production in hadron and nucleus collisions

    We elaborate on the hypothesis that in high-energy hadron-hadron and nucleus-nucleus collisions the lowest-mass glueballs are copiously produced from the gluon-rich environment, especially at high energy density. We discuss the particular glueball decay modes $0^{++}, 2^{++} \to K\bar{K}$ and $0^{++} \to \pi^{+}\pi^{-}\ell^{+}\ell^{-}$.
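
    As a quick kinematic illustration of the quoted $K\bar{K}$ decay modes, the sketch below computes the kaon momentum in the glueball rest frame for a two-body decay. The glueball masses are assumed, lattice-style placeholder values, not numbers taken from the paper.

```python
import math

# Two-body decay kinematics for G -> K Kbar in the glueball rest frame.
# The glueball masses below are illustrative placeholder estimates, not
# values from the paper; m_K is the kaon mass in GeV.
M_GLUEBALL = {"0++": 1.65, "2++": 2.35}  # assumed masses (GeV)
M_K = 0.494                              # kaon mass (GeV)

def kaon_momentum(m_glueball: float, m_k: float = M_K) -> float:
    """Momentum of each kaon in the decay G -> K Kbar (GeV)."""
    if m_glueball < 2.0 * m_k:
        raise ValueError("decay below the K Kbar threshold")
    return math.sqrt(m_glueball**2 / 4.0 - m_k**2)

for state, mass in M_GLUEBALL.items():
    print(f"{state}: |p_K| = {kaon_momentum(mass):.3f} GeV")
```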

    Conceptual uncertainties in modelling the interaction between engineered and natural barriers of nuclear waste repositories in crystalline rocks

    Nuclear waste disposal in geological formations relies on a multi-barrier concept that includes engineered components (in many cases a bentonite buffer surrounding the waste packages) and the host rock. Contrasts in materials, together with gradients across the interface between the engineered and natural barriers, lead to complex interactions between these two subsystems. Numerical modelling, combined with monitoring and testing data, can be used to improve our overall understanding of rock–bentonite interactions and to predict the performance of this coupled system. Although established methods exist to examine prediction uncertainties arising from uncertainties in the input parameters, the impact of conceptual model decisions on the quantitative and qualitative modelling results is more difficult to assess. A Swedish Nuclear Fuel and Waste Management Company Task Force project facilitated such an assessment. In this project, 11 teams used different conceptualizations and modelling tools to analyse the Bentonite Rock Interaction Experiment (BRIE) conducted at the Äspö Hard Rock Laboratory in Sweden. The exercise showed that prior system understanding, along with the features implemented in the available simulators, affects the processes included in the conceptual model. For some of these features, sufficient characterization data are available to obtain defensible results and interpretations, whereas others are less well supported. The exercise also helped to identify the conceptual uncertainties that led to different assessments of the relative importance of the engineered and natural barrier subsystems. The range of predicted bentonite wetting times encompassed by the ensemble results was considerably larger than the ranges derived from individual models. This is a consequence of conceptual uncertainties, and it demonstrates the relevance of a multi-model approach involving alternative conceptualizations.
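
    The key quantitative observation, that the ensemble of conceptualizations spans a wider range than any single model's parameter uncertainty, can be shown with a minimal sketch. All numbers below are invented placeholders, not BRIE results.

```python
# Hypothetical per-team predictions of bentonite wetting time (years).
# Each entry is one modelling team's uncertainty range from its own
# parameter variations; the numbers are invented for illustration only.
team_ranges = {
    "team_A": (3.0, 5.5),
    "team_B": (4.2, 6.0),
    "team_C": (1.5, 2.8),
    "team_D": (7.0, 11.0),
}

# Width of each individual model's predicted range.
individual_widths = {t: hi - lo for t, (lo, hi) in team_ranges.items()}

# Range spanned by the multi-model ensemble as a whole.
ensemble_lo = min(lo for lo, _ in team_ranges.values())
ensemble_hi = max(hi for _, hi in team_ranges.values())

print("individual widths:", individual_widths)
print(f"ensemble range: {ensemble_lo}-{ensemble_hi} "
      f"(width {ensemble_hi - ensemble_lo})")
```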

    Control Performance Optimization for Application Integration on Automotive Architectures

    Automotive software implements different functionalities as multiple control applications sharing common platform resources. Although such applications are often developed independently, the control performance of the resulting system depends on how these applications are integrated. A key integration challenge is to efficiently schedule these applications on shared resources with minimal control performance degradation. We formulate this problem as that of scheduling multiple distributed periodic control tasks that communicate via messages with non-zero jitter. The optimization criterion used is a piecewise-linear representation of the control performance degradation as a function of the end-to-end latency of the application. The three main contributions of this article are: 1) a constraint programming (CP) formulation to solve this integration problem optimally on time-triggered architectures; 2) an efficient heuristic called Flexi; and 3) an experimental evaluation of the scalability and efficiency of the proposed approaches. In contrast to the CP formulation, which for many real-life problems might have unacceptably long running times, Flexi returns nearly optimal results (a 0.5 percent loss in control performance relative to the optimum) for most problems, with acceptable running times.
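
    A minimal sketch of the optimization criterion as described: a piecewise-linear degradation curve over end-to-end latency, used to score candidate schedules. All breakpoints, slopes, and latencies below are hypothetical, not values from the article.

```python
# Piecewise-linear control-performance degradation as a function of an
# application's end-to-end latency. Breakpoints are invented examples:
# (latency in ms, degradation in arbitrary cost units).
BREAKPOINTS = [(0.0, 0.0), (10.0, 0.5), (20.0, 2.0), (40.0, 10.0)]

def degradation(latency_ms: float) -> float:
    """Linearly interpolate the degradation curve; clamp beyond the ends."""
    pts = BREAKPOINTS
    if latency_ms <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if latency_ms <= x1:
            return y0 + (y1 - y0) * (latency_ms - x0) / (x1 - x0)
    return pts[-1][1]

# A scheduler can score candidate placements by summing the degradation
# over every control application and keeping the cheapest schedule.
candidate_latencies = {"schedule_1": [8.0, 25.0], "schedule_2": [12.0, 18.0]}
best = min(candidate_latencies,
           key=lambda s: sum(map(degradation, candidate_latencies[s])))
print(best)  # -> schedule_2
```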

    QCD corrections to the t → H⁺b decay within the minimal supersymmetric standard model

    I present the contribution of gluinos and scalar quarks to the decay rate of the top quark into a charged Higgs boson and a bottom quark within the minimal supersymmetric standard model, including the mixing of the scalar partners of the left- and right-handed top quark. I show that for certain values of the supersymmetric parameters the standard QCD loop corrections to this decay mode are diminished or enhanced by several tens of percent. I show that not only a small gluino mass of 3 GeV (the small mass window) but also much larger values of several hundred GeV have a non-negligible effect on this decay rate, contrary to general belief. Last but not least, if the ratio of the vacuum expectation values of the Higgs bosons is taken in the limit $v_1 \ll v_2$, I obtain a drastic enhancement due to the $\tan\beta$ dependence of the couplings.
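
    The $\tan\beta$ enhancement mentioned at the end can be illustrated with the standard two-Higgs-doublet coupling structure of the tbH⁺ vertex. This is a tree-level textbook sketch, not the paper's loop-corrected result, and the quark masses are illustrative.

```python
# tan(beta) dependence of the t -> H+ b vertex in an MSSM-type
# two-Higgs-doublet model: the vertex strength is governed by the
# standard combination m_t^2 cot^2(beta) + m_b^2 tan^2(beta).
# Masses are illustrative values in GeV.
M_T, M_B = 175.0, 4.5

def coupling_strength(tan_beta: float) -> float:
    """Relative tbH+ vertex strength (GeV^2), up to overall constants."""
    return (M_T / tan_beta) ** 2 + (M_B * tan_beta) ** 2

# In the limit v1 << v2, i.e. tan(beta) = v2/v1 large, the m_b*tan(beta)
# term dominates and the coupling, hence the decay rate, grows rapidly.
for tb in (1.0, 5.0, 30.0, 50.0):
    print(f"tan(beta) = {tb:5.1f}: strength ~ {coupling_strength(tb):10.1f}")
```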

    Transverse Momentum Spectra of Pions in Particle and Nuclear Collisions and Some Ratio-Behaviours: Towards A Combinational Approach

    The nature of the transverse-momentum dependence of the inclusive cross-sections for secondary pions produced in high-energy hadronic ($PP$), hadronuclear ($PA$) and nuclear ($AA$) collisions is investigated here exhaustively, for a varied range of interactions and in a unified way, with the help of a master formula. This formula evolves from a new combination of the basic Hagedorn model for particle (pion) production in $PP$ scattering in the ISR energy range, a phenomenological approach proposed by Peitzmann for converting the results of $NN$ ($PP$) reactions to those for either $PA$ or $AA$ collisions, and a specific form of parametrization for the mass-number dependence of the nuclear cross-sections. This grand combination of models (GCM) is then applied to analyse the assorted extensive data on various high-energy collisions. The qualitative agreement between measurements and calculations, both for the inclusive pion production cross-sections and for some of their ratios, is quite satisfactory. The modest successes achieved here in dealing with the massive data sets are encouraging in view of the diversity of the reactions and the very wide range of interaction energies.
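
    A schematic sketch of the kind of combination described: a Hagedorn-type power-law spectrum for $PP$ collisions, $E\,d^3\sigma/dp^3 \propto (1 + p_T/p_0)^{-n}$, scaled to nuclear collisions by a mass-number factor $A^{\alpha}$. The parameter values are assumed placeholders, not the paper's fits, and the $A^{\alpha}$ scaling is a simplified stand-in for the Peitzmann conversion.

```python
# Hagedorn-type parametrization of the invariant pion spectrum in PP
# collisions combined with a simple A^alpha mass-number scaling for
# nuclear collisions. p0, n, and alpha are illustrative placeholders.
P0, N_EXP = 2.0, 12.0   # GeV, dimensionless (assumed)
ALPHA = 1.1             # assumed A-dependence exponent

def pp_invariant_yield(pt: float) -> float:
    """Unnormalized Hagedorn spectrum for PP -> pi + X."""
    return (1.0 + pt / P0) ** (-N_EXP)

def aa_invariant_yield(pt: float, mass_number: int) -> float:
    """Schematic conversion of the PP result to an AA collision via an
    A^alpha scaling (in reality alpha may itself depend on pT)."""
    return (mass_number ** ALPHA) * pp_invariant_yield(pt)

for pt in (0.5, 1.0, 2.0, 4.0):
    print(pt, pp_invariant_yield(pt), aa_invariant_yield(pt, 197))
```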

    Estimating the Energy Consumption of Applications in the Computing Continuum with iFogSim

    Digital services, applications that often span the entire computing continuum, have become an essential part of our daily lives, but they can have a significant energy cost, raising sustainability concerns. The computing continuum features multiple distributed layers (edge, fog, and cloud) with specific computing infrastructure and scheduling decisions at each layer, which impact the overall quality of service and energy consumption of digital services. Measuring the energy consumption of such applications is challenging due to the distributed nature of the system and the application. As such, simulation techniques are promising solutions for estimating energy consumption, and several simulators are available for modelling cloud and fog computing environments. In this paper, we investigate iFogSim's effectiveness in analysing the end-to-end energy consumption of applications in the computing continuum through two case studies. We design different scenarios for each case study to map application modules to devices along the continuum, including the Edge-Cloud collaboration architecture, and compare them with the two placement policies native to iFogSim: the Cloud-only and Edge-ward policies. We observe iFogSim's limitations in reporting energy consumption and improve its ability to report energy consumption from an application's perspective; this enables additional insight into an application's energy consumption, thus enhancing the usability of iFogSim in evaluating the end-to-end energy consumption of digital services.
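
    A minimal sketch of application-centric energy accounting of the kind the paper argues for: attribute each device's busy-time energy to the application modules that ran on it. This is a language-neutral illustration, not iFogSim's actual (Java) API; all class names, device names, and power figures are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Execution:
    module: str      # application module name (hypothetical)
    device: str      # edge / fog / cloud device that ran it
    busy_s: float    # seconds the module kept the device busy

# Assumed busy-power ratings per device, in watts.
BUSY_POWER_W = {"edge-cam": 5.0, "fog-gw": 25.0, "cloud-vm": 120.0}

def app_energy_joules(trace: list[Execution]) -> dict[str, float]:
    """Energy per module: busy time on each device times its busy power."""
    energy: dict[str, float] = {}
    for e in trace:
        energy[e.module] = energy.get(e.module, 0.0) + e.busy_s * BUSY_POWER_W[e.device]
    return energy

trace = [Execution("detect", "edge-cam", 30.0),
         Execution("aggregate", "fog-gw", 12.0),
         Execution("analytics", "cloud-vm", 4.0)]
print(app_energy_joules(trace))  # joules per application module
```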