
    Three-Nucleon Force and the Δ-Mechanism for Pion Production and Pion Absorption

    The description of the three-nucleon system in terms of nucleon and Δ degrees of freedom is extended to allow for explicit pion production (absorption) from single dynamic Δ de-excitation (excitation) processes. This mechanism yields an energy-dependent effective three-body Hamiltonian. The Faddeev equations for the trinucleon bound state are solved with a force model that has already been tested in the two-nucleon system above the pion-production threshold. The binding energy and other bound-state properties are calculated. The contribution to the effective three-nucleon force arising from the pionic degrees of freedom is evaluated. The validity of previous coupled-channel calculations with explicit but stable Δ-isobar components in the wavefunction is studied. Comment: 23 pages in RevTeX 3.0, 9 figures (not included, available as PostScript files upon request), CEBAF-TH-93-0

    Electroweak radiative corrections to single Higgs-boson production in e+e- annihilation

    We have calculated the complete electroweak O(alpha) radiative corrections to the single Higgs-boson production processes e+ e- --> nu_l anti-nu_l H (l=e,mu,tau) in the electroweak Standard Model. Initial-state radiation beyond O(alpha) is included in the structure-function approach. The calculation of the corrections is briefly described, and numerical results are presented for the total cross section. In the G_mu scheme, the bulk of the corrections is due to initial-state radiation, which affects the cross section at the level of -7% at high energies and even more in the ZH threshold region. The remaining bosonic and fermionic corrections are at the level of a few per cent. The confusing situation in the literature regarding differing results for the fermionic corrections to this process is clarified. Comment: 11 pages, LaTeX, 7 PostScript files, some references added, final version to appear in Phys. Lett.

    QCD next-to-leading-order predictions matched to parton showers for vector-like quark models

    Vector-like quarks are featured in a wealth of beyond-the-Standard-Model theories and are consequently an important goal of many LHC searches for new physics. Those searches, as well as most related phenomenological studies, however, rely on predictions evaluated at leading-order accuracy in QCD and consider well-defined simplified benchmark scenarios. Adopting an effective bottom-up approach, we compute next-to-leading-order predictions for vector-like-quark pair production and single production in association with jets, with a weak boson or with a Higgs boson, in a general new physics setup. We additionally compute vector-like-quark contributions to the production of a pair of Standard Model bosons at the same level of accuracy. For all processes under consideration, we focus both on total cross sections and on differential distributions, most of these calculations being performed for the first time in our field. As a result, our work paves the way to a precise extraction of experimental limits on vector-like quarks thanks to an accurate control of the shapes of the relevant observables, and emphasizes the extra handles that could be provided by novel vector-like-quark probes never envisaged so far. Comment: 21 pages, 12 figures, 6 tables; model files available from http://feynrules.irmp.ucl.ac.be/wiki/NLOModels; version accepted by EPJ

    Site-Net: Using global self-attention and real-space supercells to capture long-range interactions in crystal structures

    Site-Net is a transformer architecture that models the periodic crystal structures of inorganic materials as a labelled point set of atoms and relies entirely on global self-attention and geometric information to guide learning. Site-Net processes standard crystallographic information files to generate a large real-space supercell, and the importance of interactions between all atomic sites is flexibly learned by the model for the prediction task presented. The attention mechanism is probed to reveal that Site-Net can learn long-range interactions in crystal structures, and that specific attention heads become specialized to deal with primarily short- or long-range interactions. We perform a preliminary hyperparameter search and train Site-Net using a single graphics processing unit (GPU), and show that Site-Net achieves state-of-the-art performance on a standard band gap regression task. Comment: 23 pages, 13 figures
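
    As a rough illustration of the global self-attention this abstract describes, the sketch below computes scaled dot-product attention over a set of atomic-site feature vectors, so that every site can attend to every other site regardless of distance. The dimensions, random weights, and function name are purely illustrative assumptions, not Site-Net's actual architecture.

    ```python
    import numpy as np

    def self_attention(x, w_q, w_k, w_v):
        """Scaled dot-product self-attention over a set of site features.

        x: (n_sites, d) feature matrix; w_q, w_k, w_v: (d, d) projections.
        Every site attends to every other site, so interactions at any
        range within the supercell can contribute to the output.
        """
        q, k, v = x @ w_q, x @ w_k, x @ w_v
        scores = q @ k.T / np.sqrt(k.shape[-1])          # (n_sites, n_sites)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # softmax over sites
        return weights @ v

    rng = np.random.default_rng(0)
    n_sites, d = 8, 4                                    # toy sizes
    x = rng.normal(size=(n_sites, d))
    out = self_attention(x, *(rng.normal(size=(d, d)) for _ in range(3)))
    print(out.shape)  # (8, 4): one updated feature vector per site
    ```

    A multi-head version would run several such maps in parallel, which is what lets individual heads specialize to short- or long-range interactions.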

    Automatic phased mission system reliability model generation

    There are many methods for modelling the reliability of systems based on component failure data. This task becomes more complex as systems increase in size, or undertake missions that comprise multiple discrete modes of operation, or phases. Existing techniques require certain levels of expertise in the model generation and calculation processes, meaning that risk and reliability assessments of systems can often be expensive and time-consuming. This is exacerbated as system complexity increases. This thesis presents a novel method which generates reliability models for phased-mission systems, based on Petri nets, from simple input files. The process has been automated with a piece of software designed for engineers with little or no experience in the field of risk and reliability. The software can generate models for both repairable and non-repairable systems, allowing redundant components and maintenance cycles to be included in the model. Further, the software includes a simulator for the generated models. This allows a user with simple input files to perform automatic model generation and simulation with a single piece of software, yielding detailed failure data on components, phases, missions and the overall system. A system can also be simulated across multiple consecutive missions. To assess performance, the software is compared with an analytical approach and found to match within 5% in both the repairable and non-repairable cases. The software documented in this thesis could serve as an aid to engineers designing new systems to validate the reliability of the system. This would not require specialist consultants or additional software, ensuring that the analysis provides results in a timely and cost-effective manner.
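
    The simulation-versus-analytical comparison described above can be sketched for a minimal non-repairable phased-mission system: two components with exponential failure times, a first phase that needs both (series logic) and a second phase that needs at least one (parallel logic). The component rates and phase durations are invented for illustration; this is not the thesis's Petri-net model, just the kind of Monte Carlo estimate it validates.

    ```python
    import math
    import random

    def mission_success(lam_a, lam_b, t1, t2, rng):
        """One mission: phase 1 needs A AND B, phase 2 needs A OR B."""
        ta = rng.expovariate(lam_a)          # failure time of component A
        tb = rng.expovariate(lam_b)          # failure time of component B
        if ta <= t1 or tb <= t1:             # series logic in phase 1
            return False
        return ta > t1 + t2 or tb > t1 + t2  # parallel logic in phase 2

    def simulate(n, lam_a=0.1, lam_b=0.2, t1=2.0, t2=3.0, seed=1):
        rng = random.Random(seed)
        hits = sum(mission_success(lam_a, lam_b, t1, t2, rng) for _ in range(n))
        return hits / n

    def analytical(lam_a=0.1, lam_b=0.2, t1=2.0, t2=3.0):
        # Both survive phase 1, and not both fail during phase 2.
        survive_p1 = math.exp(-lam_a * t1) * math.exp(-lam_b * t1)
        both_fail_p2 = (1 - math.exp(-lam_a * t2)) * (1 - math.exp(-lam_b * t2))
        return survive_p1 * (1 - both_fail_p2)

    est, exact = simulate(100_000), analytical()
    print(f"simulated {est:.4f} vs analytical {exact:.4f}")
    ```

    With 100,000 missions the estimate lands well inside the 5% agreement band the thesis reports against its analytical baseline.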

    Estimating Diffractive Higgs Boson Production at LHC from HERA Data

    Using a recently proposed factorization hypothesis for semi-inclusive hard processes in QCD, one can study, in principle, the diffractive production of the Standard Model Higgs boson at the LHC using only, as input, ep diffractive hard-process data of the type recently collected and analyzed by the H1 and ZEUS collaborations at HERA. While waiting for a more precise and complete set of data, we combine here the existing data with a simple Pomeron-exchange picture and find a large spread in the Higgs boson production cross section, depending on the input parametrization of the Pomeron's parton content. In particular, if the Pomeron gluon density f_{g/P}(β) is peaked at large β for small scales, single diffractive events will represent a sizeable fraction of all produced Higgs bosons with an expected better-than-average signal-to-background ratio. Comment: 12 pages (LaTeX); figures are included via epsfig; the corresponding PostScript files are uuencoded. A style file derived from ``elsart.sty'' is included, as well as the ``elsart12.sty'' file. The AMSTeX fonts are required. See http://surya11.cern.ch/users/graudenz/publications.html for a complete PostScript file

    Business Process Risk Management and Simulation Modelling for Digital Audio-Visual Media Preservation.

    Digitised and born-digital Audio-Visual (AV) content presents new challenges for preservation and Quality Assurance (QA) to ensure that cultural heritage is accessible for the long term. Digital archives have developed strategies for avoiding, mitigating and recovering from digital AV loss using IT-based systems, involving QA tools before ingesting files into the archive and utilising file-based replication to repair files that may be damaged while in the archive. However, while existing strategies are effective for addressing issues related to media degradation, issues such as format obsolescence and failures in processes and people pose significant risks to the long-term value of digital AV content. We present a Business Process Risk management framework (BPRisk) designed to support preservation experts in managing risks to long-term digital media preservation. This framework combines workflow and risk specification within a single risk management process designed to support continual improvement of workflows. A semantic model has been developed that allows the framework to incorporate expert knowledge from both preservation and security experts in order to intelligently aid workflow designers in creating and optimising workflows. The framework also provides workflow simulation functionality, allowing users to a) understand the key vulnerabilities in the workflows, b) target investments to address those vulnerabilities, and c) minimise the economic consequences of risks. The application of the BPRisk framework is demonstrated on a use case with the Austrian Broadcasting Corporation (ORF), discussing simulation results and an evaluation against the outcomes of executing the planned workflow.
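
    The kind of workflow simulation described above, used to locate a workflow's key vulnerabilities, can be caricatured with a Monte Carlo pass over a sequence of steps, each with a per-run failure probability. The step names and probabilities here are invented for illustration and bear no relation to ORF's actual preservation workflow or to BPRisk's Petri-net-based internals.

    ```python
    import random

    # Illustrative preservation steps with assumed failure probabilities.
    steps = [("ingest QA", 0.02), ("transcode", 0.05), ("checksum verify", 0.01)]

    def run_workflow(rng):
        """Return the index of the first failing step, or None on success."""
        for i, (_, p_fail) in enumerate(steps):
            if rng.random() < p_fail:
                return i
        return None

    def failure_profile(n=100_000, seed=7):
        rng = random.Random(seed)
        counts = [0] * len(steps)
        ok = 0
        for _ in range(n):
            i = run_workflow(rng)
            if i is None:
                ok += 1
            else:
                counts[i] += 1
        return ok / n, counts

    p_ok, counts = failure_profile()
    print(p_ok, counts)  # success rate near 0.98 * 0.95 * 0.99 ≈ 0.92
    ```

    Ranking `counts` immediately points at the step that dominates losses, which is the sort of signal one would use to target investment at the most vulnerable part of a workflow.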

    BioMet Toolbox: genome-wide analysis of metabolism

    The rapid progress of molecular biology tools for directed genetic modifications, accurate quantitative experimental approaches and high-throughput measurements, together with the development of genome sequencing, has laid the foundation for a new area of metabolic engineering that is driven by metabolic models. Systematic analysis of biological processes by means of modelling and simulations has made possible the identification of metabolic networks and the prediction of metabolic capabilities under different conditions. For facilitating such systemic analysis, we have developed the BioMet Toolbox, a web-based resource for stoichiometric analysis and for integration of transcriptome and interactome data, thereby exploiting the capabilities of genome-scale metabolic models. The BioMet Toolbox provides an effective, user-friendly way to perform linear programming simulations towards maximized or minimized growth rates, substrate uptake rates and metabolic production rates by detecting relevant fluxes, simulate single and double gene deletions, or detect metabolites around which major transcriptional changes are concentrated. These tools can be used for high-throughput in silico screening and allow fully standardized simulations. Model files for various model organisms (fungi and bacteria) are included. Overall, the BioMet Toolbox serves as a valuable resource for exploring the capabilities of these metabolic networks. BioMet Toolbox is freely available at www.sysbio.se/BioMet/
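
    The linear programming simulations mentioned above follow the standard flux balance analysis pattern: maximize a target flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy three-reaction network below is invented for illustration (it is not a BioMet model file), using SciPy's general-purpose `linprog` rather than the toolbox itself.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions):
    # v1: uptake -> A, v2: A -> B, v3: B -> (biomass export).
    S = np.array([
        [1, -1,  0],   # mass balance for metabolite A
        [0,  1, -1],   # mass balance for metabolite B
    ])
    bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 flux units
    c = np.array([0, 0, -1])                  # linprog minimizes, so -v3

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print(res.x)  # optimal flux distribution; export flux hits the uptake cap
    ```

    Gene-deletion simulations reduce to re-solving this program with the affected reaction's bounds pinned to zero, which is why the whole family of analyses can be run in a standardized, high-throughput fashion.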