
    On the non-minimal character of the SMEFT

    When integrating out unknown new physics sectors, what is the minimal character of the Standard Model Effective Field Theory (SMEFT) that can result? In this paper we focus on a particular aspect of this question: "How can one obtain only one dimension-six operator in the SMEFT from a consistent tree-level matching onto an unknown new physics sector?" We show why this requires conditions on the ultraviolet field content that do not indicate a stand-alone, ultraviolet-complete scenario. Further, we demonstrate that giving a dynamical origin to the ultraviolet scales, which are assumed to exist in order to generate the masses of the heavy states being integrated out, generically induces more operators. Our analysis therefore indicates that the infrared limit captured from a new sector in a consistent matching quite generically induces multiple operators in the SMEFT. Global data analyses in the SMEFT can and should accommodate this fact.
    Comment: 11pp, 2 fig. V2: PLB version and minor typos corrected.
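    For orientation, the dimension-six SMEFT Lagrangian referred to above has the standard schematic form, with Warsaw-basis operators $Q_i$, Wilson coefficients $C_i$, and a new-physics scale $\Lambda$:

    $$\mathcal{L}_{\rm SMEFT} = \mathcal{L}_{\rm SM} + \sum_i \frac{C_i}{\Lambda^2}\, Q_i + \mathcal{O}\!\left(\frac{1}{\Lambda^3}\right)$$

    The paper's argument is that a tree-level matching producing exactly one non-zero $C_i$, with all others vanishing, requires non-generic assumptions about the ultraviolet field content.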

    Spiking Neural P Systems with Addition/Subtraction Computing on Synapses

    Spiking neural P systems (SN P systems, for short) are a class of distributed and parallel computing models inspired by biological spiking neurons. In this paper, we introduce a variant called SN P systems with addition/subtraction computing on synapses (CSSN P systems). CSSN P systems are inspired and motivated by the shunting inhibition observed at biological synapses, and they incorporate ideas from dynamic graphs and networks. We consider addition and subtraction operations on synapses, and prove that CSSN P systems are computationally universal as number generators under a normal form (i.e. a simplifying set of restrictions).
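    As a rough illustration of the "computing on synapses" idea, the Python sketch below assumes each synapse adds or subtracts a fixed number of spikes in transit. The rule format, neuron names, and synchronous-step semantics are simplifying assumptions for illustration; this is not the formal CSSN P system definition from the paper.

    # Illustrative toy model: synapses that add/subtract spikes in transit.
    # Not the formal CSSN P system definition; semantics are assumed.
    from dataclasses import dataclass

    @dataclass
    class Synapse:
        src: str
        dst: str
        op: str      # '+' or '-'
        amount: int  # spikes added to / removed from the transmitted packet

    def step(spikes, rules, synapses):
        """One synchronous step: every neuron with enough spikes fires once."""
        delta = {n: 0 for n in spikes}
        for neuron, (consume, emit) in rules.items():
            if spikes[neuron] >= consume:
                delta[neuron] -= consume
                for syn in synapses:
                    if syn.src == neuron:
                        sent = emit + syn.amount if syn.op == '+' else emit - syn.amount
                        delta[syn.dst] += max(sent, 0)
        return {n: spikes[n] + delta[n] for n in spikes}

    # Toy configuration: neuron 'a' fires 2 spikes when it holds at least 2;
    # the synapse to 'out' adds 1 spike in transit.
    spikes = {'a': 2, 'out': 0}
    rules = {'a': (2, 2), 'out': (10**9, 0)}  # 'out' never fires
    synapses = [Synapse('a', 'out', '+', 1)]

    print(step(spikes, rules, synapses))  # {'a': 0, 'out': 3}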

    The SMEFTsim package, theory and tools

    We report codes for the Standard Model Effective Field Theory (SMEFT) in FeynRules -- the SMEFTsim package. The codes enable theoretical predictions for dimension-six operator corrections to the Standard Model using numerical tools, where predictions can be made based on either the electroweak input parameter set $\{\hat{\alpha}_{ew}, \hat{m}_Z, \hat{G}_F\}$ or $\{\hat{m}_{W}, \hat{m}_Z, \hat{G}_F\}$. All of the baryon and lepton number conserving operators present in the SMEFT dimension-six Lagrangian, defined in the Warsaw basis, are included. A flavour-symmetric ${\rm U}(3)^5$ version with possible non-SM CP-violating phases, a (linear) minimal flavour violating version neglecting such phases, and the fully general flavour case are each implemented. The SMEFTsim package allows global constraints to be determined on the full Wilson coefficient space of the SMEFT. As the number of parameters present is large, it is important to develop global analyses on reduced sets of parameters, minimizing any UV assumptions and relying on the IR kinematics of scattering events and on symmetries. We simultaneously develop the theoretical framework of a "W-Higgs-Z pole parameter" physics program that can be pursued at the LHC using this approach and the SMEFTsim package. We illustrate this methodology with several numerical examples interfacing SMEFTsim with MadGraph5. The SMEFTsim package can be downloaded at https://feynrules.irmp.ucl.ac.be/wiki/SMEFT
    Comment: Corrected numerics of section 10.5.1, references added, minor changes and corrected typos. Version published in JHEP.
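    At tree level the two input parameter sets determine each other through the standard electroweak relations; the short Python sketch below (illustrative only, not code from the SMEFTsim package) recovers $\hat{m}_W$ from $\{\hat{\alpha}_{ew}, \hat{m}_Z, \hat{G}_F\}$.

    import math

    # Textbook leading-order relation between the two electroweak input sets;
    # not SMEFTsim code.  alpha_ew is taken near the Z pole (an assumption).
    alpha_ew = 1 / 127.9          # \hat{alpha}_{ew}
    m_Z = 91.1876                 # GeV
    G_F = 1.1663787e-5            # GeV^-2

    # sin^2(theta) cos^2(theta) = pi * alpha / (sqrt(2) * G_F * m_Z^2) at tree level
    ratio = math.pi * alpha_ew / (math.sqrt(2) * G_F * m_Z**2)
    cos2_theta = 0.5 * (1 + math.sqrt(1 - 4 * ratio))
    m_W = m_Z * math.sqrt(cos2_theta)

    print(f"tree-level m_W = {m_W:.2f} GeV")  # ~79.8 GeV; loop corrections shift this toward the measured ~80.4 GeV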

    Multidimensional Membership Mixture Models

    We present the multidimensional membership mixture (M3) models, where every dimension of the membership represents an independent mixture model and each data point is generated from the jointly selected mixture components. This is helpful when the data has a certain shared structure. For example, three unique means and three unique variances can effectively form a Gaussian mixture model with nine components, while requiring only six parameters to fully describe it. In this paper, we present three instantiations of M3 models (together with the learning and inference algorithms): infinite, finite, and hybrid, depending on whether the number of mixtures is fixed or not. They are built upon Dirichlet process mixture models, latent Dirichlet allocation, and a combination of the two, respectively. We then consider two applications: topic modeling and learning 3D object arrangements. Our experiments show that our M3 models achieve better performance using fewer topics than many classic topic models. We also observe that topics from the different dimensions of M3 models are meaningful and orthogonal to each other.
    Comment: 9 pages, 7 figures.
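    The Gaussian example above (three means and three variances giving nine effective components from six parameters) can be sketched directly. The Python snippet below is an illustrative two-dimensional membership sampler with fixed weights; it does not reproduce the paper's DP- or LDA-based learning machinery.

    import numpy as np

    # Two membership dimensions: one picks a mean, the other independently
    # picks a standard deviation; the sample comes from the joint choice,
    # so 3 + 3 parameters span an effective 9-component Gaussian mixture.
    rng = np.random.default_rng(0)

    means = np.array([-5.0, 0.0, 5.0])   # dimension 1 parameters
    stds = np.array([0.5, 1.0, 2.0])     # dimension 2 parameters
    w_mean = np.array([0.3, 0.4, 0.3])   # membership weights, dimension 1 (assumed)
    w_std = np.array([0.5, 0.3, 0.2])    # membership weights, dimension 2 (assumed)

    def sample(n):
        i = rng.choice(3, size=n, p=w_mean)   # membership in dimension 1
        j = rng.choice(3, size=n, p=w_std)    # membership in dimension 2
        return rng.normal(means[i], stds[j])

    x = sample(1000)
    print(x.mean(), x.std())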