12,389 research outputs found

    From sine-Gordon to vacuumless systems in flat and curved spacetimes

    In this work we start from the Higgs prototype model to introduce a new model, which makes a smooth transition between systems with well-located minima and systems that support no minima at all. We implement this possibility using the deformation procedure, which allows us to obtain a sine-Gordon-like model, controlled by a real parameter that gives rise to a family of models reproducing the sine-Gordon and the so-called vacuumless models. We also study the thick brane scenarios associated with these models and investigate their stability and renormalization group flow. In particular, we show how gravity can change from the 5-dimensional warped geometry with a single extra dimension of infinite extent to the conventional 5-dimensional Minkowski geometry.
    Comment: 11 pages, 12 figures. Version to appear in EPJ
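
    As a brief background sketch (standard material, not the specific one-parameter family constructed in the paper): in the deformation procedure, a deformation function \phi = f(\chi) maps a model with potential V(\phi) into a deformed model with potential

        \widetilde V(\chi) = \frac{V(\phi = f(\chi))}{[f'(\chi)]^{2}} .

    For instance, starting from the Higgs prototype V(\phi) = \tfrac{1}{2}(1-\phi^{2})^{2}, the choice f(\chi) = \sin\chi yields the sine-Gordon-like potential \widetilde V(\chi) = \tfrac{1}{2}\cos^{2}\chi.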

    Production of exotic charmonium in γγ interactions at hadronic colliders

    In this paper we investigate exotic charmonium (EC) production in γγ interactions in proton-proton, proton-nucleus and nucleus-nucleus collisions at CERN Large Hadron Collider (LHC) energies, as well as at the proposed energies of the Future Circular Collider (FCC). Our results demonstrate that the experimental study of these processes is feasible and can be used to constrain the theoretical decay widths and shed some light on the configuration of the considered multiquark states.
    Comment: 7 pages, 2 figures, 3 tables. v2: Revised version published in Physical Review
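
    As background for how such measurements constrain the two-photon widths (a standard equivalent-photon sketch, not the paper's specific inputs): in an ultraperipheral collision the exclusive cross section factorizes into the photon fluxes of the incoming hadrons and the elementary two-photon cross section,

        \sigma(h_1 h_2 \to h_1 \, X \, h_2) = \int d\omega_1 \, d\omega_2 \; n_1(\omega_1)\, n_2(\omega_2)\, \hat\sigma_{\gamma\gamma \to X}(W_{\gamma\gamma}), \qquad W_{\gamma\gamma}^{2} = 4\,\omega_1 \omega_2 ,

    and for a narrow resonance of spin J and mass M_X,

        \hat\sigma_{\gamma\gamma \to X}(W) = 8\pi^{2}\,(2J+1)\, \frac{\Gamma_{X\to\gamma\gamma}}{M_X}\, \delta(W^{2} - M_X^{2}) ,

    so the measured rate is directly proportional to the two-photon decay width \Gamma_{X\to\gamma\gamma}.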

    Use of Machine Learning Models of the "Transformers" Type in the Construction of Services in a Gamified Web App

    The purpose of this document is to describe the use of a natural language processing model in the multiplatform system "Gamivity", by means of a sentence-similarity algorithm, to offer a personalized-experience module based on the conceptual relationship between questions. For the selection process, criteria were established that allowed several pre-trained models under the "Transformers" architecture to be chosen for later evaluation. These criteria were the language with which the model was trained and that Python be the programming language used for the implementation. For the evaluation phase of the selected models, the "Sentence Transformers" library of the Python programming language was used, and a working environment analogous to the module present in the "Gamivity" system was built on the "Google Colab" development platform to test these models. The criteria for choosing the candidate model were its effectiveness in relating questions and the computational cost of the operations involved in that process. Based on the applied methodology, the model that yielded the best results was "paraphrase-multilingual-MiniLM-L12-v2", trained on a large text corpus in Spanish and 50 other languages, which showed an optimal degree of precision when conceptually relating the provided questions, together with a relatively low computational cost when performing these operations.
    Keywords: sentence transformers, sentence similarity, relate questions, personalized learning.
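
    A minimal sketch of the sentence-similarity step described above, assuming the open-source "Sentence Transformers" library and the model named in the abstract; the example questions and the pairwise comparison are illustrative placeholders, not taken from the "Gamivity" system itself:

        # Minimal sentence-similarity sketch with the model named above.
        # The questions below are illustrative placeholders.
        from sentence_transformers import SentenceTransformer, util

        # Multilingual model reported as the best-performing candidate.
        model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

        questions = [
            "¿Qué es la fotosíntesis?",
            "¿Cómo producen energía las plantas?",
            "¿Cuál es la capital de Francia?",
        ]

        # Encode the questions as dense vectors and compare them pairwise with
        # cosine similarity; higher scores mean a closer conceptual relation.
        embeddings = model.encode(questions, convert_to_tensor=True)
        scores = util.cos_sim(embeddings, embeddings)

        for i in range(len(questions)):
            for j in range(i + 1, len(questions)):
                print(f"{questions[i]} <-> {questions[j]}: {scores[i][j].item():.3f}")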

    First order formalism for thick branes in modified gravity with Lagrange multiplier

    This work discusses the construction of braneworld solutions in modified gravity with Lagrange multipliers. We examine the general aspects of the model and present a first order formalism that helps us find analytic solutions of the equations of motion. We also investigate some explicit models, analyse the linear stability of the metric and comment on how to relate models investigated in other works to the ones examined in the present study.
    Comment: 7 pages, 9 figures, it matches the published version
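
    For reference, the standard first-order (superpotential) formalism for thick branes in Einstein gravity, given here only as a background sketch and not as the modified-gravity formalism constructed in the paper (numerical factors and sign conventions depend on the normalization of the gravitational coupling): with the warped metric

        ds^{2} = e^{2A(y)}\, \eta_{\mu\nu}\, dx^{\mu} dx^{\nu} - dy^{2}

    and a superpotential W(\phi), the second-order equations of motion are solved by

        \phi' = \tfrac{1}{2} W_{\phi}, \qquad A' = -\tfrac{1}{3} W, \qquad V(\phi) = \tfrac{1}{8} W_{\phi}^{2} - \tfrac{1}{3} W^{2} .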