
Exact Bethe Ansatz solution for A_{n-1} chains with non-SU_q(n) invariant open boundary conditions

    The Nested Bethe Ansatz is generalized to open and independent boundary conditions depending on two continuous and two discrete free parameters. This is used to find the exact eigenvectors and eigenvalues of the A_{n-1} vertex models and SU(n) spin chains with such boundary conditions. The solution is found for all diagonal families of solutions to the reflection equations in all possible combinations. The Bethe ansatz equations are used to find the first-order finite-size correction.
    Comment: Two references added

    Strange resonance poles from Kπ scattering below 1.8 GeV

    In this work we present a determination of the mass, width and coupling of the resonances that appear in kaon-pion scattering below 1.8 GeV. These are: the much-debated scalar κ meson, nowadays known as K_0^*(800), the scalar K_0^*(1430), the K^*(892) and K_1^*(1410) vectors, the spin-two K_2^*(1430), as well as the spin-three K_3^*(1780). The parameters will be determined from the pole associated with each resonance by means of an analytic continuation of the Kπ scattering amplitudes obtained in a recent and precise data analysis constrained with dispersion relations, which were not well satisfied in previous analyses. This analytic continuation will be performed by means of Padé approximants, thus avoiding a particular model for the pole parameterization. We also pay particular attention to the evaluation of uncertainties.
    Comment: 13 pages, 12 figures. Accepted version to appear in Eur. Phys. J. C. Clarifications and references added, minor typos corrected
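    The pole-extraction idea in this abstract can be sketched with a minimal example: build a low-order Padé approximant to an amplitude from numerical derivatives on the real axis, and read the pole off the approximant's denominator. The toy single-pole amplitude, the parameter values, and the [1/1] order below are illustrative assumptions, not the paper's actual Kπ amplitudes or its higher-order P^N_1 approximants.

    ```python
    # Hypothetical sketch: pole position from a [1/1] Pade approximant.
    # For t(s) ~ c0 + c1 (s - s0) + c2 (s - s0)^2 + ..., the [1/1] Pade
    # has its pole at s = s0 + c1/c2.

    def pole_from_pade_11(t, s0, h=1e-3):
        """Estimate the nearest pole of t using central finite differences
        for the first two Taylor coefficients at real s0:
          c1 = t'(s0),  c2 = t''(s0)/2  (the factor 1/2 cancels in c1/c2).
        """
        c1 = (t(s0 + h) - t(s0 - h)) / (2 * h)
        c2 = (t(s0 + h) - 2 * t(s0) + t(s0 - h)) / (2 * h * h)
        return s0 + c1 / c2

    # Toy amplitude with one pole at s_p = M^2 - i*M*Gamma (assumed values)
    M, Gamma, g2 = 0.892, 0.047, 1.0
    s_pole = M**2 - 1j * M * Gamma

    def t_toy(s):
        return g2 / (s_pole - s)

    est = pole_from_pade_11(t_toy, s0=0.9)
    print(est)  # close to s_pole = 0.7957 - 0.0419j
    ```

    For a pure single-pole amplitude the [1/1] estimate is exact up to finite-difference error; in practice one would raise the numerator order and check stability of the extracted pole, which is roughly how uncertainty is assessed in such analyses.
    
    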

    Probabilistic Inference from Arbitrary Uncertainty using Mixtures of Factorized Generalized Gaussians

    This paper presents a general and efficient framework for probabilistic inference and learning from arbitrary uncertain information. It exploits the calculation properties of finite mixture models, conjugate families and factorization. Both the joint probability density of the variables and the likelihood function of the (objective or subjective) observation are approximated by a special mixture model, in such a way that any desired conditional distribution can be directly obtained without numerical integration. We have developed an extended version of the expectation maximization (EM) algorithm to estimate the parameters of mixture models from uncertain training examples (indirect observations). As a consequence, any piece of exact or uncertain information about both input and output values is consistently handled in the inference and learning stages. This ability, extremely useful in certain situations, is not found in most alternative methods. The proposed framework is formally justified from standard probabilistic principles and illustrative examples are provided in the fields of nonparametric pattern classification, nonlinear regression and pattern completion. Finally, experiments on a real application and comparative results over standard databases provide empirical evidence of the utility of the method in a wide range of applications.
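    The standard EM baseline that this abstract extends can be illustrated with a minimal 1-D Gaussian-mixture fit on exact (point) observations. This sketch is only the conventional algorithm; the paper's extension to uncertain observations, which replaces the point-evaluated responsibilities with expectations under the observation likelihood, is not reproduced here, and all data and parameter values below are assumptions.

    ```python
    # Minimal EM for a 1-D Gaussian mixture with exact observations.
    import numpy as np

    def em_gmm_1d(x, k=2, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        n = len(x)
        w = np.full(k, 1.0 / k)                    # mixing weights
        mu = rng.choice(x, size=k, replace=False)  # means initialized from data
        var = np.full(k, np.var(x))                # broad common initial variance
        for _ in range(iters):
            # E-step: responsibilities r[i, j] proportional to w_j * N(x_i | mu_j, var_j)
            d = x[:, None] - mu[None, :]
            logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(w)
            logp -= logp.max(axis=1, keepdims=True)   # stabilize before exponentiating
            r = np.exp(logp)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: responsibility-weighted maximum-likelihood updates
            nk = r.sum(axis=0)
            w = nk / n
            mu = (r * x[:, None]).sum(axis=0) / nk
            var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
        return w, mu, var

    # Synthetic two-component data (assumed means -2 and 3)
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
    w, mu, var = em_gmm_1d(x, k=2)
    ```

    In the paper's setting each x_i would itself carry an uncertainty model, so the E-step responsibilities become integrals over the observation distribution rather than point evaluations.
    
    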