Exact Bethe Ansatz solution for chains with non-invariant open boundary conditions
The Nested Bethe Ansatz is generalized to open and independent boundary
conditions depending on two continuous and two discrete free parameters. This
is used to find the exact eigenvectors and eigenvalues of the vertex
models and spin chains with such boundary conditions. The solution is
found for all diagonal families of solutions to the reflection equations in all
possible combinations. The Bethe ansatz equations are used to find the first
order finite-size correction.
Comment: Two references added
Strange resonance poles from kaon-pion scattering below 1.8 GeV
In this work we present a determination of the mass, width and coupling of
the resonances that appear in kaon-pion scattering below 1.8 GeV. These are:
the much debated scalar $\kappa$-meson, nowadays known as $K_0^*(800)$, the
scalar $K_0^*(1430)$, the $K^*(892)$ and $K^*(1410)$ vectors, the spin-two
$K_2^*(1430)$ as well as the spin-three $K_3^*(1780)$. The parameters will be
determined from the pole associated to each resonance by means of an analytic
continuation of the scattering amplitudes obtained in a recent and
precise data analysis constrained with dispersion relations, which were not
well satisfied in previous analyses. This analytic continuation will be
performed by means of Pad\'e approximants, thus avoiding a particular model for
the pole parameterization. We also pay particular attention to the evaluation
of uncertainties.
Comment: 13 pages, 12 figures. Accepted version to appear in Eur. Phys. J. C.
Clarifications and references added, minor typos corrected
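The model-independent pole extraction described above can be illustrated with a minimal sketch. For a function with a single nearby simple pole, the sequence of [N/1] Pad\'e approximants built from its Taylor coefficients $a_k$ estimates the pole position as $z_p \approx a_N/a_{N+1}$. The function and names below are illustrative, not from the paper, and the sketch assumes the Taylor coefficients are known exactly:

```python
# Sketch: pole estimate from a [N/1] Pade approximant.
# Assumes a single simple pole dominates near the expansion point.

def pade_pole(coeffs, N):
    """Pole estimate z_p = a_N / a_{N+1} from the [N/1] Pade approximant."""
    return coeffs[N] / coeffs[N + 1]

# Test function with a known pole at z = 2: f(z) = 1/(2 - z),
# whose Taylor coefficients about z = 0 are a_k = 2**-(k+1).
a = [2.0 ** -(k + 1) for k in range(10)]
print(pade_pole(a, 4))  # recovers the pole at z = 2 exactly in this case
```

In a realistic amplitude analysis the coefficients carry uncertainties and the convergence of the [N/1] sequence itself is monitored, but the key point survives: no specific pole parameterization (e.g. Breit-Wigner) is imposed.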
Probabilistic Inference from Arbitrary Uncertainty using Mixtures of Factorized Generalized Gaussians
This paper presents a general and efficient framework for probabilistic
inference and learning from arbitrary uncertain information. It exploits the
calculation properties of finite mixture models, conjugate families and
factorization. Both the joint probability density of the variables and the
likelihood function of the (objective or subjective) observation are
approximated by a special mixture model, in such a way that any desired
conditional distribution can be directly obtained without numerical
integration. We have developed an extended version of the expectation
maximization (EM) algorithm to estimate the parameters of mixture models from
uncertain training examples (indirect observations). As a consequence, any
piece of exact or uncertain information about both input and output values is
consistently handled in the inference and learning stages. This ability,
extremely useful in certain situations, is not found in most alternative
methods. The proposed framework is formally justified from standard
probabilistic principles and illustrative examples are provided in the fields
of nonparametric pattern classification, nonlinear regression and pattern
completion. Finally, experiments on a real application and comparative results
over standard databases provide empirical evidence of the utility of the method
in a wide range of applications.
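For context, the standard EM algorithm that the paper extends can be sketched for the simplest case: a 1-D two-component Gaussian mixture fitted to exact observations. The paper's contribution is to generalize the E and M steps below to uncertain (indirect) observations; that extension is not shown here, and all names are illustrative rather than taken from the paper:

```python
import math
import random

def gauss(x, mu, var):
    """Univariate normal density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, iters=50):
    """Baseline EM for a two-component 1-D Gaussian mixture (exact data)."""
    w = [0.5, 0.5]                      # mixing weights
    mu = [min(data), max(data)]         # crude mean initialization
    var = [1.0, 1.0]
    for _ in range(iters):
        # E step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * gauss(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-9
    return w, mu, var

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(5.0, 1.0) for _ in range(200)])
w, mu, var = em_gmm(data)
print(sorted(mu))  # recovered means, near the true values 0 and 5
```

In the uncertain-data setting of the paper, each training example would enter the E step as a likelihood function rather than a point, so the responsibilities become integrals that the mixture/conjugate-family structure keeps in closed form.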