The reaction at threshold in chiral perturbation theory
In the framework of heavy baryon chiral perturbation theory, we give the
chiral expansion for the threshold amplitudes and to
quadratic order in the pion mass. The theoretical results agree within one
standard deviation with the empirical values. We also derive a relation between
the two threshold amplitudes of the reaction and the S--wave scattering lengths, and , respectively, to order
. We show that there are uncertainties mostly related to
resonance excitation which make an accurate determination of the
scattering length from the threshold amplitudes at present
very difficult. The situation is different in the isospin two final
state. Here, the chiral series converges and one finds consistent with the one--loop chiral perturbation theory prediction.Comment: 30 pp, LaTeX file, uses epsf, 6 figures (appended), corrections in
sections 5 and 6, conclusions unchanged
Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)
We introduce a new structured kernel interpolation (SKI) framework, which
generalises and unifies inducing point methods for scalable Gaussian processes
(GPs). SKI methods produce kernel approximations for fast computations through
kernel interpolation. The SKI framework clarifies how the quality of an
inducing point approach depends on the number of inducing (aka interpolation)
points, interpolation strategy, and GP covariance kernel. SKI also provides a
mechanism to create new scalable kernel methods, through choosing different
kernel interpolation strategies. Using SKI, with local cubic kernel
interpolation, we introduce KISS-GP, which is 1) more scalable than inducing
point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for
substantial additional gains in scalability, without requiring any grid data,
and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n)
time and storage for GP inference. We evaluate KISS-GP for kernel matrix
approximation, kernel learning, and natural sound modelling.Comment: 19 pages, 4 figures
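The core SKI idea described above — approximating the kernel matrix as a sparse interpolation of a small inducing-point kernel, K ≈ W K_UU Wᵀ — can be sketched in a few lines. This is an illustrative minimal version using linear rather than the paper's local cubic interpolation, with made-up helper names (`rbf`, `interp_weights`) and a 1-D regular grid:

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def interp_weights(x, grid):
    # Sparse interpolation weights W, one row per data point, rows sum to 1.
    # SKI uses local cubic interpolation; linear keeps this sketch short.
    W = np.zeros((len(x), len(grid)))
    h = grid[1] - grid[0]
    for i, xi in enumerate(x):
        j = min(int((xi - grid[0]) / h), len(grid) - 2)
        t = (xi - grid[j]) / h
        W[i, j] = 1.0 - t
        W[i, j + 1] = t
    return W

# n data points, m << n inducing points on a regular grid
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
grid = np.linspace(0.0, 1.0, 50)

K_exact = rbf(x, x)
W = interp_weights(x, grid)          # sparse in a real implementation
K_ski = W @ rbf(grid, grid) @ W.T    # K approx. = W K_UU W^T

err = np.max(np.abs(K_exact - K_ski))
```

Because W is sparse and K_UU lives on a regular grid, matrix-vector products with K_ski can exploit Toeplitz/Kronecker structure; that structure, not this dense toy computation, is what yields the O(n) costs quoted in the abstract.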
Operators for transforming kernels into quasi-local kernels that improve SVM accuracy
Motivated by the crucial role that locality plays in various learning approaches, we present, in the framework of kernel machines for classification, a novel family of operators on kernels able to integrate local information into any kernel, obtaining quasi-local kernels. The quasi-local kernels maintain the possibly global properties of the input kernel, and they increase the kernel value as the points get closer in the feature space of the input kernel, mixing the effect of the input kernel with a kernel which is local in the feature space of the input one. If applied to a local kernel, the operators introduce an additional level of locality equivalent to using a local kernel with non-stationary kernel width. The operators accept two parameters that regulate the width of the exponential influence of points in the locality-dependent component and the balancing between the feature-space local component and the input kernel. We address the choice of these parameters with a data-dependent strategy. Experiments carried out with SVM, applying the operators to traditional kernel functions on a total of 43 datasets with different characteristics and application domains, achieve very good results supported by statistical significance
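One plausible reading of the operator described above — adding to the input kernel an exponential component that is local in the input kernel's own feature space — can be sketched at the Gram-matrix level. The parameter names `alpha` (balancing) and `sigma` (locality width) are illustrative, not the paper's notation:

```python
import numpy as np

def quasi_local(K, alpha=1.0, sigma=1.0):
    """Add a feature-space-local exponential component to Gram matrix K.

    Uses the identity d^2(x, y) = K(x, x) - 2 K(x, y) + K(y, y) for the
    squared distance in the feature space induced by K, so the new
    component grows as points get closer in that feature space.
    """
    diag = np.diag(K)
    d2 = diag[:, None] - 2.0 * K + diag[None, :]   # >= 0 for a valid kernel
    return K + alpha * np.exp(-d2 / (2.0 * sigma ** 2))

# Usage: start from a global (linear) kernel on toy data
X = np.random.default_rng(0).normal(size=(5, 3))
K_lin = X @ X.T
K_q = quasi_local(K_lin, alpha=0.5, sigma=2.0)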
The Reaction at Threshold
We consider the chiral expansion for the reaction in
heavy baryon chiral perturbation theory. To order we derive novel
low--energy theorems that compare favorably with recent determinations of the
total cross sections for and .Comment: 7 pp, LaTeX (uses epsf.sty), 3 figures appended as ps files (split
off as ppnf1.ps,ppnf2.ps,ppnf3.ps), CRN 94/1
Kernel Mean Shrinkage Estimators
A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel
mean, is central to kernel methods in that it is used by many classical
algorithms such as kernel principal component analysis, and it also forms the
core inference step of modern kernel methods that rely on embedding probability
distributions in RKHSs. Given a finite sample, an empirical average has been
used commonly as a standard estimator of the true kernel mean. Despite a
widespread use of this estimator, we show that it can be improved thanks to the
well-known Stein phenomenon. We propose a new family of estimators called
kernel mean shrinkage estimators (KMSEs), which benefit from both theoretical
justifications and good empirical performance. The results demonstrate that the
proposed estimators outperform the standard one, especially in a "large d,
small n" paradigm.Comment: 41 pages
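The contrast between the standard empirical kernel mean and a shrinkage estimator can be illustrated in weight space. The sketch below shows one simple member of a shrinkage family — scaling the empirical embedding toward the zero function by a factor `lam` — rather than the paper's full construction; `lam` and the helper names are illustrative assumptions:

```python
import numpy as np

def rbf_gram(X, ls=1.0):
    # Gaussian RBF Gram matrix for rows of X.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * ls ** 2))

def kernel_mean_weights(n, lam=0.0):
    """Weights alpha with mu_hat = sum_i alpha_i k(x_i, .).

    lam = 0 gives the standard empirical estimator; lam in (0, 1)
    shrinks the embedding toward the zero function, trading a little
    bias for lower variance in the spirit of Stein shrinkage.
    """
    return (1.0 - lam) * np.full(n, 1.0 / n)

X = np.random.default_rng(0).normal(size=(10, 4))
K = rbf_gram(X)
a0 = kernel_mean_weights(len(X))            # standard empirical mean
a1 = kernel_mean_weights(len(X), lam=0.2)   # shrinkage estimator
# Squared RKHS norm of an embedding with weights a is a^T K a.
norm0 = a0 @ K @ a0
norm1 = a1 @ K @ a1
```

The shrunk embedding has a strictly smaller RKHS norm; choosing the shrinkage amount in a data-dependent way is where the actual estimators in the paper do their work.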
