
    The reaction $\pi N \to \pi\pi N$ at threshold in chiral perturbation theory

    In the framework of heavy baryon chiral perturbation theory, we give the chiral expansion for the $\pi N \to \pi\pi N$ threshold amplitudes $D_1$ and $D_2$ to quadratic order in the pion mass. The theoretical results agree within one standard deviation with the empirical values. We also derive a relation between the two threshold amplitudes of the reaction $\pi N \to \pi\pi N$ and the $\pi\pi$ S-wave scattering lengths, $a_0^0$ and $a_0^2$, respectively, to order ${\cal O}(M_\pi^2)$. We show that there are uncertainties, mostly related to resonance excitation, which at present make an accurate determination of the $\pi\pi$ scattering length $a_0^0$ from the $\pi\pi N$ threshold amplitudes very difficult. The situation is different in the $\pi\pi$ isospin-two final state. Here the chiral series converges, and one finds $a_0^2 = -0.031 \pm 0.007$, consistent with the one-loop chiral perturbation theory prediction.
    Comment: 30 pp, LaTeX file, uses epsf, 6 figures (appended), corrections in sections 5 and 6, conclusions unchanged
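The structure of the expansion described above can be written schematically (the coefficient symbols $d_i^{(j)}$ are illustrative placeholders, not values or notation from the paper):

```latex
% Chiral expansion of the threshold amplitudes to quadratic order in the
% pion mass M_\pi (schematic form only; coefficients are placeholders)
D_i = d_i^{(0)} + d_i^{(1)} M_\pi + d_i^{(2)} M_\pi^2 + {\cal O}(M_\pi^3),
\qquad i = 1, 2 .
```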

    Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)

    We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs). SKI methods produce kernel approximations for fast computations through kernel interpolation. The SKI framework clarifies how the quality of an inducing point approach depends on the number of inducing (aka interpolation) points, the interpolation strategy, and the GP covariance kernel. SKI also provides a mechanism to create new scalable kernel methods, through choosing different kernel interpolation strategies. Using SKI, with local cubic kernel interpolation, we introduce KISS-GP, which 1) is more scalable than inducing point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for substantial additional gains in scalability, without requiring any grid data, and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n) time and storage for GP inference. We evaluate KISS-GP for kernel matrix approximation, kernel learning, and natural sound modelling.
    Comment: 19 pages, 4 figures
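The core SKI idea, approximating a Gram matrix as K ≈ W K_{U,U} W^T with a sparse matrix W of interpolation weights onto a grid of inducing points U, can be sketched in a few lines. This is a minimal NumPy illustration under our own assumptions: it uses local linear (rather than cubic) interpolation on a 1D grid, and the function names and grid sizes are illustrative, not from the paper.

```python
import numpy as np

def interp_weights(x, grid):
    """Sparse local linear interpolation weights W, so k(x, .) ~= W @ k(grid, .)."""
    n, m = len(x), len(grid)
    W = np.zeros((n, m))
    h = grid[1] - grid[0]                      # uniform grid spacing
    for i, xi in enumerate(x):
        j = int(np.clip((xi - grid[0]) // h, 0, m - 2))
        t = (xi - grid[j]) / h                 # position within the cell
        W[i, j] = 1 - t
        W[i, j + 1] = t
    return W

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between 1D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# SKI approximation: K_xx ~= W K_uu W^T, where K_uu lives on the grid
x = np.random.default_rng(0).uniform(0, 10, 50)
u = np.linspace(0, 10, 200)                    # inducing / interpolation grid
W = interp_weights(x, u)
K_ski = W @ rbf(u, u) @ W.T
err = np.max(np.abs(K_ski - rbf(x, x)))        # approximation error
```

Because each row of W has only two non-zeros (four, for the cubic interpolation used by KISS-GP), matrix-vector products with the approximate kernel are cheap, and a regular grid makes K_uu Toeplitz.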

    Operators for transforming kernels into quasi-local kernels that improve SVM accuracy

    Motivated by the crucial role that locality plays in various learning approaches, we present, in the framework of kernel machines for classification, a novel family of operators on kernels that integrate local information into any kernel, yielding quasi-local kernels. The quasi-local kernels maintain the possibly global properties of the input kernel while increasing the kernel value as points get closer in the feature space of the input kernel, mixing the effect of the input kernel with a kernel that is local in that feature space. If applied to a local kernel, the operators introduce an additional level of locality, equivalent to using a local kernel with non-stationary kernel width. The operators accept two parameters that regulate the width of the exponential influence of points in the locality-dependent component and the balance between the feature-space local component and the input kernel. We address the choice of these parameters with a data-dependent strategy. Experiments with SVMs, applying the operators to traditional kernel functions on a total of 43 datasets with different characteristics and application domains, achieve very good results supported by statistical significance tests.
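One plausible instance of such an operator can be sketched as follows. This is our own illustrative construction, not the paper's exact definition: it mixes the input Gram matrix with a Gaussian of the distance that the input kernel induces in its feature space, with `sigma` as the width parameter and `beta` as the balance parameter.

```python
import numpy as np

def quasi_local(K, sigma=1.0, beta=1.0):
    """Mix a Gram matrix K with a local component in K's own feature space.

    The feature-space squared distance induced by K is
        d_K(x, y)^2 = K(x, x) + K(y, y) - 2 K(x, y).
    sigma controls the width of the exponential local influence;
    beta balances the local component against the input kernel.
    The result stays positive semidefinite: a Gaussian of a Hilbert-space
    distance is a kernel, and elementwise (Schur) products and sums of
    PSD matrices are PSD.
    """
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2 * K
    return K + beta * K * np.exp(-d2 / (2 * sigma ** 2))

# Usage: start from a global (linear) kernel and add locality
X = np.random.default_rng(1).normal(size=(20, 3))
K = X @ X.T
Kq = quasi_local(K, sigma=2.0, beta=1.0)
```

For nearby points the exponential factor is close to 1 and the kernel value is roughly doubled; for distant points it decays and the original global kernel dominates.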

    The Reaction $\pi N \to \pi\pi N$ at Threshold

    We consider the chiral expansion for the reaction $\pi N \to \pi\pi N$ in heavy baryon chiral perturbation theory. To order $M_\pi$ we derive novel low-energy theorems that compare favorably with recent determinations of the total cross sections for $\pi^+ p \to \pi^+ \pi^+ n$ and $\pi^- p \to \pi^0 \pi^0 n$.
    Comment: 7 pp, LaTeX (uses epsf.sty), 3 figures appended as ps files (split off as ppnf1.ps, ppnf2.ps, ppnf3.ps), CRN 94/1

    Kernel Mean Shrinkage Estimators

    A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is central to kernel methods in that it is used by many classical algorithms such as kernel principal component analysis, and it also forms the core inference step of modern kernel methods that rely on embedding probability distributions in RKHSs. Given a finite sample, the empirical average has commonly been used as the standard estimator of the true kernel mean. Despite the widespread use of this estimator, we show that it can be improved thanks to the well-known Stein phenomenon. We propose a new family of estimators called kernel mean shrinkage estimators (KMSEs), which benefit from both theoretical justifications and good empirical performance. The results demonstrate that the proposed estimators outperform the standard one, especially in a "large d, small n" paradigm.
    Comment: 41 pages
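The shrinkage idea can be made concrete for the simplest case: shrinking the empirical kernel mean toward the zero function. This is a minimal sketch under our own assumptions; the shrinkage intensity `lam` and the kernel choice are illustrative, not the paper's estimator or its data-driven choice of intensity.

```python
import numpy as np

def rbf_gram(A, B, ls=1.0):
    """Gaussian (RBF) Gram matrix between point sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

# Empirical kernel mean: mu_hat(.) = (1/n) sum_i k(x_i, .),
# represented by a weight vector alpha over the sample points.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
n = len(X)
K = rbf_gram(X, X)

lam = 0.1                                # shrinkage intensity (hypothetical)
alpha_emp = np.full(n, 1.0 / n)          # standard estimator
alpha_shr = (1 - lam) * alpha_emp        # shrink toward the zero function

# Squared RKHS norms ||mu||^2 = alpha^T K alpha: shrinkage contracts
# the estimate toward the target, trading a little bias for variance.
norm_emp = alpha_emp @ K @ alpha_emp
norm_shr = alpha_shr @ K @ alpha_shr
```

The bias-variance trade-off behind the Stein phenomenon means a small contraction like this can reduce the RKHS-norm risk of the estimator, particularly when the dimension is large relative to the sample size.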