
    Distributions on partitions, point processes, and the hypergeometric kernel

    We study a three-parameter family of stochastic point processes on the one-dimensional lattice originating from a remarkable family of representations of the infinite symmetric group. We prove that the correlation functions of the processes are given by determinantal formulas with a certain kernel. The kernel can be expressed through the Gauss hypergeometric function; we call it the hypergeometric kernel. In a scaling limit our processes approximate the processes describing the decomposition of the representations mentioned above into irreducibles. As we showed before (see math.RT/9810015), the correlation functions of these limit processes also have a determinantal form with the so-called Whittaker kernel. We show that the scaling limit of the hypergeometric kernel is the Whittaker kernel. The integral operator corresponding to the Whittaker kernel is an integrable operator as defined by Its, Izergin, Korepin, and Slavnov. We argue that the hypergeometric kernel can be considered as a kernel defining a 'discrete integrable operator'. We also show that the hypergeometric kernel degenerates, for certain values of the parameters, to the Christoffel-Darboux kernel for Meixner orthogonal polynomials. This fact is parallel to the degeneration of the Whittaker kernel to the Christoffel-Darboux kernel for Laguerre polynomials. (Comment: AMSTeX, 24 pages)
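
    For orientation, the determinantal property asserted in the abstract means that every n-point correlation function of the process is an n-by-n determinant built from a single kernel. A minimal statement of this standard form (with K standing here for the hypergeometric kernel and the points x_i ranging over the lattice) reads, in LaTeX:

        % standard determinantal form of the n-point correlation functions;
        % K is the kernel of the process (here, the hypergeometric kernel)
        \rho_n(x_1, \dots, x_n) = \det\bigl[\, K(x_i, x_j) \,\bigr]_{i,j=1}^{n}

    The content of the result is then that this single function K admits a closed expression through the Gauss hypergeometric function.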

    Operators for transforming kernels into quasi-local kernels that improve SVM accuracy

    Motivated by the crucial role that locality plays in various learning approaches, we present, in the framework of kernel machines for classification, a novel family of operators on kernels that integrate local information into any kernel, yielding quasi-local kernels. The quasi-local kernels maintain the possibly global properties of the input kernel while increasing the kernel value as points get closer in the feature space of the input kernel, mixing the effect of the input kernel with a kernel that is local in that feature space. Applied to a local kernel, the operators introduce an additional level of locality, equivalent to using a local kernel with a non-stationary kernel width. The operators accept two parameters that regulate the width of the exponential influence of points in the locality-dependent component and the balance between the feature-space local component and the input kernel; we address the choice of these parameters with a data-dependent strategy. Experiments carried out with SVMs, applying the operators to traditional kernel functions on a total of 43 datasets with different characteristics and application domains, achieve very good results supported by statistical significance tests.
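
    As a concrete illustration of the idea (a minimal sketch, not the paper's exact operators), one plausible additive construction recovers the feature-space distance of the base kernel via the kernel trick and mixes in a Gaussian of that distance. The names `quasi_local_gram`, `lam`, and `sigma`, the additive form, and the toy data are all assumptions for illustration; `lam` and `sigma` play the role of the two parameters described above.

        import numpy as np
        from sklearn.svm import SVC

        def quasi_local_gram(k, X, Y=None, sigma=1.0, lam=0.5):
            # Assumed illustrative form of a quasi-local kernel:
            #   k_q(x, y) = k(x, y) + lam * exp(-d_k(x, y)^2 / (2 sigma^2))
            # where the feature-space distance of the base kernel k comes
            # from the kernel trick:
            #   d_k(x, y)^2 = k(x, x) + k(y, y) - 2 k(x, y)
            if Y is None:
                Y = X
            G = k(X, Y)                        # base Gram matrix, shape (n, m)
            kxx = np.diag(k(X, X))[:, None]    # k(x, x) for each row point
            kyy = np.diag(k(Y, Y))[None, :]    # k(y, y) for each column point
            d2 = np.maximum(kxx + kyy - 2.0 * G, 0.0)  # clip numerical negatives
            return G + lam * np.exp(-d2 / (2.0 * sigma ** 2))

        # Toy smoke test: precomputed-kernel SVM on a polynomial base kernel,
        # scored on the training set only.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 4))
        y = (X[:, 0] * X[:, 1] > 0).astype(int)
        poly = lambda A, B: (A @ B.T + 1.0) ** 2
        svm = SVC(kernel="precomputed").fit(quasi_local_gram(poly, X), y)
        print("train accuracy:", svm.score(quasi_local_gram(poly, X), y))

    Since a sum of positive-definite kernels is again a kernel, and the Gaussian of a feature-space distance is itself a valid kernel, this additive mixture is positive definite whenever the base kernel is.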

    Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP)

    We introduce a new structured kernel interpolation (SKI) framework, which generalises and unifies inducing point methods for scalable Gaussian processes (GPs). SKI methods produce kernel approximations for fast computations through kernel interpolation. The SKI framework clarifies how the quality of an inducing point approach depends on the number of inducing (aka interpolation) points, the interpolation strategy, and the GP covariance kernel. SKI also provides a mechanism to create new scalable kernel methods through choosing different kernel interpolation strategies. Using SKI with local cubic kernel interpolation, we introduce KISS-GP, which 1) is more scalable than inducing point alternatives, 2) naturally enables Kronecker and Toeplitz algebra for substantial additional gains in scalability, without requiring any grid data, and 3) can be used for fast and expressive kernel learning. KISS-GP costs O(n) time and storage for GP inference. We evaluate KISS-GP for kernel matrix approximation, kernel learning, and natural sound modelling. (Comment: 19 pages, 4 figures)
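
    To make the interpolation structure concrete, a minimal NumPy sketch below builds the SKI approximation K_xx ≈ W K_uu W^T on a 1-D grid. It uses local linear interpolation weights for brevity, whereas KISS-GP itself uses local cubic interpolation; the function names, the toy RBF kernel, and the grid size are assumptions for illustration.

        import numpy as np

        def rbf(a, b, ell=0.2):
            # Toy squared-exponential kernel on 1-D inputs.
            return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

        def interp_weights(x, grid):
            # Row-sparse interpolation matrix W: each input point is a convex
            # combination of its two neighbouring grid points, so that
            # K_xx is approximated by W @ K_uu @ W.T (the SKI structure).
            n, m = len(x), len(grid)
            h = grid[1] - grid[0]
            idx = np.clip(((x - grid[0]) / h).astype(int), 0, m - 2)
            frac = (x - grid[idx]) / h
            W = np.zeros((n, m))
            W[np.arange(n), idx] = 1.0 - frac
            W[np.arange(n), idx + 1] = frac
            return W

        rng = np.random.default_rng(1)
        x = np.sort(rng.uniform(0.0, 1.0, 200))
        grid = np.linspace(0.0, 1.0, 50)
        W = interp_weights(x, grid)
        K_ski = W @ rbf(grid, grid) @ W.T   # structured approximation of K_xx
        print("max abs error:", np.abs(K_ski - rbf(x, x)).max())

    On the regular grid, the induced matrix rbf(grid, grid) is Toeplitz, which is the structure that enables the fast algebra mentioned in the abstract; W has only a few nonzeros per row, so matrix-vector products with the approximate kernel stay cheap.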