
    The optimal assignment kernel is not positive definite

    We prove that the optimal assignment kernel, proposed recently as an attempt to embed labeled graphs and, more generally, tuples of basic data into a Hilbert space, is in fact not always positive definite.

    Approximate kernel clustering

    In the kernel clustering problem we are given a large $n\times n$ positive semi-definite matrix $A=(a_{ij})$ with $\sum_{i,j=1}^n a_{ij}=0$ and a small $k\times k$ positive semi-definite matrix $B=(b_{ij})$. The goal is to find a partition $S_1,\ldots,S_k$ of $\{1,\ldots,n\}$ which maximizes the quantity $\sum_{i,j=1}^k \bigl(\sum_{(p,q)\in S_i\times S_j} a_{pq}\bigr) b_{ij}$. We study the computational complexity of this generic clustering problem, which originates in the theory of machine learning. We design a constant-factor polynomial-time approximation algorithm for this problem, answering a question posed by Song, Smola, Gretton and Borgwardt. In some cases we manage to compute the sharp approximation threshold for this problem assuming the Unique Games Conjecture (UGC). In particular, when $B$ is the $3\times 3$ identity matrix the UGC hardness threshold of this problem is exactly $\frac{16\pi}{27}$. We present and study a geometric conjecture of independent interest which we show would imply that the UGC threshold when $B$ is the $k\times k$ identity matrix is $\frac{8\pi}{9}\bigl(1-\frac{1}{k}\bigr)$ for every $k\ge 3$.

    Estimating Local Function Complexity via Mixture of Gaussian Processes

    Real-world data often exhibit inhomogeneity, e.g., the noise level, the sampling distribution, or the complexity of the target function may change over the input space. In this paper, we try to isolate local function complexity in a practical, robust way. This is achieved by first estimating the locally optimal kernel bandwidth as a functional relationship. Specifically, we propose Spatially Adaptive Bandwidth Estimation in Regression (SABER), which employs a mixture of experts consisting of multinomial kernel logistic regression as a gate and Gaussian process regression models as experts. Using the locally optimal kernel bandwidths, we deduce an estimate of the local function complexity by drawing parallels to the theory of locally linear smoothing. We demonstrate the usefulness of local function complexity for model interpretation and active learning in quantum chemistry experiments and fluid dynamics simulations.