    Intractability of approximate multi-dimensional nonlinear optimization on independence systems

    We consider optimization of nonlinear objective functions that balance d linear criteria over n-element independence systems presented by linear-optimization oracles. For d = 1, we have previously shown that an r-best approximate solution can be found in polynomial time. Here, using an extended Erdős–Ko–Rado theorem of Frankl, we show that for d = 2, finding a ρn-best solution requires exponential time.

    K2-ABC: Approximate Bayesian Computation with Kernel Embeddings

    Complicated generative models often result in a situation where computing the likelihood of observed data is intractable, while simulating from the conditional density given a parameter value is relatively easy. Approximate Bayesian Computation (ABC) is a paradigm that enables simulation-based posterior inference in such cases by measuring the similarity between simulated and observed data in terms of a chosen set of summary statistics. However, there is no general rule for constructing sufficient summary statistics for complex models. Insufficient summary statistics will "leak" information, which leads to ABC algorithms yielding samples from an incorrect (partial) posterior. In this paper, we propose a fully nonparametric ABC paradigm which circumvents the need for manually selecting summary statistics. Our approach, K2-ABC, uses maximum mean discrepancy (MMD) as a dissimilarity measure between the distributions over observed and simulated data. MMD is easily estimated as the squared difference between their empirical kernel embeddings. Experiments on a simulated scenario and a real-world biological problem illustrate the effectiveness of the proposed algorithm.
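    The MMD-based comparison described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a Gaussian kernel, a biased (V-statistic) estimate of squared MMD, and a toy rejection-free weighting scheme with an assumed tolerance ε; all names and parameter values here are illustrative.

    ```python
    import numpy as np

    def gaussian_kernel(x, y, bandwidth=1.0):
        # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)), computed pairwise
        diff = x[:, None, :] - y[None, :, :]
        sq_dist = np.sum(diff ** 2, axis=-1)
        return np.exp(-sq_dist / (2.0 * bandwidth ** 2))

    def mmd2(x, y, bandwidth=1.0):
        # Biased empirical estimate of squared MMD between samples x and y:
        # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y).
        # This equals the squared distance between the empirical kernel
        # embeddings of the two samples.
        kxx = gaussian_kernel(x, x, bandwidth)
        kyy = gaussian_kernel(y, y, bandwidth)
        kxy = gaussian_kernel(x, y, bandwidth)
        return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()

    # Toy K2-ABC-style inference for the mean of a Gaussian (assumed setup):
    # draw parameters from the prior, simulate data, and weight each draw by
    # exp(-MMD^2 / epsilon) instead of comparing summary statistics.
    rng = np.random.default_rng(0)
    y_obs = rng.normal(loc=2.0, size=(200, 1))   # "observed" data, true mean 2
    epsilon = 0.05                                # tolerance (assumed value)

    thetas = rng.uniform(-5.0, 5.0, size=100)     # prior draws
    weights = np.array([
        np.exp(-mmd2(y_obs, rng.normal(loc=th, size=(200, 1))) / epsilon)
        for th in thetas
    ])
    weights /= weights.sum()
    posterior_mean = float(np.sum(weights * thetas))
    ```

    Because the weights depend on the full empirical distributions rather than hand-picked summary statistics, no information is "leaked" through an insufficient summary; the trade-off is the O(n²) kernel evaluation per simulation.
    
    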