
    Intrinsic Approximation on Cantor-like Sets, a Problem of Mahler

    In 1984, Kurt Mahler posed the following fundamental question: how well can irrationals in the Cantor set be approximated by rationals in the Cantor set? Towards the development of such a theory, we prove a Dirichlet-type theorem for intrinsic Diophantine approximation on Cantor-like sets, and discuss related possible theorems and conjectures. The resulting approximation function is analogous to that for R^d, but with d being the Hausdorff dimension of the set, and with logarithmic dependence on the denominator instead.
    Comment: 7 pages, 0 figures
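
    For reference, the R^d approximation function the abstract alludes to comes from the classical simultaneous Dirichlet theorem, a standard fact sketched below; the intrinsic analogue proved in the paper replaces d by the Hausdorff dimension of the Cantor-like set and carries an extra logarithmic dependence on the denominator, and its precise form is left to the paper rather than guessed at here.

```latex
% Classical simultaneous Dirichlet theorem on R^d (the baseline the
% abstract compares against): for every x in R^d and every Q > 1 there
% exist p in Z^d and an integer q with 1 <= q <= Q such that
\[
  \max_{1 \le i \le d} \left| x_i - \frac{p_i}{q} \right|
  \;\le\; \frac{1}{q\, Q^{1/d}},
\]
% and consequently, for every x outside Q^d, infinitely many p/q satisfy
\[
  \max_{1 \le i \le d} \left| x_i - \frac{p_i}{q} \right|
  \;\le\; \frac{1}{q^{\,1 + 1/d}} .
\]
% The intrinsic analogue in the paper keeps this shape, but with d equal
% to the Hausdorff dimension of the Cantor-like set and an additional
% logarithmic factor in the denominator; see the paper for the exact
% statement.
```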

    PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference

    Generalized linear models (GLMs) -- such as logistic regression, Poisson regression, and robust regression -- provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy -- including on an advertising data set with 40 million data points and 20,000 covariates.
    Comment: In Proceedings of the 31st Annual Conference on Neural Information Processing Systems (NIPS 2017). v3: corrected typos in Appendix
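
    For the quadratic instance mentioned in the abstract, the idea reduces, in logistic regression, to fitting a degree-2 polynomial to the log-logistic likelihood and then collecting two data moments in a single pass. The sketch below is only an illustration under assumed details -- a least-squares polynomial fit standing in for the paper's Chebyshev approximation, a spherical Gaussian prior, and the hypothetical function name pass_glm_quadratic_map -- not the authors' implementation.

```python
import numpy as np

def pass_glm_quadratic_map(X, y, radius=4.0, prior_var=100.0):
    """Illustrative sketch of a degree-2 PASS-GLM for logistic regression.

    Labels y are in {-1, +1}. The per-point log-likelihood is
    phi(s) = -log(1 + exp(-s)) with s = y_n * x_n @ theta. Replacing phi
    by a quadratic b0 + b1*s + b2*s^2 makes the data enter only through
    two moments -- the polynomial approximate sufficient statistics --
    which a single streaming pass can accumulate.
    """
    # Degree-2 polynomial fit of phi on [-radius, radius].
    # (The paper uses a Chebyshev approximation; a least-squares fit on a
    #  grid stands in for it in this sketch.)
    s = np.linspace(-radius, radius, 2001)
    b2, b1, b0 = np.polyfit(s, -np.log1p(np.exp(-s)), deg=2)

    # One pass over the data: accumulate the approximate sufficient statistics.
    Z = y[:, None] * X          # row n is y_n * x_n
    t1 = Z.sum(axis=0)          # sum_n y_n x_n                 shape (d,)
    T2 = Z.T @ Z                # sum_n (y_n x_n)(y_n x_n)^T    shape (d, d)

    # With a N(0, prior_var * I) prior, the approximate log posterior
    #   b1 * t1 @ theta + b2 * theta @ T2 @ theta - theta @ theta / (2 * prior_var)
    # is quadratic in theta, so the approximate MAP solves a linear system
    # (b2 < 0 for this concave likelihood, so the system matrix is positive definite).
    d = X.shape[1]
    A = -2.0 * b2 * T2 + np.eye(d) / prior_var
    theta_map = np.linalg.solve(A, b1 * t1)
    return theta_map, (t1, T2)
```

    Because t1 and T2 are plain sums, partial statistics computed on separate data shards can simply be added, which is what makes the streaming and distributed extensions described in the abstract immediate.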