    On the expected diameter, width, and complexity of a stochastic convex-hull

    We investigate several computational problems related to the stochastic convex hull (SCH). Given a stochastic dataset consisting of n points in R^d, each of which has an existence probability, an SCH refers to the convex hull of a realization of the dataset, i.e., of a random sample that includes each point with its existence probability. We are interested in computing certain expected statistics of an SCH, including its diameter, width, and combinatorial complexity. For the diameter, we establish the first deterministic 1.633-approximation algorithm with time complexity polynomial in both n and d. For the width, two approximation algorithms are provided: a deterministic O(1)-approximation running in O(n^{d+1} log n) time, and a fully polynomial-time randomized approximation scheme (FPRAS). For the combinatorial complexity, we propose an exact O(n^d)-time algorithm. Our solutions exploit many geometric insights in Euclidean space, some of which might be of independent interest.
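
    To make the SCH model concrete, here is a minimal Monte Carlo sketch in Python that estimates the expected diameter by repeatedly sampling realizations. This is not the paper's deterministic 1.633-approximation algorithm, only a naive sampling baseline; all function names and parameters here are illustrative.

        # Naive Monte Carlo estimate of the expected diameter of a
        # stochastic convex hull (illustrative sketch, not the paper's method).
        import numpy as np

        def sample_realization(points, probs, rng):
            """Include each point independently with its existence probability."""
            mask = rng.random(len(points)) < probs
            return points[mask]

        def diameter(pts):
            """Diameter of a point set (equals the diameter of its convex hull)."""
            if len(pts) < 2:
                return 0.0
            diffs = pts[:, None, :] - pts[None, :, :]
            return np.sqrt((diffs ** 2).sum(-1)).max()

        def expected_diameter(points, probs, trials=5000, seed=0):
            rng = np.random.default_rng(seed)
            return np.mean([diameter(sample_realization(points, probs, rng))
                            for _ in range(trials)])

        # Example: 50 random points in the plane with random existence probabilities.
        rng = np.random.default_rng(1)
        pts, pr = rng.standard_normal((50, 2)), rng.random(50)
        print(expected_diameter(pts, pr))

    Note that the diameter of a point set equals the diameter of its convex hull, so the sketch never needs to construct the hull explicitly.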

    Algorithms and hardness results for geometric problems on stochastic datasets

    University of Minnesota Ph.D. dissertation, July 2019. Major: Computer Science. Advisor: Ravi Janardan. 1 computer file (PDF); viii, 121 pages. Traditionally, geometric problems are studied on datasets in which each data object exists with probability 1 at its location in the underlying space. However, in many scenarios, there is uncertainty associated with the existence or the locations of the data points. Such uncertain datasets, called stochastic datasets, are often more realistic, as they are more expressive and can model real data more precisely. For this reason, geometric problems on stochastic datasets have received significant attention in recent years. This thesis studies three sets of geometric problems on stochastic datasets equipped with existential uncertainty. The first set of problems addresses the linear separability of a bichromatic stochastic dataset: how to compute the probability that a realization of such a dataset is linearly separable, and how to compute the expected separation margin of a realization. The second set of problems deals with the stochastic convex hull, i.e., the convex hull of a stochastic dataset, and includes computing expected measures of a stochastic convex hull such as its diameter, width, and combinatorial complexity. The third set of problems considers the dominance relation in a colored stochastic dataset: computing the probability that a realization contains no dominance pair consisting of two different-colored points. New algorithmic and hardness results are provided for all three sets of problems.
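
    As an illustration of the first problem family, the sketch below estimates by Monte Carlo the probability that a realization of a bichromatic stochastic dataset is linearly separable, using a standard feasibility LP for the separability test. The thesis gives exact algorithms; this sampler is only meant to make the problem statement concrete, and its names and parameters are illustrative.

        # Monte Carlo estimate of the linear-separability probability of a
        # bichromatic stochastic dataset (illustrative sketch, not the thesis's
        # exact algorithm).
        import numpy as np
        from scipy.optimize import linprog

        def is_linearly_separable(X, y):
            """Feasibility LP: find (w, b) with y_i * (w . x_i + b) >= 1 for all i."""
            n, d = X.shape
            A = -y[:, None] * np.hstack([X, np.ones((n, 1))])  # rows: -y_i * [x_i, 1]
            res = linprog(c=np.zeros(d + 1), A_ub=A, b_ub=-np.ones(n),
                          bounds=[(None, None)] * (d + 1))
            return res.status == 0                             # 0 = feasible

        def separability_probability(X, y, probs, trials=2000, seed=0):
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(trials):
                mask = rng.random(len(X)) < probs              # sample a realization
                if mask.sum() == 0 or is_linearly_separable(X[mask], y[mask]):
                    hits += 1
            return hits / trials

    Strict linear separability is scale-invariant, so the margin-1 constraints in the LP are feasible exactly when the realization is strictly separable.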

    Conditional Gradient Methods

    The purpose of this survey is to serve both as a gentle introduction to and a coherent overview of state-of-the-art Frank--Wolfe algorithms, also called conditional gradient algorithms, for function minimization. These algorithms are especially useful in convex optimization when linear optimization is cheaper than projection. The selection of the material has been guided by the principle of highlighting crucial ideas as well as presenting new approaches that we believe might become important in the future, with ample citations even of older works that were imperative in the development of newer methods. Still, our selection is sometimes biased, need not reflect the consensus of the research community, and has certainly missed recent important contributions. After all, the research area of Frank--Wolfe is very active, making it a moving target. We apologize sincerely in advance for any such distortions, and we fully acknowledge: we stand on the shoulders of giants. Comment: 238 pages with many figures. The FrankWolfe.jl Julia package (https://github.com/ZIB-IOL/FrankWolfe.jl) provides state-of-the-art implementations of many Frank--Wolfe methods.
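
    The basic iteration the survey builds on fits in a few lines. Below is a minimal sketch of classic Frank--Wolfe minimizing f(x) = 0.5 * ||Ax - b||^2 over the probability simplex, where the linear minimization oracle (LMO) reduces to a coordinate argmin; this is exactly the setting where linear optimization is cheaper than projection. The problem instance is illustrative.

        # Classic Frank--Wolfe on the probability simplex (illustrative sketch).
        import numpy as np

        def frank_wolfe_simplex(A, b, iters=500):
            n = A.shape[1]
            x = np.ones(n) / n                   # start at the simplex barycenter
            for t in range(iters):
                grad = A.T @ (A @ x - b)         # gradient of 0.5 * ||Ax - b||^2
                s = np.zeros(n)
                s[np.argmin(grad)] = 1.0         # LMO: best vertex of the simplex
                gamma = 2.0 / (t + 2.0)          # classic step-size schedule
                x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
            return x

        rng = np.random.default_rng(0)
        A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
        x = frank_wolfe_simplex(A, b)
        print(x.sum(), 0.5 * np.linalg.norm(A @ x - b) ** 2)

    Because each iterate is a convex combination of simplex vertices, feasibility is maintained without any projection step.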

    Estimation in high dimensions: a geometric perspective

    This tutorial provides an exposition of a flexible geometric framework for high-dimensional estimation problems with constraints. The tutorial develops geometric intuition about high-dimensional sets, justifies it with results from asymptotic convex geometry, and demonstrates connections between geometric results and estimation problems. The theory is illustrated with applications to sparse recovery, matrix completion, quantization, linear and logistic regression, and generalized linear models. Comment: 56 pages, 9 figures. Multiple minor changes.
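
    As a concrete instance of one application named above, the sketch below recovers a sparse vector from a few Gaussian measurements using plain ISTA (proximal gradient on the l1-regularized least-squares objective). The tutorial's framework is far more general; this toy instance and all its parameters are illustrative.

        # Sparse recovery from Gaussian measurements via ISTA (illustrative sketch).
        import numpy as np

        def soft_threshold(v, t):
            return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

        def ista(A, y, lam=0.05, iters=2000):
            L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
            x = np.zeros(A.shape[1])
            for _ in range(iters):
                x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
            return x

        rng = np.random.default_rng(0)
        n, m, k = 200, 60, 5                     # ambient dim, measurements, sparsity
        x_true = np.zeros(n)
        x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
        x_hat = ista(A, A @ x_true)
        print(np.linalg.norm(x_hat - x_true))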

    Boosting Variational Inference: an Optimization Perspective

    Variational inference is a popular technique for approximating a possibly intractable Bayesian posterior with a more tractable one. Recently, boosting variational inference has been proposed as a new paradigm that approximates the posterior by a mixture of densities, greedily adding components to the mixture. However, as is the case with many other variational inference algorithms, its theoretical properties have not been studied. In the present work, we study the convergence properties of this approach from a modern optimization viewpoint by establishing connections to the classic Frank-Wolfe algorithm. Our analysis yields novel theoretical insights into sufficient conditions for convergence, explicit rates, and algorithmic simplifications. Since much of the focus in previous work on variational inference has been on tractability, our work is a much-needed attempt to bridge the gap between probabilistic models and their theoretical properties.
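
    The greedy, Frank-Wolfe-style mixture building can be illustrated with a tiny 1-D toy: at each step, pick from a fixed dictionary of Gaussians the component that most reduces KL(q || p) under the classic 2/(t+2) step size. Real boosting variational inference optimizes components continuously rather than over a grid; this discretized toy and all its parameters are illustrative.

        # Greedy Frank-Wolfe-style mixture approximation of a bimodal target
        # density (illustrative 1-D toy, not the paper's algorithm).
        import numpy as np

        xs = np.linspace(-6, 6, 2001)
        dx = xs[1] - xs[0]
        gauss = lambda m, s: np.exp(-0.5 * ((xs - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
        p = 0.5 * gauss(-2, 0.7) + 0.5 * gauss(2, 0.7)   # bimodal target posterior
        dictionary = [gauss(m, s) for m in np.arange(-4, 4.5, 0.5) for s in (0.5, 1.0)]

        kl = lambda q: np.sum(q * np.log((q + 1e-12) / (p + 1e-12))) * dx

        q = gauss(0, 2.0)                                 # crude initial approximation
        for t in range(30):
            gamma = 2.0 / (t + 2.0)                       # Frank-Wolfe step size
            best = min(dictionary, key=lambda g: kl((1 - gamma) * q + gamma * g))
            q = (1 - gamma) * q + gamma * best            # greedily add one component
        print(kl(q))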

    On the Smoothed Complexity of Convex Hulls

    We establish an upper bound on the smoothed complexity of convex hulls in R^d under uniform Euclidean (L^2) noise. Specifically, let {p_1^*, p_2^*, ..., p_n^*} be an arbitrary set of n points in the unit ball in R^d and let p_i = p_i^* + x_i, where x_1, x_2, ..., x_n are chosen independently from the unit ball of radius r. We show that the expected complexity, measured as the number of faces of all dimensions, of the convex hull of {p_1, p_2, ..., p_n} is O(n^{2-4/(d+1)} (1+1/r)^{d-1}); the magnitude r of the noise may vary with n. For d = 2 this bound improves to O(n^{2/3} (1 + r^{-2/3})). We also analyze the expected complexity of the convex hull of L^2 and Gaussian perturbations of a nice sample of a sphere, giving a lower bound on the smoothed complexity. We identify the different regimes in terms of the scale, as a function of n, and show that as the magnitude of the noise increases, the complexity varies monotonically for Gaussian noise but non-monotonically for L^2 noise.
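
    The smoothed model itself is easy to simulate. The sketch below perturbs a base point set in the unit ball by independent uniform noise from a ball of radius r and counts hull facets; it illustrates the model only, not the paper's bounds, and its parameters are illustrative.

        # Simulating smoothed convex-hull complexity under uniform L^2 noise
        # (illustrative sketch of the model, not of the paper's analysis).
        import numpy as np
        from scipy.spatial import ConvexHull

        def unit_ball(n, d, rng, radius=1.0):
            """n points drawn uniformly from the d-ball of the given radius."""
            v = rng.standard_normal((n, d))
            v /= np.linalg.norm(v, axis=1, keepdims=True)
            return radius * v * rng.random((n, 1)) ** (1.0 / d)

        def smoothed_hull_facets(base, r, trials, rng):
            n, d = base.shape
            return np.mean([ConvexHull(base + unit_ball(n, d, rng, r)).simplices.shape[0]
                            for _ in range(trials)])

        rng = np.random.default_rng(0)
        base = unit_ball(200, 3, rng)            # base input in the unit ball
        for r in (0.01, 0.1, 1.0):
            print(r, smoothed_hull_facets(base, r, trials=20, rng=rng))

    ConvexHull reports facets (top-dimensional faces) rather than faces of all dimensions, which suffices to see how the count moves with the noise magnitude r.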