237 research outputs found

    Shaping Level Sets with Submodular Functions

    Get PDF
    We consider a class of sparsity-inducing regularization terms based on submodular functions. While previous work has focused on non-decreasing functions, we explore symmetric submodular functions and their Lovász extensions. We show that the Lovász extension may be seen as the convex envelope of a function that depends on level sets (i.e., the set of indices whose corresponding components of the underlying predictor are greater than a given constant): this leads to a class of convex structured regularization terms that impose prior knowledge on the level sets, and not only on the supports, of the underlying predictors. We provide a unified set of optimization algorithms, such as proximal operators, and theoretical guarantees (allowed level sets and recovery conditions). By selecting specific submodular functions, we give a new interpretation to known norms, such as the total variation; we also define new norms, in particular ones based on order statistics with applications to clustering and outlier detection, and on noisy cuts in graphs with applications to change-point detection in the presence of outliers.
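    To make the central object concrete, the following is a minimal sketch (my own illustration, not code from the paper) of the Lovász extension evaluated by the standard sorting formula, with a chain-graph cut as the symmetric submodular function. For that choice the extension coincides with the one-dimensional total variation mentioned in the abstract. The function names and the example vector are hypothetical.

```python
import numpy as np

def lovasz_extension(F, w):
    """Lovasz extension of a set function F (callable on an index array),
    evaluated at w via the classic sorting formula."""
    order = np.argsort(-w)               # sort components in decreasing order
    val = 0.0
    prev_F = F(np.array([], dtype=int))  # F(empty set), typically 0
    for k in range(len(w)):
        A_k = order[:k + 1]              # sup-level set: indices of the k+1 largest entries
        F_k = F(A_k)
        val += w[order[k]] * (F_k - prev_F)
        prev_F = F_k
    return val

def chain_cut(A, n):
    """Cut function of the chain graph 1-2-...-n: number of edges leaving A."""
    in_A = np.zeros(n, dtype=bool)
    in_A[np.asarray(A, dtype=int)] = True
    return float(np.sum(in_A[:-1] != in_A[1:]))

w = np.array([0.3, -1.2, 0.7, 0.7, 2.0])
print(lovasz_extension(lambda A: chain_cut(A, n=5), w))  # 4.7
print(np.sum(np.abs(np.diff(w))))                        # same value: 1-D total variation
```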

    An Algorithmic Theory of Dependent Regularizers, Part 1: Submodular Structure

    Full text link
    We present an exploration of the rich theoretical connections between several classes of regularized models, network flows, and recent results in submodular function theory. This work unifies key aspects of these problems under a common theory, leading to novel methods for working with several important models of interest in statistics, machine learning and computer vision. In Part 1, we review the concepts of network flows and submodular function optimization theory foundational to our results. We then examine the connections between network flows and the minimum-norm algorithm from submodular optimization, extending and improving several current results. This leads to a concise representation of the structure of a large class of pairwise regularized models important in machine learning, statistics and computer vision. In Part 2, we describe the full regularization path of a class of penalized regression problems with dependent variables that includes the graph-guided LASSO and total variation constrained models. This description also motivates a practical algorithm that efficiently finds the regularization path of the discretized version of TV-penalized models. Ultimately, our new algorithms scale up to high-dimensional problems with millions of variables.
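    The flow-style duality this abstract alludes to can be seen on the simplest pairwise-regularized model, the proximal operator of the one-dimensional total variation. The sketch below (an illustration under my own assumptions, not the paper's algorithm) solves the box-constrained dual by projected gradient; the dual variables u act like flows on the chain's edges, capped at the regularization level lam.

```python
import numpy as np

def tv1d_prox(y, lam, n_iter=5000):
    """prox of lam * TV(x) = lam * sum_i |x_{i+1} - x_i| applied to y,
    solved through its dual:
        min_u 0.5 * ||y - D^T u||^2   s.t.  ||u||_inf <= lam,
    with the primal solution recovered as x = y - D^T u."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)      # (n-1) x n forward-difference matrix
    u = np.zeros(n - 1)                 # one dual variable ("flow") per edge
    step = 0.25                         # safe step: lambda_max(D D^T) < 4 on a chain
    for _ in range(n_iter):
        grad = D @ (D.T @ u - y)        # gradient of the smooth dual objective
        u = np.clip(u - step * grad, -lam, lam)  # project onto the box (edge capacities)
    return y - D.T @ u

y = np.array([0.0, 0.1, 2.0, 2.1, 0.2])
print(tv1d_prox(y, lam=0.5))            # denoised signal; neighbouring values fuse as lam grows
```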

    Optimizing the Recency-Relevancy Trade-off in Online News Recommendations

    No full text

    Trade and Divergence in Education Systems

    Get PDF
    This paper presents a theory of the endogenous choice of a country's education policy and the two-way causal relationship between trade and education systems. The setting of a country's education system determines its talent distribution and comparative advantage in trade; the possibility of trade, by raising the returns to the sector of comparative advantage, in turn induces countries to further differentiate their education systems and reinforces the initial pattern of comparative advantage. Specifically, the Nash equilibrium choices of education systems by two countries interacting strategically are necessarily more divergent than their autarky choices, although the difference is still less than what is socially optimal for the world. We provide some preliminary empirical evidence on the relationship between education, talent distribution, and trade.
    Keywords: Education System, Talent Distribution, Comparative Advantage, Trade Pattern
    • …