
    Convex Integer Optimization by Constantly Many Linear Counterparts

    In this article we study convex integer maximization problems with composite objective functions of the form $f(Wx)$, where $f$ is a convex function on $\mathbb{R}^d$ and $W$ is a $d\times n$ matrix with small or binary entries, over finite sets $S\subset\mathbb{Z}^n$ of integer points presented by an oracle or by linear inequalities. Continuing the line of research advanced by Uri Rothblum and his colleagues on edge-directions, we introduce here the notion of 'edge complexity' of $S$, and use it to establish polynomial and constant upper bounds on the number of vertices of the projection $\mathrm{conv}(WS)$ and on the number of linear optimization counterparts needed to solve the above convex problem. Two typical consequences are the following. First, for any $d$, there is a constant $m(d)$ such that the maximum number of vertices of the projection of any matroid $S\subset\{0,1\}^n$ by any binary $d\times n$ matrix $W$ is $m(d)$, regardless of $n$ and $S$; and the convex matroid problem reduces to $m(d)$ greedily solvable linear counterparts. In particular, $m(2)=8$. Second, for any $d,l,m$, there is a constant $t(d;l,m)$ such that the maximum number of vertices of the projection of any three-index $l\times m\times n$ transportation polytope, for any $n$, by any binary $d\times(l\times m\times n)$ matrix $W$ is $t(d;l,m)$; and the convex three-index transportation problem reduces to $t(d;l,m)$ linear counterparts solvable in polynomial time.
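    To make the reduction concrete, here is a minimal sketch (not the paper's method) of how maximizing a convex $f(Wx)$ over $S$ decomposes into linear counterparts: every vertex of $\mathrm{conv}(WS)$ is the image of an optimizer of some linear objective $c = g^T W$ over $S$, so it suffices to solve one linear problem per direction $g$ and keep the best $f$-value. The direction sweep below (8 directions for $d=2$) and the names solve_by_linear_counterparts and linear_opt are illustrative assumptions; the paper's contribution is that a constant number $m(d)$ of suitably chosen directions suffices.

    ```python
    import math

    def solve_by_linear_counterparts(f, W, linear_opt, num_directions=8):
        """Maximize convex f(Wx) over S via linear counterparts.
        f: convex function on R^2; W: 2 x n matrix (list of rows);
        linear_opt: oracle returning an argmax of a linear objective c over S."""
        best_val, best_x = -math.inf, None
        for k in range(num_directions):
            theta = 2 * math.pi * k / num_directions
            g = (math.cos(theta), math.sin(theta))
            # Pull the direction back to x-space: c = g^T W.
            c = [g[0] * W[0][j] + g[1] * W[1][j] for j in range(len(W[0]))]
            x = linear_opt(c)                      # one linear counterpart
            y = [sum(Wi[j] * x[j] for j in range(len(x))) for Wi in W]
            if f(y) > best_val:                    # convex f is maximized at
                best_val, best_x = f(y), x         # a vertex of conv(WS)
        return best_x, best_val

    # Toy usage: S listed explicitly, so the oracle is brute force.
    S = [(0, 0), (1, 0), (0, 1), (1, 1)]
    linear_opt = lambda c: max(S, key=lambda x: sum(ci * xi for ci, xi in zip(c, x)))
    print(solve_by_linear_counterparts(lambda y: y[0] ** 2 + y[1] ** 2,
                                       [[1, 0], [0, 1]], linear_opt))
    ```

    For a matroid $S$, linear_opt would be the greedy algorithm, matching the "greedily solvable linear counterparts" above.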

    Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions

    We investigate three related and important problems connected to machine learning: approximating a submodular function everywhere, learning a submodular function (in a PAC-like setting [53]), and constrained minimization of submodular functions. We show that the complexity of all three problems depends on the 'curvature' of the submodular function, and provide lower and upper bounds that refine and improve previous results [3, 16, 18, 52]. Our proof techniques are fairly generic: we either use a black-box transformation of the function (for approximation and learning) or a transformation of algorithms to use an appropriate surrogate function (for minimization). Curiously, curvature has been known to influence approximations for submodular maximization [7, 55], but its effect on minimization, approximation, and learning had hitherto been open. We complete this picture, and also support our theoretical claims with empirical results. Comment: 21 pages. A shorter version appeared in Advances in NIPS 2013.
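    For orientation, the 'curvature' in question is, in its standard (total-curvature) form, $\kappa(f) = 1 - \min_{j\in V} [f(V) - f(V\setminus\{j\})]/f(\{j\})$ for a monotone submodular $f$ on ground set $V$: $\kappa = 0$ for modular functions, and $\kappa$ approaches 1 as $f$ saturates. The sketch below computes it for a toy coverage function; the example data and function names are illustrative, not from the paper.

    ```python
    def curvature(f, V):
        """Total curvature of a monotone submodular set function f on ground set V."""
        full = f(V)
        return 1 - min((full - f(V - {j})) / f({j}) for j in V if f({j}) > 0)

    # Example: coverage function f(A) = |union of the sets indexed by A|,
    # which is monotone submodular.
    sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
    f = lambda A: len(set().union(*(sets[j] for j in A))) if A else 0
    print(curvature(f, {1, 2, 3}))   # 1.0: element 3's gain vanishes given 1, 2
    ```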

    Non-acyclicity of coset lattices and generation of finite groups
