
    Generalized Cut-Set Bounds for Broadcast Networks

    A broadcast network is a classical network with all source messages collocated at a single source node. For broadcast networks, the standard cut-set bounds, which are known to be loose in general, are closely tied to the union as the single set operation used to combine the basic cuts of the network. This paper provides a new set of network coding bounds for general broadcast networks. These bounds combine the basic cuts of the network via a variety of set operations (not just the union) and are established using only the submodularity of Shannon entropy. The tightness of these bounds is demonstrated via applications to combination networks. Comment: 30 pages, 4 figures, submitted to the IEEE Transactions on Information Theory.
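    For context, the submodularity property of Shannon entropy that such bounds rest on is the standard textbook inequality below; the notation is generic and not taken from the paper. For jointly distributed random variables indexed by a ground set, and any two index sets A and B:

```latex
% Submodularity of Shannon entropy: for any index sets A and B of a
% collection of jointly distributed random variables,
%   H(X_A) + H(X_B) >= H(X_{A u B}) + H(X_{A n B}).
% Combining two cuts A and B in this way produces a bound that
% involves both their union and their intersection.
\[
  H(X_A) + H(X_B) \;\ge\; H(X_{A \cup B}) + H(X_{A \cap B})
\]
```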

    Convex Analysis and Optimization with Submodular Functions: a Tutorial

    Set-functions appear in many areas of computer science and applied mathematics, such as machine learning, computer vision, operations research, and electrical networks. Among these set-functions, submodular functions play an important role, similar to that of convex functions on vector spaces. In this tutorial, the theory of submodular functions is presented in a self-contained way, with all results proved from first principles. A good knowledge of convex analysis is assumed.
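    The bridge between submodularity and convexity that such tutorials center on is the Lovász extension, which is convex if and only if the underlying set function is submodular. Below is a minimal Python sketch of its standard greedy-formula evaluation; the path-graph cut function used as the example is an illustrative assumption, not something from the tutorial.

```python
import numpy as np

def lovasz_extension(F, w):
    """Evaluate the Lovasz extension of a set function F at w in R^n.

    Uses the standard greedy formula: visit coordinates in decreasing
    order of w and pay each coordinate the marginal gain of adding
    its index to the growing prefix set.
    """
    order = np.argsort(-w)            # indices sorted by decreasing w
    value, prefix = 0.0, set()
    prev = F(frozenset())             # F(empty set), typically 0
    for j in order:
        prefix.add(j)
        cur = F(frozenset(prefix))
        value += w[j] * (cur - prev)  # marginal gain of element j
        prev = cur
    return value

# Illustrative submodular function: cut of the path graph 0-1-2.
edges = [(0, 1), (1, 2)]
def cut(S):
    return sum((u in S) != (v in S) for u, v in edges)

# For a cut function the Lovasz extension is the total variation
# sum |w_u - w_v| over edges: here |0.7-0.2| + |0.2-0.5| = 0.8.
print(lovasz_extension(cut, np.array([0.7, 0.2, 0.5])))
```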

    On Unconstrained Quasi-Submodular Function Optimization

    As submodularity finds ever wider application, generalizations of it are continually being proposed; however, most of them are tailored to special problems. In this paper, we focus on quasi-submodularity, a universal generalization that satisfies weaker properties than submodularity but still enjoys favorable performance in optimization. Analogous to the diminishing-returns property of submodularity, we first define a corresponding property called the 'single sub-crossing' property, and then propose two algorithms for unconstrained quasi-submodular function minimization and maximization, respectively. The proposed algorithms return the reduced lattices in O(n) iterations and guarantee that the objective function value strictly and monotonically increases or decreases after each iteration. Moreover, all local and global optima are guaranteed to be contained in the reduced lattices. Experimental results verify the effectiveness and efficiency of the proposed algorithms for lattice reduction. Comment: 11 pages.
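    The paper's own algorithms are specific to quasi-submodular functions, but the lattice-reduction pattern the abstract describes, maintaining an interval [X, Y] and shrinking it by one element per iteration, is the same one used by the well-known deterministic double-greedy algorithm of Buchbinder et al. for unconstrained nonnegative submodular maximization. Here is a Python sketch of that pattern, offered purely as an illustration and not as the paper's method:

```python
def double_greedy_max(F, ground):
    """Deterministic double greedy (Buchbinder et al.) for unconstrained
    maximization of a nonnegative submodular F; shown only to illustrate
    the [X, Y] lattice-reduction pattern described in the abstract.

    Maintains a lattice interval X <= Y and shrinks it by one element
    per iteration, so it terminates after exactly n = |ground| steps.
    """
    X, Y = set(), set(ground)
    for u in ground:
        a = F(X | {u}) - F(X)    # gain from adding u to the lower set
        b = F(Y - {u}) - F(Y)    # gain from removing u from the upper set
        if a >= b:
            X.add(u)             # keep u: raise the lattice floor
        else:
            Y.discard(u)         # drop u: lower the lattice ceiling
    return X                     # X == Y once the lattice has collapsed

# Illustrative run on the same path-graph cut function.
edges = [(0, 1), (1, 2)]
cut = lambda S: sum((u in S) != (v in S) for u, v in edges)
print(double_greedy_max(cut, [0, 1, 2]))   # -> {0, 2}, the max cut
```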

    Curvature and Optimal Algorithms for Learning and Minimizing Submodular Functions

    We investigate three related and important problems connected to machine learning: approximating a submodular function everywhere, learning a submodular function (in a PAC-like setting [53]), and constrained minimization of submodular functions. We show that the complexity of all three problems depends on the 'curvature' of the submodular function, and we provide lower and upper bounds that refine and improve upon previous results [3, 16, 18, 52]. Our proof techniques are fairly generic: we either use a black-box transformation of the function (for approximation and learning) or a transformation of algorithms to use an appropriate surrogate function (for minimization). Curiously, curvature has long been known to influence approximations for submodular maximization [7, 55], but its effect on minimization, approximation, and learning had hitherto been open. We complete this picture and also support our theoretical claims with empirical results. Comment: 21 pages. A shorter version appeared in Advances in NIPS 2013.
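    In the standard usage, the curvature in question is the total curvature kappa_f = 1 - min_j [f(V) - f(V \ {j})] / f({j}), which is 0 for modular (linear) functions and approaches 1 for strongly curved ones. A brute-force Python sketch over a small ground set follows; the coverage-style example function is an illustrative assumption, not taken from the paper.

```python
def total_curvature(F, ground):
    """Total curvature of a monotone submodular F with F(empty) = 0:
    kappa = 1 - min_j (F(V) - F(V - {j})) / F({j}).
    kappa = 0 means modular (linear); kappa near 1 means strongly curved.
    """
    V = set(ground)
    fV = F(V)
    ratios = [
        (fV - F(V - {j})) / F({j})
        for j in ground
        if F({j}) > 0          # skip elements with zero singleton value
    ]
    return 1.0 - min(ratios)

# Illustrative coverage-style function: F(S) = min(|S|, 2).
# Every last element is redundant once |S| >= 3, so kappa = 1.
F = lambda S: min(len(S), 2)
print(total_curvature(F, range(4)))    # -> 1.0 (fully curved)
```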