4 research outputs found

    Target-based Distributionally Robust Minimum Spanning Tree Problem

    Due to its broad practical applications, the minimum spanning tree problem and its many variants have been studied extensively over the last decades, and a host of efficient exact and heuristic algorithms have been proposed for them. Meanwhile, motivated by realistic applications, the minimum spanning tree problem in stochastic networks has attracted considerable attention from researchers, and stochastic and robust spanning tree models with related algorithms have been developed continually. However, existing approaches are either too restrictive in the types of edge-weight random variables they allow or computationally intractable, especially in large-scale networks. In this paper, we introduce a target-based distributionally robust optimization framework for the minimum spanning tree problem in stochastic graphs, where the probability distributions of the edge weights are unknown but some statistical information can be exploited to keep the optimal solution from being overly conservative. We propose two exact algorithms, based on a Benders decomposition framework and on a modification of the classical greedy (Prim) algorithm for the MST problem, respectively. Compared with the NP-hard stochastic and robust spanning tree problems, the proposed target-based distributionally robust minimum spanning tree problem enjoys more favorable algorithmic properties and robustness when faced with uncertainty in the input data.
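    The abstract above only names the algorithmic ingredients (Benders decomposition and a modified Prim algorithm). The sketch below is a minimal illustration of the Prim-style idea, assuming each stochastic edge is scored by a deterministic surrogate weight built from whatever statistics are available; the function robust_prim and the mean-plus-penalty surrogate are hypothetical placeholders, not the paper's actual target-based criterion.

```python
import heapq

def robust_prim(num_nodes, adj, surrogate_weight):
    """Prim's algorithm run on deterministic surrogate edge weights.

    adj: dict node -> list of (neighbor, edge_info) pairs (undirected graph,
         each edge listed from both endpoints).
    surrogate_weight: callable edge_info -> float; stands in for a robust
         per-edge score computed from partial distribution information
         (assumed here, not taken from the paper).
    Returns a list of (u, v, surrogate) tree edges.
    """
    visited = {0}
    heap = []
    for v, info in adj.get(0, []):
        heapq.heappush(heap, (surrogate_weight(info), 0, v))
    tree = []
    while heap and len(visited) < num_nodes:
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v, w))
        for nxt, info in adj.get(v, []):
            if nxt not in visited:
                heapq.heappush(heap, (surrogate_weight(info), v, nxt))
    return tree

# Illustrative surrogate: mean plus a risk penalty on the standard deviation.
adj = {
    0: [(1, (2.0, 0.5)), (2, (1.0, 2.0))],
    1: [(0, (2.0, 0.5)), (2, (1.5, 0.1))],
    2: [(0, (1.0, 2.0)), (1, (1.5, 0.1))],
}
print(robust_prim(3, adj, lambda info: info[0] + 1.0 * info[1]))
```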

    Regret Models and Preprocessing Techniques for Combinatorial Optimization under Uncertainty

    Ph.D. (Doctor of Philosophy)

    Stochastic Minimum Norm Combinatorial Optimization

    Motivated by growing interest in optimization under uncertainty, we undertake a systematic study of designing approximation algorithms for a wide class of 1-stage stochastic-optimization problems with norm-based objective functions. We introduce the model of stochastic minimum norm combinatorial optimization, denoted StochNormOpt: we have a combinatorial-optimization problem whose costs are random variables with given distributions, and we are given a monotone, symmetric norm f. Each feasible solution induces a random multidimensional cost vector whose entries are independent random variables, and the goal is to find an oblivious solution (i.e., one that does not depend on the realizations of the costs) that minimizes the expected f-norm of the induced cost vector.

    We consider two concrete problem settings. In stochastic load balancing, jobs with random processing times need to be assigned to machines, and the induced cost vector is the machine-load vector, where the load on a machine is the sum of the job random variables assigned to it. In stochastic spanning tree, we have a graph whose edges have stochastic weights, and the induced cost vector consists of the edge-weight variables of the edges in the spanning tree.

    The class of monotone, symmetric norms is broad: it includes frequently used objectives such as max-cost (infinity-norm) and sum-of-costs (1-norm), and more generally all p-norms and Top-k norms (the sum of the k largest coordinates in absolute value). Closure under nonnegative linear combinations and pointwise maximums makes this class versatile; in particular, the latter closure property can be used to incorporate multiple norm budget constraints through a single norm-minimization objective.

    Our chief contribution is a framework for designing approximation algorithms for stochastic minimum norm optimization, a significant generalization of the framework of Chakrabarty and Swamy [5] for the deterministic version of StochNormOpt. Our framework has two key components: (i) a reduction from minimizing the expected f-norm to simultaneously minimizing a (small) collection of expected Top-k norms; and (ii) showing how to minimize a single expected Top-k norm by leveraging techniques used for minimizing the expected maximum, circumventing the difficulties posed by the non-separable nature of Top-k norms.

    We apply our framework to obtain approximation algorithms for stochastic min-norm versions of load balancing (StochNormLB) and spanning tree (StochNormTree) problems. We highlight the following approximation guarantees. 1) An O(1)-approximation for StochNormLB on unrelated machines with: (i) arbitrary monotone, symmetric norms and job sizes that are weighted Bernoulli random variables; and (ii) Top-k norms and arbitrary job-size distributions. 2) An O(log log m / log log log m)-approximation for general StochNormLB, where m is the number of machines. 3) For identical machines, the above guarantees are in fact simultaneous approximations that hold with respect to every monotone, symmetric norm. 4) An O(1)-approximation for StochNormTree with an arbitrary monotone, symmetric norm and arbitrary edge-weight distributions; this guarantee extends to stochastic minimum-norm matroid basis.

    We also consider the special setting of StochNormOpt in which the underlying random variables follow Poisson distributions. Our main result here is a novel and powerful reduction showing that, in essence, the stochastic minimum-norm problem can be reduced to a deterministic min-norm version of the same problem. Applying this reduction to (Poisson versions of) the spanning tree and load balancing problems yields: (i) an optimal algorithm for StochNormTree; (ii) an almost 2-approximation for StochNormLB when the machines are unrelated; and (iii) a PTAS for StochNormLB when the machines are identical. Results (ii) and (iii) use approximation algorithms for (deterministic) min-norm load balancing from the work of Ibrahimpur and Swamy [19] in a black-box fashion.
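    As a concrete illustration of the norm class discussed in this abstract, the sketch below (assumed for illustration, not taken from the thesis) evaluates the max-norm, a p-norm, and Top-k norms on a cost vector, and shows one way the pointwise-maximum closure can fold several norm budgets into a single objective.

```python
import numpy as np

def top_k_norm(x, k):
    """Sum of the k largest coordinates of |x| (a monotone, symmetric norm)."""
    return np.sort(np.abs(x))[::-1][:k].sum()

def p_norm(x, p):
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

# A machine-load (or tree edge-weight) vector induced by some solution.
cost = np.array([4.0, 1.0, 3.0, 2.0])

print(top_k_norm(cost, 1))          # 4.0  == infinity-norm (max-cost)
print(top_k_norm(cost, len(cost)))  # 10.0 == 1-norm (sum-of-costs)
print(p_norm(cost, 2))              # Euclidean norm

# A pointwise maximum of (scaled) monotone, symmetric norms is again such a
# norm; here several budget constraints f_i(x) <= B_i are combined into the
# single objective max_i f_i(x) / B_i (an illustrative encoding, assumed here).
budgets = [(lambda x: top_k_norm(x, 2), 6.0), (lambda x: p_norm(x, 1), 12.0)]
print(max(f(cost) / b for f, b in budgets))
```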
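    The Poisson setting rests on the fact that a sum of independent Poisson variables is again Poisson, so each coordinate of the induced cost vector (for example, a machine load) is fully described by its total rate. The snippet below is only a numerical check of that fact under assumed rates; it is not the reduction or the algorithms from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed Poisson rates of three jobs assigned to the same machine.
rates = [0.7, 1.2, 2.1]

# Empirical load: sum of independent Poisson job sizes.
samples = sum(rng.poisson(lam=r, size=100_000) for r in rates)

# The load is Poisson with rate sum(rates): mean and variance both match it.
print(sum(rates))                     # 4.0
print(samples.mean(), samples.var())  # both close to 4.0

# Hence a deterministic min-norm solver can be fed the vector of total rates
# (one per machine) instead of the full job-size distributions.
```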