
    Stabilized Benders methods for large-scale combinatorial optimization, with application to data privacy

    The Cell Suppression Problem (CSP) is a challenging Mixed-Integer Linear Problem arising in statistical tabular data protection. Medium-sized instances of CSP involve thousands of binary variables and millions of continuous variables and constraints. However, CSP has the typical structure that allows application of the renowned Benders' decomposition method: once the "complicating" binary variables are fixed, the problem decomposes into a large set of linear subproblems on the "easy" continuous ones. This makes it possible to project away the easy variables, reducing to a master problem in the complicating ones where the value functions of the subproblems are approximated with the standard cutting-plane approach. Hence, Benders' decomposition suffers from the same drawbacks as the cutting-plane method, i.e., oscillation and slow convergence, compounded with the fact that the master problem is combinatorial. To overcome this drawback we present a stabilized Benders decomposition whose master is restricted to a neighborhood of successful candidates by local branching constraints, which are dynamically adjusted, and even dropped, during the iterations. Our experiments with randomly generated and real-world CSP instances with up to 3600 binary variables, 90M continuous variables and 15M inequality constraints show that our approach is competitive with both the current state-of-the-art (cutting-plane-based) code for cell suppression and the Benders implementation in CPLEX 12.7. In some instances, stabilized Benders is able to quickly provide a very good solution in less than one minute, while the other approaches were not able to find any feasible solution in one hour.
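
    To fix ideas, here is a minimal, self-contained sketch of a Benders loop stabilized by a local-branching (Hamming-distance) constraint, in the spirit of the abstract. Everything in it is illustrative, not the paper's implementation: the toy piecewise-linear value function stands in for the continuous subproblems, the brute-force solve_master stands in for the combinatorial master MILP, and the radius schedule is one arbitrary choice among many.

        import itertools

        # Toy data for  min c.y + q(y),  y in {0,1}^n,  where the convex
        # piecewise-linear q(y) = max_k (A[k].y + B[k]) stands in for the
        # value function of the continuous Benders subproblems.
        n = 6
        c = [3, -2, 4, -1, 2, -3]
        A = [[2, 1, -1, 0, 3, 1], [-1, 2, 2, 1, -2, 0], [0, -3, 1, 2, 1, 2]]
        B = [1, -2, 3]

        def dot(u, v):
            return sum(a * b for a, b in zip(u, v))

        def oracle(y):
            # "Solve the subproblems": return q(y) and the active piece, i.e. a cut.
            vals = [dot(a, y) + b for a, b in zip(A, B)]
            k = max(range(len(vals)), key=vals.__getitem__)
            return vals[k], k

        def solve_master(cuts, center, radius):
            # Brute-force stand-in for the master MILP:  min c.y + eta  with
            # eta >= A[k].y + B[k] for the collected cuts, under the local
            # branching constraint  Hamming(y, center) <= radius.
            best_val, best_y = None, None
            for y in itertools.product((0, 1), repeat=n):
                if sum(a != b for a, b in zip(y, center)) > radius:
                    continue
                val = dot(c, y) + max(dot(A[k], y) + B[k] for k in cuts)
                if best_val is None or val < best_val:
                    best_val, best_y = val, y
            return best_y, best_val

        incumbent = (0,) * n
        q, k = oracle(incumbent)
        cuts, ub, radius = {k}, dot(c, incumbent) + q, 2
        for it in range(30):
            y, lb = solve_master(cuts, incumbent, radius)
            q, k = oracle(y)
            cuts.add(k)
            if dot(c, y) + q < ub - 1e-9:   # serious step: recentre, shrink back
                incumbent, ub, radius = y, dot(c, y) + q, 2
            else:                           # null step: enlarge, eventually drop,
                radius = min(2 * radius, n) # the local branching constraint
            if radius == n and ub - lb <= 1e-9:
                break
        print("best value", ub, "at", incumbent)

    The serious/null-step logic mirrors what the abstract calls dynamically adjusting, and eventually dropping, the constraint: an improvement recentres the neighborhood, while failures enlarge it until it becomes vacuous and the master bound turns into a valid global one.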

    Bicriteria data compression

    The advent of massive datasets (and the consequent design of high-performing distributed storage systems) has reignited the interest of the scientific and engineering community in the design of lossless data compressors which achieve effective compression ratio and very efficient decompression speed. Lempel-Ziv's LZ77 algorithm is the de facto choice in this scenario because of its decompression speed and its flexibility in trading decompression speed versus compressed-space efficiency. Each of the existing implementations offers a trade-off between space occupancy and decompression speed, so software engineers have to content themselves with picking the one which comes closest to the requirements of the application at hand. Starting from these premises, and for the first time in the literature, we address in this paper the problem of trading optimally, and in a principled way, the consumption of these two resources by introducing the Bicriteria LZ77-Parsing problem, which formalizes what data compressors have traditionally approached by means of heuristics. The goal is to determine an LZ77 parsing which minimizes the space occupancy in bits of the compressed file, provided that the decompression time is bounded by a fixed amount (or vice versa). This way, the software engineer can set their space (or time) requirements and then derive the LZ77 parsing which optimizes the decompression speed (or the space occupancy, respectively). We solve this problem efficiently in O(n log^2 n) time and optimal linear space within a small, additive approximation, by proving and deploying some specific structural properties of the weighted graph derived from the possible LZ77 parsings of the input file. A preliminary set of experiments shows that our novel proposal dominates all the highly engineered competitors, hence offering a win-win situation in theory and practice.
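
    The weighted-graph view lends itself to a short illustration: one node per text position, one edge per candidate phrase carrying a (space, time) pair, and a Lagrangian bisection that trades one resource for the other via shortest paths on the DAG. The sketch below is only a schematic rendering of that view on a made-up edge list, not the paper's O(n log^2 n) algorithm, and assumes nothing beyond the standard library.

        def shortest_path(n_nodes, edges, lam):
            # DP over the parsing DAG (edges only go forward), minimizing the
            # penalized weight  space + lam * time;  also track the path's time.
            INF = float("inf")
            dist = [(INF, 0.0)] * n_nodes
            dist[0] = (0.0, 0.0)
            for u in range(n_nodes):
                cost, t = dist[u]
                if cost == INF:
                    continue
                for (src, dst, space, time) in edges:
                    if src == u and cost + space + lam * time < dist[dst][0]:
                        dist[dst] = (cost + space + lam * time, t + time)
            cost, t = dist[-1]
            return cost - lam * t, t    # (space of the chosen parsing, its time)

        # (from, to, space_bits, decode_time): hypothetical phrase candidates
        edges = [(0, 1, 9, 1), (0, 2, 14, 3), (1, 2, 8, 1), (1, 3, 16, 2),
                 (2, 3, 10, 4), (2, 4, 20, 3), (3, 4, 12, 2)]
        n_nodes, budget = 5, 7.0        # decompression-time budget T

        lo, hi = 0.0, 100.0             # bisection on the Lagrangian multiplier
        for _ in range(60):
            lam = (lo + hi) / 2
            space, time = shortest_path(n_nodes, edges, lam)
            if time > budget:
                lo = lam                # too slow: penalize time more heavily
            else:
                hi = lam                # feasible: try paying less for time
        space, time = shortest_path(n_nodes, edges, hi)
        print("space %.0f bits, time %.0f (budget %.0f)" % (space, time, budget))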

    Evaluation of the superficial mycotic flora in pigs kept under different farming conditions

    With 9.3 million head, the Italian pig stock represents 5.8% of the pigs farmed in the European Union. Alongside the roughly 5,000 intensive farms of heavy white-breed pigs, our country also hosts small farms of autochthonous breeds, which altogether account for 5% of the total. No reports concerning the presence of potentially zoonotic fungi in these animals are available in the literature. In this thesis, a culture-based survey was carried out on 192 pigs in order to characterize the superficial mycotic flora of animals kept under different farming systems: free-range, semi-free-range and intensive. Samples were collected from the skin, external ear canal and conjunctiva of Cinta Senese, Large White and Landrace pigs, "commercial hybrids" and wild boars. The skin of only one animal tested positive for Microsporum gypseum, indicating the low susceptibility of the porcine species to dermatophytes. Several species of potentially zoonotic yeasts were isolated from the cultures obtained from ocular swabs, while from those obtained from ear swabs both "non-lipid-dependent" Malassezia (M. pachydermatis) and lipid-dependent Malassezia (M. sympodialis and M. furfur), potentially responsible for zoonoses, were typed. The latter were found only in heavy-breed pigs, with significant differences.

    Generalized Bundle Methods

    We study a class of generalized bundle methods for which the stabilizing term can be any closed convex function satisfying certain properties. This setting covers several algorithms from the literature that have so far been regarded as distinct. Under different hypotheses on the stabilizing term and/or the function to be minimized, we prove finite termination, asymptotic convergence, and finite convergence to an optimal point, with or without limits on the number of serious steps and/or requiring the proximal parameter to go to infinity. The convergence proofs leave a high degree of freedom in the crucial implementation features of the algorithm, i.e., the management of the bundle of subgradients (β-strategy) and of the proximal parameter (t-strategy). We extensively exploit a dual view of bundle methods, which are shown to be a dual ascent approach to one nonlinear problem in an appropriate dual space, where nonlinear subproblems are approximately solved at each step with an inner linearization approach. This allows us to precisely characterize the changes in the subproblems during the serious steps, since the dual problem is not tied to the local concept of ε-subdifferential. For some of the proofs, a generalization of inf-compactness, called *-compactness, is required; this concept is related to that of asymptotically well-behaved functions.
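
    As a concrete special case of the generalized scheme, the classical proximal bundle method takes the stabilizing term |x - xbar|^2/(2t); the following one-dimensional sketch shows the resulting serious/null-step mechanics on a toy function. The ternary-search master, the test function and all parameter values are illustrative choices, not part of the paper.

        def f_oracle(x):
            # toy objective f(x) = |x| + x^2 / 2 : value and one subgradient
            return abs(x) + 0.5 * x * x, (1.0 if x >= 0 else -1.0) + x

        def model(bundle, x):
            # cutting-plane model: pointwise max of the collected linearizations
            return max(fi + gi * (x - xi) for (xi, fi, gi) in bundle)

        def solve_master(bundle, xbar, t):
            # minimize model(x) + (x - xbar)^2 / (2 t); strictly convex in 1-D,
            # so plain ternary search is a safe stand-in for a QP solver
            lo, hi = xbar - 10.0, xbar + 10.0
            phi = lambda x: model(bundle, x) + (x - xbar) ** 2 / (2.0 * t)
            for _ in range(200):
                m1, m2 = lo + (hi - lo) / 3.0, hi - (hi - lo) / 3.0
                if phi(m1) < phi(m2):
                    hi = m2
                else:
                    lo = m1
            return (lo + hi) / 2.0

        xbar, t, descent = 4.0, 1.0, 0.1
        fbar, gbar = f_oracle(xbar)
        bundle = [(xbar, fbar, gbar)]
        for it in range(50):
            x = solve_master(bundle, xbar, t)
            predicted = fbar - model(bundle, x) - (x - xbar) ** 2 / (2.0 * t)
            if predicted <= 1e-8:                 # the model promises no progress
                break
            fx, gx = f_oracle(x)
            bundle.append((x, fx, gx))
            if fx <= fbar - descent * predicted:  # serious step: move the centre
                xbar, fbar = x, fx
            # otherwise: null step, the new cut just refines the model
        print("approx minimizer %.4f with value %.4f" % (xbar, fbar))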

    Standard Bundle Methods: Untrusted Models and Duality

    We review the basic ideas underlying the vast family of algorithms for nonsmooth convex optimization known as "bundle methods". In a nutshell, these approaches are based on constructing models of the function, but the lack of continuity of first-order information implies that these models cannot be trusted, not even close to an optimum. Therefore, many different forms of stabilization have been proposed to try to avoid being led to areas where the model is so inaccurate as to result in almost useless steps. In the development of these methods, duality arguments are useful, if not outright necessary, to better analyze the behaviour of the algorithms. Also, in many relevant applications the function at hand is itself a dual one, so that duality makes it possible to map algorithmic concepts and results back into a "primal space" where they can be exploited; in turn, structure in that space can be exploited to improve the algorithms' behaviour, e.g., by developing better models. We present an updated picture of the many developments around the basic idea along at least three different axes: the form of the stabilization, the form of the model, and the approximate evaluation of the function.

    Optimization Methods: an Applications-Oriented Primer

    Effectively sharing resources requires solving complex decision problems. This entails constructing a mathematical model of the underlying system, and then applying appropriate mathematical methods to find an optimal solution of the model, which is ultimately translated into actual decisions. The development of mathematical tools for solving optimization problems dates back to Newton and Leibniz, but it has tremendously accelerated since the advent of digital computers. Today, optimization is an interdisciplinary subject, lying at the interface between management science, computer science, mathematics and engineering. This chapter offers an introduction to the main theoretical and software tools that are nowadays available to practitioners to solve the kind of optimization problems that are most likely to be encountered in the context of this book. Using, as a case study, a simplified version of the bike sharing problem, we guide the reader through the discussion of modelling and algorithmic issues, concentrating on methods for solving optimization problems to proven optimality.

    A Stabilized Structured Dantzig-Wolfe Decomposition Method

    We discuss an algorithmic scheme, which we call the stabilized structured Dantzig-Wolfe decomposition method, for solving large-scale structured linear programs. It can be applied when the subproblem of the standard Dantzig-Wolfe approach admits an alternative master model amenable to column generation, other than the standard one in which there is a variable for each of the extreme points and extreme rays of the corresponding polyhedron. Stabilization is achieved by the same techniques developed for the standard Dantzig-Wolfe approach and is equally useful for improving performance, as shown by computational results obtained on an application to the multicommodity capacitated network design problem.
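
    For readers unfamiliar with the underlying mechanics, the sketch below shows plain, unstabilized column generation on a toy cutting-stock master, the classical setting where pricing is a knapsack; the structured and stabilized variants discussed above replace this extreme-point master with a problem-specific model and penalize dual oscillations. The sketch assumes scipy is available for the master LPs and that its HiGHS backend exposes the duals via ineqlin.marginals.

        import numpy as np
        from scipy.optimize import linprog

        W = 10                           # roll width
        w = [3, 4, 5]                    # piece widths
        d = [30.0, 20.0, 15.0]           # piece demands
        cols = [[W // wi if j == i else 0 for j in range(len(w))]
                for i, wi in enumerate(w)]          # trivial starting patterns

        while True:
            A = np.array(cols, dtype=float).T       # master LP: min sum x, A x >= d
            res = linprog(np.ones(len(cols)), A_ub=-A, b_ub=-np.array(d),
                          bounds=(0, None), method="highs")
            y = -res.ineqlin.marginals              # duals of the demand rows
            # pricing: integer knapsack  max y.a  s.t.  w.a <= W  (DP on capacity)
            dp, take = [0.0] * (W + 1), [None] * (W + 1)
            for cap in range(1, W + 1):
                dp[cap], take[cap] = dp[cap - 1], take[cap - 1]
                for i, wi in enumerate(w):
                    if wi <= cap and dp[cap - wi] + y[i] > dp[cap]:
                        dp[cap], take[cap] = dp[cap - wi] + y[i], (i, cap - wi)
            if dp[W] <= 1.0 + 1e-9:                 # no negative-reduced-cost column
                break
            pattern, cap = [0] * len(w), W          # rebuild the entering pattern
            while take[cap] is not None:
                i, cap = take[cap]
                pattern[i] += 1
            cols.append(pattern)
        print("LP bound %.2f rolls with %d patterns" % (res.fun, len(cols)))

    The loop alternates between a restricted master over the patterns found so far and a pricing problem that asks for a new pattern with reduced cost below zero; stabilization would additionally keep the dual vector y from swinging wildly between iterations.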

    Prim-based Support-Graph Preconditioners for Min-Cost Flow Problems

    Support-graph preconditioners have been shown to be a valuable tool for the iterative solution, via a Preconditioned Conjugate Gradient method, of the KKT systems that must be solved at each iteration of an Interior Point algorithm for the solution of Min Cost Flow problems. These preconditioners extract a proper triangulated subgraph, with "large" weight, of the original graph: in practice, trees and Brother-Connected Trees (BCTs) of depth two have been shown to be the most computationally efficient families of subgraphs. In the literature, approximate versions of the Kruskal algorithm for maximum-weight spanning trees have most often been used for choosing the subgraphs; Prim-based approaches have been used for trees, but no comparison has ever been reported. We propose Prim-based heuristics for BCTs, which require nontrivial modifications w.r.t. the previously proposed Kruskal-based approaches, and present a computational comparison of the different approaches, which shows that Prim-based heuristics are most often preferable to Kruskal-based ones.
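
    As a point of reference for the Prim-based heuristics, here is a minimal heap-based Prim construction of a maximum-weight spanning tree, the tree base case of the subgraphs above; extending the grown tree into depth-two BCTs is precisely the nontrivial part and is not shown. The example graph is made up for illustration.

        import heapq

        def prim_max_spanning_tree(n, adj):
            # adj[u] = list of (weight, v); returns the tree edges (u, v, w)
            seen = [False] * n
            seen[0] = True
            heap = [(-w, 0, v) for (w, v) in adj[0]]   # max-heap via negation
            heapq.heapify(heap)
            tree = []
            while heap and len(tree) < n - 1:
                negw, u, v = heapq.heappop(heap)
                if seen[v]:
                    continue                           # stale entry, skip it
                seen[v] = True
                tree.append((u, v, -negw))
                for (w2, x) in adj[v]:
                    if not seen[x]:
                        heapq.heappush(heap, (-w2, v, x))
            return tree

        # tiny example graph, stored as a symmetric adjacency list
        edges = [(0, 1, 4.0), (0, 2, 1.0), (1, 2, 3.0), (1, 3, 2.0), (2, 3, 5.0)]
        n = 4
        adj = [[] for _ in range(n)]
        for u, v, w in edges:
            adj[u].append((w, v))
            adj[v].append((w, u))
        for u, v, w in prim_max_spanning_tree(n, adj):
            print("tree edge (%d,%d) weight %.1f" % (u, v, w))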

    Incremental bundle methods using upper models

    We propose a family of proximal bundle methods for minimizing sum-structured convex nondifferentiable functions which require two slightly uncommon assumptions that are satisfied in many relevant applications: Lipschitz continuity of the functions, and oracles which also produce upper estimates on the function values. In exchange, the methods: i) use upper models of the functions that make it possible to estimate function values at points where the oracle has not been called; ii) provide the oracles with more information about when the function computation can be interrupted, possibly diminishing their cost; iii) make it possible to skip oracle calls entirely for some of the component functions, not only at "null steps" but also at "serious steps"; iv) provide explicit and reliable a posteriori estimates of the quality of the obtained solutions; v) work with all possible combinations of different assumptions on how the oracles deal with not being able to compute the function with arbitrary accuracy. We also discuss the introduction of constraints (or, more generally, of easy components) and the use of (partly) aggregated models.
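
    A minimal sketch of the upper-model idea in point i) may help: with a known Lipschitz constant L, past oracle answers yield the upper envelope min_i ( f(x_i) + L |x - x_i| ), which bounds f at unvisited points and can justify skipping an oracle call. The function, the constant and the skipping rule below are all illustrative stand-ins, not the paper's machinery.

        def upper_model(history, L, x):
            # upper envelope built from past oracle answers (x_i, f_i)
            return min(fi + L * abs(x - xi) for (xi, fi) in history)

        def f(x):
            # stand-in oracle; Lipschitz with constant 3 on [-1, 1]
            return abs(x) + x * x

        L = 3.0
        history = [(-1.0, f(-1.0)), (1.0, f(1.0))]
        target = 2.0               # skip the call if the model certifies f(x) <= target
        for x in (0.9, 0.5, 0.1):
            ub = upper_model(history, L, x)
            if ub <= target:       # the upper model is already conclusive
                print("x=%.1f: bound %.2f <= %.1f, oracle call skipped" % (x, ub, target))
            else:
                fx = f(x)          # otherwise pay for an exact evaluation
                history.append((x, fx))
                print("x=%.1f: bound %.2f too weak, oracle returns %.2f" % (x, ub, fx))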