
    An analysis of minimax facility location problems with area demands

    The unconstrained model and its solution technique can be easily modified to solve the limiting case where all facilities are fixed points, as well as the case where metric constraints are added. Examples are solved to show the impact of assuming area demands and the conflicting nature of the minimax and minisum criteria, and to illustrate the solution techniques developed.

    A minimax objective function constrained by a bound on the total average cost of servicing all existing facilities (the minisum function) is then discussed. Using duality properties, this problem is shown to be equivalent to another model which minimizes the minisum function subject to a bound on the same minimax function. This last problem proves to be easier to solve, and a specialized solution technique is developed. The resulting solutions are nondominated with respect to the two criteria involved. Another way to generate nondominated solutions is to combine the two functions into a weighted sum; the constrained-criterion method is shown to be superior both analytically and practically.

    Most probabilistic facility location problems investigated to date have been variations of the generalized Weber formulation. In this research, several single-facility minimax location models are analyzed in which both the weights and the locations of the existing facilities are random variables. The demand points are uniformly distributed over rectangular areas, the rectilinear metric is used, and the weights are assumed to be independently distributed random variables. Two unconstrained probabilistic models are analyzed and compared to the centroid formulation; the probabilistic models are seen to be sensitive to deviations from optimal solutions. An expected-value criterion formulation is also presented, along with lower and upper bound approximating functions.
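    As a hedged illustration of the minimax criterion discussed above (not the dissertation's solution technique), the sketch below approximates a single-facility minimax location under the rectilinear metric by a coarse grid search. The demand points, weights, and search range are invented example data, and area demands are collapsed to representative points for brevity.

```python
def rectilinear(p, q):
    """Rectilinear (Manhattan) distance between two points."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def minimax_cost(site, demands):
    """Worst-case weighted travel cost from a candidate site."""
    return max(w * rectilinear(site, d) for d, w in demands)

def grid_minimax(demands, lo=0.0, hi=10.0, steps=100):
    """Return the grid point minimizing the minimax objective."""
    best_site, best_cost = None, float("inf")
    step = (hi - lo) / steps
    for i in range(steps + 1):
        for j in range(steps + 1):
            site = (lo + i * step, lo + j * step)
            cost = minimax_cost(site, demands)
            if cost < best_cost:
                best_site, best_cost = site, cost
    return best_site, best_cost

# Invented example data: ((x, y), weight) per demand point.
demands = [((1.0, 1.0), 1.0), ((9.0, 2.0), 2.0), ((4.0, 8.0), 1.5)]
site, cost = grid_minimax(demands)
```

    The grid search only bounds the optimum from above; a finer grid, or a specialized technique like those developed in the dissertation, would be needed for precise solutions.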

    Can Evolutionary Clustering Have Theoretical Guarantees?

    Clustering is a fundamental problem in many areas, which aims to partition a given data set into groups based on some distance measure, such that the data points in the same group are similar while those in different groups are dissimilar. Due to its importance and NP-hardness, many methods have been proposed, among which evolutionary algorithms are a popular class. Evolutionary clustering has found many successful applications, but all the results are empirical, lacking theoretical support. This paper fills this gap by proving that the approximation performance of the GSEMO (a simple multi-objective evolutionary algorithm) for solving four formulations of clustering, i.e., k-tMM, k-center, discrete k-median and k-means, can be theoretically guaranteed. Furthermore, we consider clustering under fairness, which tries to avoid algorithmic bias and has recently been an important research topic in machine learning. We prove that for discrete k-median clustering under individual fairness, the approximation performance of the GSEMO can be theoretically guaranteed with respect to both the objective function and the fairness constraint.
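    The GSEMO referenced above can be sketched in a few lines: it keeps an archive of mutually nondominated bit strings and repeatedly applies standard bit-flip mutation to a random archive member. The bi-objective encoding used here (k-center radius versus deviation of the subset size from k) and the toy instance are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import random

random.seed(0)  # deterministic toy run

POINTS = [(0, 0), (1, 0), (8, 8), (9, 9), (5, 1)]  # toy data set
K = 2                                              # target number of centers

def dist(p, q):
    """Manhattan distance between two points."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def objectives(mask):
    """Bi-objective value of a 0/1 center-selection mask:
    (k-center radius, |size - K|); both are minimized."""
    centers = [p for p, bit in zip(POINTS, mask) if bit]
    if not centers:
        return (float("inf"), K)
    radius = max(min(dist(p, c) for c in centers) for p in POINTS)
    return (radius, abs(sum(mask) - K))

def dominates(a, b):
    """True if objective vector a weakly improves on b and differs."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def gsemo(iterations=2000):
    """Keep a nondominated archive; mutate a random member each step."""
    n = len(POINTS)
    pop = [tuple([0] * n)]
    for _ in range(iterations):
        parent = random.choice(pop)
        child = tuple(b ^ (random.random() < 1 / n) for b in parent)
        fc = objectives(child)
        if any(dominates(objectives(x), fc) for x in pop):
            continue  # child is dominated by the archive; discard it
        pop = [x for x in pop if not dominates(fc, objectives(x))]
        if child not in pop:
            pop.append(child)
    return pop

archive = gsemo()
```

    The paper's guarantees concern how quickly such a process reaches solutions with provable approximation ratios; the sketch shows only the mechanics of the archive update.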

    Guest Editorial: Special Issue on Quantitative Approaches to Environmental Sustainability in Transportation Networks


    Supply constrained location-distribution in not-for-profit settings

    Inspired by the World Food Programme's activity in the post-civil war food crisis in Angola, this study proposes a systematic approach to address the location-distribution problem in not-for-profit settings, where a limited volume of supply has to be allocated to different demand regions. The use of utility functions is key in our framework because it allows the decision-maker to establish priorities by representing the heterogeneous effects of distributing supply to different demand locations (location effect) and to different individuals in the same demand location (diminishing returns effect). We propose the use of two fractional objectives with the utility functions embedded into them: an efficiency measure and a new inequity measure related to the Gini coefficient. The resulting problem has the form of a bi-objective integer linear fractional program, and our optimization technique is designed to handle multiple fractional objective measures. Novel analytical results for the worst-case performance of the proposed technique are provided. Our numerical experiments assess computational efficiency and provide concrete managerial prescriptions. Finally, an illustrative application of our approach in the context of the food crisis in Angola is presented based on an efficiency-inequity trade-off analysis. This research was partially supported by Sejong University [Grant 20180391] (Park) and Purdue University [Doug and Maria DeVos] (Berenguer).
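    The two kinds of measures described above can be illustrated with a small sketch. The square-root utility, the example allocation, and the per-region weights below are assumptions for illustration, not the paper's calibrated utility functions; the inequity measure shown is the classical Gini coefficient over realized utilities, a close relative of the paper's measure rather than its exact definition.

```python
import math

def utility(amount, weight):
    """Concave utility: diminishing returns in the amount allocated,
    scaled by a region-specific priority weight (location effect)."""
    return weight * math.sqrt(amount)

def efficiency(allocation, weights):
    """Efficiency measure: total utility achieved per unit of supply."""
    total_supply = sum(allocation)
    total_utility = sum(utility(a, w) for a, w in zip(allocation, weights))
    return total_utility / total_supply

def gini(values):
    """Gini coefficient of nonnegative values: 0 means perfect equality,
    values approaching 1 mean extreme inequality."""
    n = len(values)
    mean = sum(values) / n
    diff_sum = sum(abs(x - y) for x in values for y in values)
    return diff_sum / (2 * n * n * mean)

allocation = [10.0, 40.0, 50.0]  # supply sent to each demand region
weights = [2.0, 1.0, 1.0]        # per-region priority weights
utilities = [utility(a, w) for a, w in zip(allocation, weights)]
```

    Both measures are ratios of functions of the allocation, which is what gives the optimization model its fractional (rather than purely linear) structure.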

    Bicriteria Approximation Algorithms for Priority Matroid Median

    Fairness considerations have motivated new clustering problems and algorithms in recent years. In this paper we consider the Priority Matroid Median problem, which generalizes the recently studied Priority k-Median problem. The input consists of a set of facilities ℱ and a set of clients 𝒞 that lie in a metric space (ℱ ∪ 𝒞, d), and a matroid ℳ = (ℱ, ℐ) over the facilities. In addition, each client j has a specified radius r_j ≥ 0 and each facility i ∈ ℱ has an opening cost f_i > 0. The goal is to choose a subset S ⊆ ℱ of facilities to minimize ∑_{i ∈ S} f_i + ∑_{j ∈ 𝒞} d(j, S) subject to two constraints: (i) S is an independent set in ℳ (that is, S ∈ ℐ) and (ii) for each client j, its distance to an open facility is at most r_j (that is, d(j, S) ≤ r_j). For this problem we describe the first bicriteria (c₁, c₂) approximations for fixed constants c₁, c₂: the radius constraints of the clients are violated by at most a factor of c₁ and the objective cost is at most c₂ times the optimum cost. We also improve the previously known bicriteria approximation for the uniform radius setting (r_j := L for all j ∈ 𝒞).
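    Evaluating a candidate solution S for this objective can be sketched directly from the definition. The partition matroid, the one-dimensional metric, and all instance data below are illustrative assumptions; the problem as stated allows an arbitrary matroid and an arbitrary metric.

```python
def independent_in_partition_matroid(S, groups, capacities):
    """Check independence in a partition matroid: at most capacities[g]
    facilities may be chosen from each group g."""
    counts = {}
    for i in S:
        g = groups[i]
        counts[g] = counts.get(g, 0) + 1
        if counts[g] > capacities[g]:
            return False
    return True

def evaluate(S, facilities, clients, open_cost, radius, groups, capacities):
    """Return (feasible, cost): cost is the sum of opening costs plus each
    client's distance to its nearest open facility; feasibility requires
    matroid independence and every client within its radius r_j."""
    if not S or not independent_in_partition_matroid(S, groups, capacities):
        return False, float("inf")
    cost = sum(open_cost[i] for i in S)
    feasible = True
    for j, cj in enumerate(clients):
        d = min(abs(cj - facilities[i]) for i in S)  # 1-D metric for brevity
        if d > radius[j]:
            feasible = False
        cost += d
    return feasible, cost

facilities = [0.0, 5.0, 10.0]
clients = [1.0, 6.0]
open_cost = [2.0, 3.0, 2.0]
radius = [2.0, 2.0]
groups = [0, 0, 1]          # facility -> partition-matroid group
capacities = {0: 1, 1: 1}   # per-group opening limits
ok, cost = evaluate({0, 2}, facilities, clients, open_cost, radius,
                    groups, capacities)
```

    In this toy instance no independent set serves both clients within their radii, which is exactly the situation that motivates bicriteria approximations allowing a bounded stretch of the radius constraints.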

    On Algorithmic Fairness and Stochastic Models for Combinatorial Optimization and Unsupervised Machine Learning

    Combinatorial optimization and unsupervised machine learning problems have been extensively studied and are relatively well understood. Examples of such problems that play a central role in this work are clustering problems and problems of finding cuts in graphs. The goal of the research presented in this dissertation is to introduce novel variants of the aforementioned problems by generalizing their classic variants in two, not necessarily disjoint, directions. The first direction involves incorporating fairness aspects into a problem's specification, and the second involves introducing some form of randomness into the problem definition, e.g., stochastic uncertainty about the problem's parameters.

    Fairness in the design of algorithms and in machine learning has received a significant amount of attention in recent years, mainly due to the realization that standard optimization approaches can frequently lead to severely unfair outcomes that can hurt the individuals or groups involved in the corresponding application. As far as considerations of fairness are concerned, in this work we begin by presenting two novel individually fair clustering models, together with algorithms with provable guarantees for them. The first model exploits randomness in order to provide fair solutions, while the second is purely deterministic. The high-level motivation behind both is to treat similar individuals similarly. Moving forward, we focus on a graph cut problem that captures situations of disaster containment in a network. For this problem we introduce two novel fair variants: the first focuses on demographic fairness, while the second considers a probabilistic notion of individual fairness. Again, we give algorithms with provable guarantees for the newly introduced variants.

    In the next part of this thesis we turn our attention to generalizing problems through the introduction of stochasticity. First, we present algorithmic results for a computational epidemiology problem whose goal is to control the stochastic diffusion of a disease in a contact network; this problem can be interpreted as a stochastic generalization of a static graph cut problem. Finally, this dissertation also includes work on a well-known paradigm in stochastic optimization, namely the two-stage stochastic setting with recourse. Two-stage problems capture a wide variety of applications revolving around the trade-off between provisioning and rapid response. In this setting, we present a family of clustering problems that had not yet been studied in the literature, and for this family we show novel algorithmic techniques that provide constant-factor approximation algorithms. We conclude the dissertation with a discussion of open problems and future research directions in the general area of algorithmic fairness.
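    The two-stage stochastic setting with recourse mentioned above can be made concrete with a toy facility-opening sketch: facilities opened in stage one are cheap, while opening after a demand scenario is revealed incurs an inflation factor. The instance data and the brute-force enumeration below are illustrative assumptions, not the dissertation's algorithms.

```python
from itertools import combinations

FACILITIES = [0.0, 5.0, 10.0]  # facility positions on a line
STAGE1_COST = 3.0              # cost to open any facility in advance
INFLATION = 2.0                # cost multiplier for opening in stage two
# Each scenario: (probability, list of client positions).
SCENARIOS = [(0.5, [1.0, 4.0]), (0.5, [9.0])]

def best_recourse(open1, clients):
    """Cheapest second-stage completion: optionally open extra facilities
    at inflated cost, then connect every client to its nearest open one."""
    idx = range(len(FACILITIES))
    best = float("inf")
    for r in range(len(FACILITIES) + 1):
        for extra in combinations(idx, r):
            opened = set(open1) | set(extra)
            if not opened:
                continue
            cost = INFLATION * STAGE1_COST * len(set(extra) - set(open1))
            cost += sum(min(abs(c - FACILITIES[i]) for i in opened)
                        for c in clients)
            best = min(best, cost)
    return best

def expected_cost(open1):
    """Stage-one opening cost plus expected optimal recourse cost."""
    return STAGE1_COST * len(open1) + sum(
        p * best_recourse(open1, clients) for p, clients in SCENARIOS)

# Brute-force the best first-stage plan over all subsets of facilities.
best_plan = min(
    (frozenset(s)
     for r in range(len(FACILITIES) + 1)
     for s in combinations(range(len(FACILITIES)), r)),
    key=expected_cost,
)
```

    The exponential enumeration is only workable on toy instances; the point of constant-factor approximation algorithms in this setting is to avoid exactly this blow-up.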