
    Throughput Maximization in the Speed-Scaling Setting

    We are given a set of n jobs and a single processor that can vary its speed dynamically. Each job J_j is characterized by its processing requirement (work) p_j, its release date r_j and its deadline d_j. We are also given a budget of energy E and we study the scheduling problem of maximizing the throughput, i.e. the number of jobs that are completed on time. We propose a dynamic programming algorithm that solves the preemptive case of the problem, i.e. when the execution of the jobs may be interrupted and resumed later, in pseudo-polynomial time. Our algorithm can be adapted to solve the weighted version of the problem, where every job is associated with a weight w_j and the objective is to maximize the sum of the weights of the jobs that are completed on time. Moreover, we provide a strongly polynomial-time algorithm for the non-preemptive unweighted case when the jobs have the same processing requirements. For the weighted case, our algorithm can be adapted to solve the non-preemptive version of the problem in pseudo-polynomial time.
    Comment: submitted to SODA 201
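
    The abstract does not spell out the energy model, but speed-scaling papers commonly assume that running at speed s consumes power s^alpha for some constant alpha > 1, so executing w units of work at constant speed over an interval of length t costs w^alpha / t^(alpha - 1) energy. The sketch below, written under that assumption, only illustrates this energy accounting for individual jobs against the budget E; it is not the paper's dynamic program.

```python
# Illustrative only: per-job minimum-energy accounting under the common
# speed-scaling power model (power = speed**alpha). This is an assumed model,
# NOT the paper's dynamic-programming algorithm.

def min_energy_for_job(work: float, release: float, deadline: float, alpha: float = 3.0) -> float:
    """Least energy needed to finish `work` units inside [release, deadline].

    Running at the constant speed work / (deadline - release) is optimal for a
    single job because the power function s**alpha is convex.
    """
    window = deadline - release
    if window <= 0:
        return float("inf")  # the job cannot be scheduled at all
    speed = work / window
    return speed ** alpha * window  # = work**alpha / window**(alpha - 1)


def within_budget(jobs, energy_budget: float, alpha: float = 3.0) -> bool:
    """Necessary condition only: the sum of per-job minimum energies fits in E.

    Contention between overlapping windows is ignored; the point is just to
    make the energy budget concrete.
    """
    total = sum(min_energy_for_job(p, r, d, alpha) for (p, r, d) in jobs)
    return total <= energy_budget


if __name__ == "__main__":
    # Each job is (work p_j, release r_j, deadline d_j).
    jobs = [(2.0, 0.0, 4.0), (1.0, 1.0, 3.0), (3.0, 2.0, 8.0)]
    print(within_budget(jobs, energy_budget=5.0))
```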

    An Algorithmic Study of the Online Knapsack Problem and Related Problems (オンラインナップサックと関連する諸問題に対するアルゴリズム論的研究)

    Degree type: Course-based doctorate (課程博士). University of Tokyo (東京大学)

    Online Knapsack Problems with a Resource Buffer

    In this paper, we introduce online knapsack problems with a resource buffer. In these problems, we are given a knapsack with capacity 1, a buffer with capacity R >= 1, and items that arrive one by one. Each arriving item has to be taken into the buffer or discarded irrevocably on its arrival. When every item has arrived, we transfer a subset of the items currently in the buffer into the knapsack. Our goal is to maximize the total value of the items in the knapsack. We consider four variants depending on whether items in the buffer are removable (i.e., we can remove items from the buffer) or non-removable, and proportional (i.e., the value of each item is proportional to its size) or general. For the general & non-removable case, we observe that no constant-competitive algorithm exists for any R >= 1. For the proportional & non-removable case, we show that a simple greedy algorithm is optimal for every R >= 1. For the general & removable and the proportional & removable cases, we present optimal algorithms for small R and give asymptotically nearly optimal algorithms for general R.
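
    To make the buffer mechanic concrete, here is a minimal sketch of one natural policy for the proportional & non-removable variant: accept an arriving item whenever it still fits in the buffer, then, once all items have arrived, fill the unit knapsack from the buffer largest-first. The abstract only states that "a simple greedy algorithm is optimal"; the specific rule below is an illustrative guess, not necessarily the policy analyzed in the paper.

```python
# Illustrative greedy policy for the proportional & non-removable variant
# (an item's value equals its size). A plausible "simple greedy", not
# necessarily the exact algorithm proved optimal in the paper.

from typing import Iterable, List


def buffered_greedy(sizes: Iterable[float], R: float = 1.0) -> List[float]:
    """Process items online; return the items finally put into the knapsack."""
    buffer: List[float] = []
    used = 0.0
    for size in sizes:
        # Non-removable: an accepted item stays in the buffer forever,
        # so we accept only if it fits in the remaining buffer capacity.
        if used + size <= R:
            buffer.append(size)
            used += size
        # Otherwise the item is discarded irrevocably on arrival.

    # Offline phase: transfer a subset of the buffer into the unit-capacity
    # knapsack, taking larger items first (value == size here).
    knapsack: List[float] = []
    load = 0.0
    for size in sorted(buffer, reverse=True):
        if load + size <= 1.0:
            knapsack.append(size)
            load += size
    return knapsack


if __name__ == "__main__":
    chosen = buffered_greedy([0.6, 0.5, 0.4, 0.3], R=1.5)
    print(chosen, sum(chosen))  # [0.6, 0.4] with total value 1.0
```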

    Approximating Geometric Knapsack via L-packings

    We study the two-dimensional geometric knapsack problem (2DK), in which we are given a set of n axis-aligned rectangular items, each one with an associated profit, and an axis-aligned square knapsack. The goal is to find a (non-overlapping) packing of a maximum-profit subset of items inside the knapsack (without rotating items). The best-known polynomial-time approximation factor for this problem (even just in the cardinality case) is (2 + \epsilon) [Jansen and Zhang, SODA 2004]. In this paper, we break the 2-approximation barrier, achieving a polynomial-time (17/9 + \epsilon) < 1.89 approximation, which improves to (558/325 + \epsilon) < 1.72 in the cardinality case. Essentially all prior work on 2DK approximation packs items inside a constant number of rectangular containers, where items inside each container are packed using a simple greedy strategy. We deviate from this setting for the first time: we show that there exists a large-profit solution where items are packed inside a constant number of containers plus one L-shaped region at the boundary of the knapsack, which contains items that are tall and narrow and items that are wide and thin. As the second major and main algorithmic contribution of this paper, we present a PTAS for this case. We believe that this will turn out to be useful in future work on geometric packing problems. We also consider the variant of the problem with rotations (2DKR), where items can be rotated by 90 degrees. Also in this case, the best-known polynomial-time approximation factor (even for the cardinality case) is (2 + \epsilon) [Jansen and Zhang, SODA 2004]. Exploiting part of the machinery developed for 2DK plus a few additional ideas, we obtain a polynomial-time (3/2 + \epsilon)-approximation for 2DKR, which improves to (4/3 + \epsilon) in the cardinality case.
    Comment: 64 pages, full version of FOCS 2017 paper
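
    As a concrete reference point for the 2DK constraint (not for the approximation algorithm itself), the sketch below checks that a proposed axis-aligned placement is a valid packing: every item lies inside the square knapsack and no two items overlap. Coordinates, the knapsack side length N, and the example placement are all illustrative.

```python
# Feasibility check for a 2DK placement: axis-aligned, non-rotated items
# inside an N x N knapsack, pairwise non-overlapping. Only the constraint
# is shown; choosing which items to pack is the hard part.

from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height), lower-left corner


def inside_knapsack(r: Rect, N: float) -> bool:
    x, y, w, h = r
    return 0 <= x and 0 <= y and x + w <= N and y + h <= N


def overlap(a: Rect, b: Rect) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    # Axis-aligned rectangles overlap iff their projections overlap on both axes.
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def is_valid_packing(placed: List[Rect], N: float) -> bool:
    if not all(inside_knapsack(r, N) for r in placed):
        return False
    return not any(
        overlap(placed[i], placed[j])
        for i in range(len(placed))
        for j in range(i + 1, len(placed))
    )


if __name__ == "__main__":
    placement = [(0, 0, 4, 6), (4, 0, 6, 3), (4, 3, 5, 5)]
    print(is_valid_packing(placement, N=10.0))  # True
```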

    Improved Approximation Algorithms for Box Contact Representations

    We study the following geometric representation problem: given a graph whose vertices correspond to axis-aligned rectangles with fixed dimensions, arrange the rectangles without overlaps in the plane such that two rectangles touch if the graph contains an edge between them. This problem is called CONTACT REPRESENTATION OF WORD NETWORKS (CROWN) since it formalizes the geometric problem behind drawing word clouds in which semantically related words are close to each other. CROWN is known to be NP-hard, and there are approximation algorithms for certain graph classes for the optimization version, MAX-CROWN, in which realizing each desired adjacency yields a certain profit. We present the first O(1)-approximation algorithm for the general case, when the input is a complete weighted graph, and for the bipartite case. Since the subgraph of realized adjacencies is necessarily planar, we also consider several planar graph classes (namely stars, trees, outerplanar graphs, and planar graphs), improving upon the known results. For some graph classes, we also describe improvements in the unweighted case, where each adjacency yields the same profit. Finally, we show that the problem is APX-hard on bipartite graphs of bounded maximum degree.
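
    For intuition about the contact constraint, the sketch below tests whether two axis-aligned rectangles "touch", interpreted here as sharing a vertical or horizontal boundary segment of positive length (side contact). This interpretation and the helper names are assumptions for illustration; the paper's approximation algorithms are not reproduced.

```python
# Hedged illustration of a possible "touch" predicate for CROWN-style contact
# representations: side contact along a segment of positive length. It assumes
# the arrangement is already overlap-free, as CROWN requires.

from typing import Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height), lower-left corner


def _overlap_length(a0: float, a1: float, b0: float, b1: float) -> float:
    """Length of the intersection of intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))


def touch(a: Rect, b: Rect) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    # Vertical side contact: one rectangle ends exactly where the other begins
    # on the x-axis, and their y-projections share a segment of positive length.
    if ax + aw == bx or bx + bw == ax:
        return _overlap_length(ay, ay + ah, by, by + bh) > 0
    # Horizontal side contact, symmetric in y.
    if ay + ah == by or by + bh == ay:
        return _overlap_length(ax, ax + aw, bx, bx + bw) > 0
    return False


if __name__ == "__main__":
    print(touch((0, 0, 2, 2), (2, 1, 2, 2)))  # True: shared vertical segment
    print(touch((0, 0, 2, 2), (3, 0, 1, 1)))  # False: the boxes do not meet
```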

    Targeted interest-driven advertising in cities using Twitter

    Targeted advertising is a key characteristic of online as well as traditional-media marketing. However, it is very limited in outdoor advertising, that is, in campaigns run by means of billboards in public places. The reason is the lack of information about the interests of particular passersby, beyond very imprecise and aggregate demographic or traffic estimates. In this work we propose a methodology for performing targeted outdoor advertising by leveraging social media. In particular, we use the Twitter social network to gather information about users’ degree of interest in given advertising categories and about the common routes that they follow, characterizing in this way each zone in a given city. We then use this characterization to recommend physical locations for advertising. Given an advertisement category, we estimate the most promising areas to be selected for the placement of an ad so as to maximize its targeted effectiveness. We show that our approach selects advertising locations better than a baseline reflecting a current ad-placement policy. To the best of our knowledge, this is the first work on offline advertising in urban areas that makes use of (publicly available) data from social networks.

    Online Submodular Maximization Problem with Vector Packing Constraint

    We consider the online vector packing problem, in which we have a d-dimensional knapsack and items u with weight vectors w_u in R_+^d arrive online in an arbitrary order. Upon the arrival of an item, the algorithm must decide immediately whether to discard it or accept it into the knapsack. When item u is accepted, w_u(i) units of capacity on dimension i are taken up, for each i in [d]. To satisfy the knapsack constraint, an accepted item can later be disposed of at no cost, but discarded or disposed-of items cannot be recovered. The objective is to maximize the utility of the set S of accepted items at the end of the algorithm, which is given by f(S) for some non-negative monotone submodular function f. For any small constant epsilon > 0, we consider the special case in which the weight of an item on every dimension is at most a (1 - epsilon) fraction of the total capacity, and give a polynomial-time deterministic O(k / epsilon^2)-competitive algorithm for the problem, where k is the (column) sparsity of the weight vectors. We also show several (almost) tight hardness results, even when the algorithm is computationally unbounded. We first show that under the epsilon-slack assumption, no deterministic algorithm can obtain any o(k) competitive ratio, and no randomized algorithm can obtain any o(k / log k) competitive ratio. We then show that for the general case (when epsilon = 0), no randomized algorithm can obtain any o(k) competitive ratio. In contrast to the (1 + delta) competitive ratio achieved by Kesselheim et al. [STOC 2014] for the problem with random arrival order of items and under a large-capacity assumption, we show that in the arbitrary-arrival-order case, even when |w_u|_infinity is arbitrarily small for all items u, it is impossible to achieve any o(log k / log log k) competitive ratio.
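
    The sketch below only makes the model concrete: the per-dimension capacity constraint and the fact that accepted items may later be disposed of for free while discarded items are gone for good. Its accept rule (take an item iff it fits outright, never disposing) is deliberately naive and is not the O(k / epsilon^2)-competitive algorithm from the paper.

```python
# Naive illustration of online vector packing with free disposal. The accept
# rule here is NOT the paper's competitive algorithm; it only demonstrates
# the feasibility constraint of the model.

from typing import Dict, List


def fits(load: List[float], w: List[float]) -> bool:
    """Knapsack constraint: the load on every dimension must stay at most 1."""
    return all(load[i] + w[i] <= 1.0 for i in range(len(load)))


def run_naive(stream: Dict[str, List[float]], d: int) -> List[str]:
    """Process items in arrival order; return the ids kept at the end."""
    load = [0.0] * d
    accepted: List[str] = []
    for item_id, w in stream.items():
        if fits(load, w):
            accepted.append(item_id)
            load = [load[i] + w[i] for i in range(d)]
        # else: the item is discarded and can never be recovered. Free disposal
        # would also allow dropping previously accepted items here to make
        # room, which this naive rule never exercises.
    return accepted


if __name__ == "__main__":
    stream = {"a": [0.5, 0.2], "b": [0.4, 0.9], "c": [0.4, 0.1]}
    print(run_naive(stream, d=2))  # ['a', 'c']: item 'b' exceeds dimension 2
```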