
    On the Probabilistic Min Spanning Tree Problem

    We study a probabilistic optimization model for min spanning tree, where each vertex vi of the input graph G(V,E) has a presence probability pi in the final instance G′ ⊆ G that will actually be optimized. Suppose that, when this “real” instance G′ becomes known, a spanning tree T of G, called the anticipatory or a priori spanning tree, has already been computed, and that one can run a quick algorithm (quicker than recomputing from scratch), called a modification strategy, that modifies T so that it fits G′. The goal is to compute an anticipatory spanning tree of G such that its modification for any G′ ⊆ G is optimal for G′. This is what we call the probabilistic min spanning tree problem. In this paper we study the complexity and approximation of probabilistic min spanning tree in complete graphs under two distinct modification strategies, which lead to different complexity results for the problem. For the first of these strategies, we also study two natural subproblems, namely probabilistic metric min spanning tree and probabilistic min spanning tree 1,2, which deal with metric complete graphs and with complete graphs whose edge weights are either 1 or 2, respectively.
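    The abstract does not spell out the modification strategies studied in the paper, but the a priori functional itself (the expected cost of the modified anticipatory tree over random vertex subsets) can be made concrete. The sketch below estimates it by Monte Carlo under an assumed illustrative strategy that reconnects each present vertex to its nearest present ancestor in the anticipatory tree; the data structures and the strategy are assumptions, not the paper's.

```python
import random

def estimate_anticipatory_cost(weights, parent, root, prob, samples=10000, seed=0):
    """Monte Carlo estimate of the probabilistic functional: the expected cost
    of the anticipatory tree after an assumed 'reconnect to the nearest present
    ancestor' modification strategy (illustrative only).

    weights[u][v] : weight of edge (u, v) in the complete graph G
    parent[v]     : parent of v in the anticipatory tree T (None for the root)
    prob[v]       : presence probability p_v of vertex v
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        present = {v for v in prob if rng.random() < prob[v]}
        present.add(root)                      # assumption: the root is always present
        cost = 0.0
        for v in present - {root}:
            a = parent[v]
            while a not in present:            # shortcut past absent vertices
                a = parent[a]
            cost += weights[v][a]
        total += cost
    return total / samples

# Hypothetical 4-vertex instance; the anticipatory tree is the path 0-1-2-3.
w = {0: {1: 1, 2: 2, 3: 2}, 1: {0: 1, 2: 1, 3: 2},
     2: {0: 2, 1: 1, 3: 1}, 3: {0: 2, 1: 2, 2: 1}}
print(estimate_anticipatory_cost(w, parent={0: None, 1: 0, 2: 1, 3: 2}, root=0,
                                 prob={0: 1.0, 1: 0.5, 2: 0.9, 3: 0.7}))
```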

    Quantifying Opportunity Costs in Sequential Transportation Auctions for Truckload Acquisition

    The principal focus of this research is to quantify opportunity costs in sequential transportation auctions. The paper studies a transportation marketplace with time-sensitive truckload pickup-and-delivery requests in which two carriers compete for service requests; each arriving request triggers an auction in which the carriers compete to win the right to service the load. An expression for evaluating opportunity costs is derived, and the paper shows that the impact of evaluating opportunity costs depends on the competitive market setting. A simulation framework is used to evaluate different strategies, and some results and the overall simulation framework are also discussed.
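    The paper derives an expression for opportunity costs that the abstract does not reproduce. As a rough illustration of the general idea only, the sketch below estimates the opportunity cost of committing to a load as the gap between expected future profit without and with the commitment, using a placeholder value-function simulator; every name here is hypothetical, and this is not the paper's expression.

```python
import random

def opportunity_cost(state, load, future_value, samples=2000, seed=0):
    """Monte Carlo estimate of the opportunity cost of committing to a load:
    E[future profit | load rejected] - E[future profit | load accepted].
    `future_value(state, rng)` is a placeholder simulator of downstream
    profit; it stands in for the expression derived in the paper."""
    rng = random.Random(seed)
    reject = sum(future_value(state, rng) for _ in range(samples)) / samples
    accept = sum(future_value(state + (load,), rng) for _ in range(samples)) / samples
    return reject - accept

# Toy usage with a dummy value function: more commitments dilute future profit.
oc = opportunity_cost(state=(), load="load-17",
                      future_value=lambda s, rng: rng.random() * 100 / (1 + len(s)))
print(oc)   # a carrier could add this to its direct cost when forming a bid
```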

    Scheduling over Scenarios on Two Machines

    We consider scheduling problems over scenarios, where the goal is to find a single assignment of the jobs to the machines that performs well over all possible scenarios. Each scenario is a subset of jobs that must be executed in that scenario, and all scenarios are given explicitly. The two objectives that we consider are minimizing the maximum makespan over all scenarios and minimizing the sum of the makespans of all scenarios. For both versions, we give several approximation algorithms and lower bounds on their approximability. With this research into optimization problems over scenarios, we have opened a new and rich field of interesting problems. Comment: To appear in COCOON 2014. The final publication is available at link.springer.co
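    For a fixed assignment, both objectives are easy to evaluate once the scenarios are given explicitly. A minimal sketch, assuming two identical machines and scenarios given as job subsets:

```python
def evaluate_assignment(assignment, proc_time, scenarios):
    """Evaluate a fixed job-to-machine assignment on two machines (0 and 1)
    over explicitly given scenarios.

    assignment[j] : machine that job j is assigned to
    proc_time[j]  : processing time of job j
    scenarios     : list of job subsets; only those jobs run in a scenario
    Returns (maximum makespan over scenarios, sum of makespans)."""
    makespans = []
    for scenario in scenarios:
        loads = [0.0, 0.0]
        for j in scenario:
            loads[assignment[j]] += proc_time[j]
        makespans.append(max(loads))
    return max(makespans), sum(makespans)

# Toy instance: 4 jobs, 3 scenarios, one candidate assignment.
p = {"a": 3, "b": 2, "c": 2, "d": 1}
scen = [{"a", "b"}, {"b", "c", "d"}, {"a", "d"}]
print(evaluate_assignment({"a": 0, "b": 1, "c": 0, "d": 1}, p, scen))  # (3.0, 9.0)
```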

    On Conceptually Simple Algorithms for Variants of Online Bipartite Matching

    We present a series of results regarding conceptually simple algorithms for bipartite matching in various online and related models. We first consider a deterministic adversarial model. The best approximation ratio possible for a one-pass deterministic online algorithm is 1/2, which is achieved by any greedy algorithm. Dürr et al. recently presented a 2-pass algorithm called Category-Advice that achieves approximation ratio 3/5. We extend their algorithm to multiple passes. We prove the exact approximation ratio for the k-pass Category-Advice algorithm for all k ≥ 1, and show that the approximation ratio converges to the inverse of the golden ratio 2/(1+√5) ≈ 0.618 as k goes to infinity. The convergence is extremely fast: the 5-pass Category-Advice algorithm is already within 0.01% of the inverse of the golden ratio. We then consider a natural greedy algorithm in the online stochastic IID model, MinDegree. This algorithm is an online version of the well-known and extensively studied offline algorithm MinGreedy. We show that MinDegree cannot achieve an approximation ratio better than 1-1/e, which is guaranteed by any consistent greedy algorithm in the known IID model. Finally, following the work of Besser and Poloczek, we depart from adversarial and stochastic orderings and investigate a natural randomized algorithm (MinRanking) in the priority model. Although the priority model allows the algorithm to choose the input ordering in a general but well-defined way, this natural algorithm cannot obtain the approximation ratio of the Ranking algorithm in the ROM model.
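    The 1/2 bound for one-pass deterministic algorithms can be made concrete with the plain greedy rule (match each arriving online vertex to any free neighbour); the sketch below is that baseline, not the Category-Advice algorithm, and the instance shown is the classic example on which greedy loses half the optimum.

```python
def greedy_online_matching(arrivals):
    """One-pass deterministic greedy for online bipartite matching: match each
    arriving online vertex to the first currently unmatched offline neighbour.
    Any such greedy rule is 1/2-competitive, matching the bound in the abstract.

    arrivals: list of (online_vertex, offline_neighbours) in arrival order."""
    matched_offline = set()
    matching = {}
    for u, neighbours in arrivals:
        for v in neighbours:
            if v not in matched_offline:
                matched_offline.add(v)
                matching[u] = v
                break
    return matching

# Classic tight instance: greedy finds 1 edge, the optimum matches both u1 and u2.
print(greedy_online_matching([("u1", ["v1", "v2"]), ("u2", ["v1"])]))  # {'u1': 'v1'}
```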

    Relaxing the Irrevocability Requirement for Online Graph Algorithms

    Online graph problems are considered in models where the irrevocability requirement is relaxed. Motivated by practical examples where, for example, there is a cost associated with building a facility and no extra cost associated with doing it later, we consider the Late Accept model, where a request can be accepted at a later point, but any acceptance is irrevocable. Similarly, we also consider a Late Reject model, where an accepted request can later be rejected, but any rejection is irrevocable (this is sometimes called preemption). Finally, we consider the Late Accept/Reject model, where late accepts and rejects are both allowed, but any late reject is irrevocable. For Independent Set, the Late Accept/Reject model is necessary to obtain a constant competitive ratio, but for Vertex Cover the Late Accept model is sufficient, and for Minimum Spanning Forest the Late Reject model is sufficient. The Matching problem has a competitive ratio of 2, but in the Late Accept/Reject model its competitive ratio is 3/2.
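    For reference, the competitive ratio of 2 for Matching mentioned above is what the fully irrevocable baseline greedy achieves: accept an edge only when both endpoints are still free. The sketch below shows that baseline under edge arrivals; the improved 3/2 Late Accept/Reject algorithm from the paper is not reproduced here.

```python
def greedy_edge_arrival_matching(edge_stream):
    """Baseline greedy for online matching with edge arrivals and fully
    irrevocable decisions: accept an edge iff both endpoints are still free.
    The result is a maximal matching, hence at least half the optimum size,
    i.e. competitive ratio 2. (Not the paper's Late Accept/Reject algorithm.)"""
    matched = set()
    matching = []
    for u, v in edge_stream:
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching.append((u, v))
    return matching

# A path a-b-c-d with its middle edge first: greedy keeps 1 edge, optimum has 2.
print(greedy_edge_arrival_matching([("b", "c"), ("a", "b"), ("c", "d")]))
```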

    Solving Multi-choice Secretary Problem in Parallel: An Optimal Observation-Selection Protocol

    The classical secretary problem investigates the question of how to hire the best secretary from n candidates who arrive in a uniformly random order. In this work we investigate a parallel generalization of this problem introduced by Feldman and Tennenholtz [14], which we call the shared Q-queue J-choice K-best secretary problem. In this problem, n candidates are evenly distributed into Q queues, and instead of hiring the best one, the employer wants to hire J candidates from among the best K persons. The J quotas are shared by all queues. This problem is a generalized version of the J-choice K-best problem, which has been extensively studied, and it has more practical value as it characterizes the parallel situation. Although some work has been done on this generalization, to the best of our knowledge no optimal deterministic protocol was known for general Q queues. In this paper, we provide an optimal deterministic protocol for this problem. The protocol is in the same style as the 1/e-solution for the classical secretary problem, but with multiple phases and adaptive criteria. Our protocol is very simple and efficient, and we show that several generalizations, such as the fractional J-choice K-best secretary problem and the exclusive Q-queue J-choice K-best secretary problem, can be solved optimally by this protocol with slight modification; the latter solves an open problem of Feldman and Tennenholtz [14]. In addition, we provide theoretical analysis for two typical cases: the 1-queue 1-choice K-best problem and the shared 2-queue 2-choice 2-best problem. For the former, we prove a lower bound of 1 - O(ln²K / K²) on the competitive ratio. For the latter, we show that the optimal competitive ratio is ≈ 0.372, while the previously best known result was 0.356 [14]. Comment: This work is accepted by ISAAC 201
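    The protocol's style is described as that of the 1/e-solution to the classical secretary problem. A minimal sketch of that classical rule (observe roughly n/e candidates, then hire the first one who beats them all) is shown below; it is the single-queue, single-choice baseline, not the multi-phase shared-queue protocol of the paper.

```python
import math
import random

def classical_secretary(values):
    """Classical 1/e rule: observe the first floor(n/e) candidates without
    hiring, then hire the first candidate better than all of them (or the
    last candidate if none is better)."""
    n = len(values)
    cutoff = int(n / math.e)
    best_seen = max(values[:cutoff], default=float("-inf"))
    for v in values[cutoff:]:
        if v > best_seen:
            return v
    return values[-1]

def success_rate(n=50, trials=20000, seed=1):
    """Empirical probability of hiring the single best of n candidates."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        perm = rng.sample(range(n), n)
        wins += classical_secretary(perm) == n - 1
    return wins / trials

print(success_rate())   # roughly 0.37, close to the asymptotic 1/e
```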

    Relationships between fluvial evolution and karstification related to climatic, tectonic and eustatic forcing in temperate regions

    This paper reviews the diversity of relationships between river evolution and karstogenesis. It also underlines the fundamental role of numerical dating methods (e.g. cosmogenic nuclides) applied to sedimentary sequences in tiered cave passages, as they have provided new insights into these complex interactions. Although karst terrain is widespread worldwide, we focus on European karst catchments, where the sedimentary records are especially well preserved. We review the recent dating of fluvial sediments and speleothems to examine the timing of karstification, incision and deposition in cave levels. The most complete alluvial records occur in tectonically uplifted high mountains where some of the oldest sediment fills date to the Miocene. Evidence indicates that not only uplift, but also climatic conditions and fluvial dynamics (e.g. knickpoint retreat, increased channel flow and/or sediment load, and stream piracies) can play a major role in speleogenesis and geomorphological evolution. In evaporite rocks, speleogenesis is characterized by rapid dissolution and subsidence. In European catchments, gypsum cave development largely occurred during cold climate periods, while limestone caves formed during warm interglacial or interstadial phases. Our synthesis is used to propose four models of fluvial and karst evolution, and to highlight perspectives for further research.

    Partitioning of Mg, Sr, Ba and U into a subaqueous calcite speleothem

    The trace-element geochemistry of speleothems is increasingly used for reconstructing palaeoclimate, with a particular emphasis on elements whose concentrations vary according to hydrological conditions at the cave site (e.g. Mg, Sr, Ba and U). An important step in interpreting trace-element abundances is understanding the underlying processes of their incorporation. This includes quantifying the fractionation between the solution and the speleothem carbonate via partition coefficients (where the partition coefficient D_X of element X is the molar ratio [X/Ca] in the calcite divided by the molar ratio [X/Ca] in the parent water) and evaluating the degree of spatial variability across time-constant speleothem layers. Previous studies of how these elements are incorporated into speleothems have focused primarily on stalagmites and their source waters in natural cave settings, or have used synthetic solutions under cave-analogue laboratory conditions to produce similar dripstones. However, dripstones are not the only speleothem types capable of yielding useful palaeoclimate information. In this study, we investigate the incorporation of Mg, Sr, Ba and U into a subaqueous calcite speleothem (CD3) growing in a natural cave pool in Italy. Pool-water measurements extending back 15 years reveal a remarkably stable geochemical environment owing to the deep cave setting, enabling the calculation of precise solution [X/Ca]. We determine the trace-element variability of ‘modern’ subaqueous calcite from a drill core taken through CD3 to derive D_Mg, D_Sr, D_Ba and D_U, then compare these with published cave, cave-analogue and seawater-analogue studies. The D_Mg for CD3 is anomalously high (0.042 ± 0.002) compared to previous estimates at similar temperatures (∼8 °C). The D_Sr (0.100 ± 0.007) is similar to previously reported values, but data from this study, as well as those from Tremaine and Froelich (2013) and Day and Henderson (2013), suggest that [Na/Sr] might play an important role in Sr incorporation through the potential for Na to outcompete Sr for calcite non-lattice sites. D_Ba in CD3 (0.086 ± 0.008) is similar to values derived by Day and Henderson (2013) under cave-analogue conditions, whilst D_U (0.013 ± 0.002) is almost an order of magnitude lower, possibly due to the unusually slow speleothem growth rates (<1 μm a⁻¹), which could expose the crystal surfaces to leaching of uranyl carbonate. Finally, laser-ablation ICP-MS analysis of the upper 7 μm of CD3, regarded as ‘modern’ for the purposes of this study, reveals considerable heterogeneity, particularly for Sr, Ba and U, which is potentially indicative of compositional zoning. This reinforces the need to conduct 2D mapping and/or multiple laser passes to capture the range of time-equivalent elemental variations prior to palaeoclimate interpretation.
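    The partition coefficient defined in the abstract is straightforward to compute once the molar ratios are known; the helper below applies that definition, with hypothetical input ratios (not the paper's measurements) chosen to land near the reported D_Mg.

```python
def partition_coefficient(x_ca_calcite, x_ca_water):
    """D_X = molar ratio [X/Ca] in the calcite divided by the molar ratio
    [X/Ca] in the parent water, as defined in the abstract."""
    return x_ca_calcite / x_ca_water

# Hypothetical molar ratios (mol/mol), not the paper's measurements:
d_mg = partition_coefficient(x_ca_calcite=0.0021, x_ca_water=0.050)
print(round(d_mg, 3))   # 0.042, the order of the D_Mg reported for CD3
```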

    Large scale stochastic inventory routing problems with split delivery and service level constraints

    A stochastic inventory routing problem (SIRP) typically combines stochastic inventory control problems with NP-hard vehicle routing problems: it determines the delivery volumes to the customers that the depot serves in each period, and the vehicle routes used to deliver those volumes. This paper aims to solve a large-scale multi-period SIRP with split delivery (SIRPSD), where a customer’s delivery in each period can be split and satisfied by multiple vehicle routes if necessary. We consider the SIRPSD under the multiple criteria of the total inventory and transportation cost and the service levels of customers. The total inventory and transportation cost is the objective to be minimized, while the service levels of the warehouses and the customers are enforced by constraints that can be adjusted according to practical requirements. In order to tackle the SIRPSD, with its notorious computational complexity, we first propose an approximate model that significantly reduces the number of decision variables compared to its corresponding exact model. We then develop a hybrid approach that combines the linearization of nonlinear constraints, the decomposition of the model into sub-models with Lagrangian relaxation, and a partial linearization approach for a sub-model. A near-optimal solution of the model found by the approach is used to construct a near-optimal solution of the SIRPSD. Randomly generated instances of the problem with up to 200 customers, 5 periods, and about 400 thousand decision variables, of which half are integer, are examined in numerical experiments. Our approach can obtain high-quality near-optimal solutions within a reasonable amount of computation time on an ordinary PC.
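    The abstract names Lagrangian relaxation with decomposition into sub-models as one ingredient of the hybrid approach. The sketch below is a generic subgradient loop for that standard technique, not the paper's specific decomposition; the toy usage dualizes a single covering constraint of a two-variable 0-1 problem.

```python
def subgradient_lagrangian(solve_relaxed, violation, n_constraints,
                           steps=200, step0=1.0):
    """Generic subgradient loop for Lagrangian relaxation of dualised
    constraints g(x) <= 0 with multipliers lam >= 0 (a sketch of the standard
    technique, not the paper's specific decomposition).

    solve_relaxed(lam) -> (x, L(lam)): minimiser and value of the relaxed,
                                       decomposed sub-models for multipliers lam
    violation(x)       -> [g_i(x)]   : a subgradient of L at lam"""
    lam = [0.0] * n_constraints
    best_bound = float("-inf")
    for t in range(1, steps + 1):
        x, bound = solve_relaxed(lam)
        best_bound = max(best_bound, bound)       # best lower bound so far
        step = step0 / t                          # diminishing step size
        lam = [max(0.0, l + step * g) for l, g in zip(lam, violation(x))]
    return lam, best_bound

# Toy usage: minimise x1 + 2*x2 over x in {0,1}^2 subject to x1 + x2 >= 1
# (optimum value 1); the single constraint is dualised as g(x) = 1 - x1 - x2.
def solve_relaxed(lam):
    costs = (1.0, 2.0)
    x = [1 if c - lam[0] < 0 else 0 for c in costs]
    value = lam[0] + sum((c - lam[0]) * xi for c, xi in zip(costs, x))
    return x, value

print(subgradient_lagrangian(solve_relaxed, lambda x: [1.0 - x[0] - x[1]],
                             n_constraints=1))   # the bound reaches the optimum, 1.0
```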