
    Stochastic programming for City Logistics: new models and methods

    The need for mobility that has emerged in recent decades has led to an impressive increase in the number of vehicles as well as to the saturation of transportation infrastructures. Consequently, traffic congestion, accidents, transportation delays, and polluting emissions are among the most recurrent concerns transportation and city managers have to deal with. However, simply building new infrastructure may not be sustainable because of its cost, the land it requires, which is usually scarce in metropolitan regions, and its negative impact on the environment. Therefore, a different way of improving the performance of transportation systems while enhancing travel safety has to be found, in order to make the transportation of people and goods more efficient and support its key role in the economic development of a city or a whole country. The concept of City Logistics (CL) is being developed to answer this need. Indeed, CL focuses on reducing the number of vehicles operating in the city and controlling their dimensions and characteristics. CL solutions improve not only the transportation system but the whole logistics system within an urban area, trying to integrate the interests of the several stakeholders involved. This global view challenges researchers to develop planning models, methods and decision support tools for the optimization of the structures and activities of the transportation system. In particular, this leads researchers to the definition of strategic and tactical problems belonging to well-known problem classes, including the network design problem, the vehicle routing problem (VRP), the traveling salesman problem (TSP), and the bin packing problem (BPP), which typically act as sub-problems of the overall CL system optimization. When long planning horizons are involved, these problems become stochastic and must therefore explicitly take into account the different sources of uncertainty that can affect the transportation system. For these reasons, and because of the large scale of CL systems, the optimization problems arising in the urban context are very challenging. Their solution requires investigations in mathematical and combinatorial optimization methods as well as the implementation of efficient exact and heuristic algorithms. However, contributions answering these challenges are still limited in number. This work contributes to filling this gap in the literature both by providing a modeling framework for new planning problems in the CL context and by developing new and effective heuristic methods for solving the two-stage formulations of these problems. Three stochastic problems are proposed in the context of CL: the stochastic variable cost and size bin packing problem (SVCSBPP), the multi-handler knapsack problem under uncertainty (MHKPu) and the multi-path traveling salesman problem with stochastic travel times (mpTSPs). The SVCSBPP arises in supply-chain management, in which companies outsource their logistics activities to a third-party logistics firm (3PL). The procurement of sufficient capacity, expressed in terms of vehicles, containers or space in a warehouse, for varying periods of time to satisfy the demand plays a crucial role. The SVCSBPP focuses on the relation between a company and its logistics capacity provider and on the tactical-planning problem of determining the quantity of capacity units to secure for the next period of activity.
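
As a point of reference for the reader, the capacity-reservation decision just described can be written schematically as a two-stage program: reserve bins now, then pay recourse costs once demand and spot prices are revealed. The notation below (bin types T, candidate bin pools B_t, reservation variables y_t and recourse variables z_t) is illustrative only and is not the exact model developed in the thesis.

% Schematic two-stage program in the spirit of the SVCSBPP (all notation illustrative)
\begin{align*}
\min_{y \in \mathbb{Z}_{\ge 0}^{|T|}} \quad & \sum_{t \in T} c_t \, y_t \;+\; \mathbb{E}_{\xi}\big[\, Q(y,\xi) \,\big] \\[4pt]
\text{where}\quad Q(y,\xi) \;=\; \min_{x,u,z} \quad & \sum_{t \in T} f_t(\xi)\, z_t \\
\text{s.t.}\quad & \sum_{t \in T}\sum_{b \in B_t} x_{ib} = 1 && \forall\, i \in I(\xi) \\
& \sum_{i \in I(\xi)} v_i \, x_{ib} \le V_t \, u_b && \forall\, t \in T,\ b \in B_t \\
& \sum_{b \in B_t} u_b \le y_t + z_t && \forall\, t \in T \\
& x_{ib},\, u_b \in \{0,1\}, \quad z_t \in \mathbb{Z}_{\ge 0}
\end{align*}

Here c_t and f_t(\xi) denote the reservation and recourse costs of a type-t bin, v_i and V_t the item volumes and bin capacities, B_t a pool of candidate bins of type t, and I(\xi) the realized set of items; all symbols are placeholders rather than the thesis's notation.
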
The SVCSBPP is the first attempt to introduce a stochastic variant of the variable cost and size bin packing problem (VCSBPP) that considers uncertainty not only in the demand to deliver, but also in the renting cost of the different bins and in their availability. A large number of real-life situations can be satisfactorily modeled as a MHKPu, in particular in last-mile delivery. Last-mile delivery may involve different sequences of consolidation operations, each handled by different workers with different skill levels and reliability. Improper management of consolidation operations can delay the operations, reducing the overall profit of the deliveries. Thus, given a set of potential logistics handlers and a set of items to deliver, characterized by volume and random profit, the MHKPu consists in finding the subset of items that maximizes the expected total profit. The profit is given by the sum of a deterministic profit and a stochastic profit oscillation, with unknown probability distribution, due to the random handling costs of the handlers. The mpTSPs arises mainly in City Logistics applications. Cities offer several services, such as garbage collection, periodic delivery of goods in urban grocery distribution, and bike sharing services. These services require the planning of fixed and periodic tours that will be used for one to several weeks. However, the enlarged time horizon, as well as strong dynamic changes in travel times due to traffic congestion and other nuisances typical of urban transportation, induces the presence of multiple paths with stochastic travel times. Given a graph characterized by a set of nodes connected by arcs, the mpTSPs considers that, for every pair of nodes, multiple paths between the two nodes are present. Each path is characterized by a random travel time. Similarly to the standard TSP, the aim of the problem is to define the Hamiltonian cycle minimizing the expected total cost. These planning problems have been formulated as two-stage integer stochastic programs with recourse. Discretization methods are usually applied to approximate the probability distribution of the random parameters. The resulting approximated program becomes a deterministic linear program with integer decision variables, generally of very large dimensions, beyond the reach of exact methods. Therefore, heuristics are required. For the MHKPu, we apply extreme value theory and derive a deterministic approximation, while for the SVCSBPP and the mpTSPs we introduce effective and accurate heuristics based on progressive hedging (PH) ideas. PH mitigates the computational difficulty associated with large problem instances by decomposing the stochastic program by scenario. When effective heuristic techniques exist for solving the individual scenario subproblems, as is the case for the SVCSBPP and the mpTSPs, PH further reduces the computational effort compared with solving the scenario subproblems by means of a commercial solver. In particular, we propose a series of specific strategies to accelerate the search and efficiently address the symmetry of solutions, including an aggregated consensual solution, heuristic penalty adjustments, and a bundle fixing technique. Yet, although solution methods become ever more powerful, combinatorial problems in the CL context remain very large and difficult to solve. Thus, in order to significantly enhance computational efficiency, these heuristics implement parallel schemes.
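
The scenario decomposition behind the PH-based heuristics can be illustrated on a deliberately tiny capacity-reservation example: each scenario subproblem is solved with penalty terms that pull its first-stage decision toward a consensus value, and scenario multipliers are updated until the scenarios agree. The data, the enumeration-based scenario solver and the stopping rule below are illustrative assumptions, not the accelerated heuristics (aggregated consensus, penalty adjustment, bundle fixing) developed in the thesis.

# Minimal progressive-hedging (PH) sketch on a toy capacity-reservation problem.
# Scenario data, costs and the enumeration-based scenario solver are assumptions
# made for illustration only.

RHO = 2.0          # PH penalty parameter
Y_MAX = 30         # largest capacity level enumerated
C_RESERVE = 3.0    # first-stage cost per reserved capacity unit

# (probability, demand, recourse price per missing unit) for each scenario
SCENARIOS = [(0.3, 8, 7.0), (0.5, 14, 6.0), (0.2, 22, 9.0)]


def solve_scenario(demand, price, w, y_bar, rho):
    """Solve one scenario subproblem (with PH penalties) by enumeration."""
    best_y, best_val = 0, float("inf")
    for y in range(Y_MAX + 1):
        recourse = price * max(0, demand - y)            # buy what is missing
        val = (C_RESERVE * y + recourse
               + w * y + 0.5 * rho * (y - y_bar) ** 2)   # PH penalty terms
        if val < best_val:
            best_y, best_val = y, val
    return best_y


def progressive_hedging(max_iters=50):
    n = len(SCENARIOS)
    w = [0.0] * n
    # Iteration 0: solve every scenario independently (no penalties).
    ys = [solve_scenario(d, p, 0.0, 0.0, 0.0) for (_, d, p) in SCENARIOS]
    for _ in range(max_iters):
        # Consensus value = probability-weighted average of scenario decisions.
        y_bar = sum(prob * y for (prob, _, _), y in zip(SCENARIOS, ys))
        # Multiplier update pushes each scenario toward the consensus.
        w = [w[s] + RHO * (ys[s] - y_bar) for s in range(n)]
        ys = [solve_scenario(d, p, w[s], y_bar, RHO)
              for s, (_, d, p) in enumerate(SCENARIOS)]
        if max(ys) == min(ys):                           # scenarios agree
            return ys[0]
    return round(sum(prob * y for (prob, _, _), y in zip(SCENARIOS, ys)))


if __name__ == "__main__":
    print("capacity units to reserve:", progressive_hedging())
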
With the aim of providing a complete analysis of the proposed problems, we perform extensive numerical experiments on a large set of instances of various dimensions, including realistic settings derived from real applications in urban areas, and combinations of different levels of variability and correlation in the stochastic parameters. The campaign includes an assessment of the efficiency of the meta-heuristics, an evaluation of the interest of explicitly considering uncertainty, an analysis of the impact of problem characteristics and of the structure of the solutions, as well as an evaluation of the robustness of the solutions when used as a decision tool. The numerical analysis indicates that the stochastic programs have significant effects in terms of both economic impact (e.g. cost reduction) and operations management (e.g. prediction of the capacity needed by the firm). The proposed methodologies outperform the use of commercial solvers, even when small-size instances are considered. In fact, they find good solutions in manageable computing time. This makes these heuristics a strategic tool that can be incorporated in larger decision support systems for CL.

    Multi criteria decision making and its applications: a literature review

    This paper presents current techniques used in Multi Criteria Decision Making (MCDM) and their applications. Two basic approaches to MCDM, namely Artificial Intelligence MCDM (AIMCDM) and Classical MCDM (CMCDM), are discussed and investigated. Recent articles from international journals related to MCDM are collected and analyzed to determine which approach is more common in MCDM, and to which areas these techniques are applied. The articles considered appeared in journals in the year 2008 only. This paper provides evidence that, currently, AIMCDM and CMCDM are equally common in MCDM.

    Towards Exploratory Reformulation of Constraint Models

    It is well established that formulating an effective constraint model of a problem of interest is crucial to the efficiency with which it can subsequently be solved. Following from the observation that it is difficult, if not impossible, to know a priori which of a set of candidate models will perform best in practice, we envisage a system that explores the space of models through a process of reformulation from an initial model, guided by performance on a set of training instances from the problem class under consideration. We plan to situate this system in a refinement-based approach, where a user writes a constraint specification describing a problem above the level of abstraction at which many modelling decisions are made. In this position paper we set out our plan for an exploratory reformulation system, and discuss progress made so far. Comment: 13 pages, 6 figures
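
The selection loop the authors envisage, keeping the candidate reformulation that performs best on training instances, can be sketched as a simple race. The two candidate "models" below are toy stand-ins for alternative formulations of the same problem (a pair-sum check solved naively versus with a set), not Essence-level constraint models, and the timing policy is an assumption.

# Sketch of performance-guided model selection: run each candidate formulation
# on a set of training instances and keep the fastest. The candidates here are
# toy stand-ins, not constraint reformulations.
import random
import time


def model_naive(nums, target):
    """Quadratic pairwise scan: one candidate formulation."""
    return any(nums[i] + nums[j] == target
               for i in range(len(nums)) for j in range(i + 1, len(nums)))


def model_hashed(nums, target):
    """Set-based single pass: an alternative formulation of the same check."""
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False


def pick_best(candidates, instances):
    """Time every candidate on the training instances; return the fastest."""
    timings = {}
    for name, model in candidates.items():
        start = time.perf_counter()
        for nums, target in instances:
            model(nums, target)
        timings[name] = time.perf_counter() - start
    return min(timings, key=timings.get), timings


if __name__ == "__main__":
    random.seed(0)
    training = [([random.randrange(10_000) for _ in range(1_000)], 123)
                for _ in range(10)]
    best, timings = pick_best({"naive": model_naive, "hashed": model_hashed},
                              training)
    print("best model:", best, timings)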

    Online fulfillment: F-warehouse order consolidation and BOPS store picking problems

    Fulfillment of online retail orders is a critical challenge for retailers, since legacy infrastructure and control methods are ill suited to online retail. The primary performance goal of online fulfillment is speed, or fast fulfillment, requiring received orders to be shipped or ready for pickup within a few hours. Several novel numerical problems characterize fast fulfillment operations, and this research solves two such problems. Order fulfillment warehouses (F-Warehouses) are a critical component of the physical internet behind online retail supply chains. Two key distinguishing features of an F-Warehouse are (i) an explosive storage policy: a unique item can be stored simultaneously in multiple bin locations dispersed through the warehouse, and (ii) commingled bins: a bin can stock several different items simultaneously. The inventory dispersion profile of an item is therefore temporal and non-repetitive. The order arrival process is continuous, and each order consists of one or more items. From the set of pending orders, efficient picking lists of 10-15 items are generated. A picklist of items is collected in a tote, which is then transported to a packaging station, where items belonging to the same order are consolidated into a shipment package. There are multiple such stations. This research formulates and solves the order consolidation problem. At any time, a batch of totes is to be processed through the available order packaging stations. Tote assignment to a station determines whether an order will be shipped in a single package or in multiple packages. Reduced shipping costs are a key operational goal of an online retailer, and the number of packages is a determining factor. The decision variable is the station to which each tote should be assigned, and the performance objective is to minimize the number of packages and balance the packaging station workload. This research first formulates the order consolidation problem as a mixed integer programming model, and then develops two fast heuristics (#1 and #2) plus two clustering-algorithm-derived solutions. For small problems, heuristic #2 is on average within 4.1% of the optimal solution. For larger problems heuristic #2 outperforms all other algorithms. The performance behavior of heuristic #2 is further studied as a function of several problem characteristics. S-Strategy fulfillment is a store-based solution for fulfilling online customer orders. The S-Strategy is driven by two key motivations: first, retailers have a network of stores where inventory is already dispersed, and second, the expectation is that forward-positioned inventory could be faster and more economical than a warehouse-based F-Strategy. Orders are picked from store inventory and then picked up by the customer at the store (buy online, pick up in store, BOPS). A BOPS store has two distinguishing features: (i) in addition to shelf stock, the layout includes a space-constrained back stock of selected items, and (ii) a set of dedicated pickers are scheduled to fulfill orders. This research solves two BOPS-related problems: (i) back-stock strategy: the assignment of items to the back stock, and (ii) picker scheduling: the effect of the number of pickers and their work hours. A continuous flow of incoming orders is assumed for both problems, and the objective is fulfillment time and labor cost minimization.
For the back-stock problem, an assignment rule based on order frequency, forward location and order basket correlations achieves a 17.6% improvement over a no-back-stock store, while a rule based only on order frequency achieves a 12.4% improvement. Additional experiments across a range of order baskets are reported.
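
The consolidation decision described above, choosing a packaging station for each tote so that items of the same order end up at the same station, can be illustrated with a simple greedy rule: send a tote to the station that already holds the most of its orders, breaking ties by the lighter workload. The data layout and the scoring rule are assumptions for illustration and are not heuristics #1 or #2 from the thesis.

# Greedy sketch of tote-to-station assignment: favour the station already
# holding the most orders that also appear in the tote (fewer split shipments),
# breaking ties by the lighter workload. Data and scoring are illustrative.

def assign_totes(totes, n_stations):
    station_orders = [set() for _ in range(n_stations)]   # orders routed so far
    station_load = [0] * n_stations                       # items per station
    assignment = []
    for tote in totes:               # a tote is a list of order ids, one per item
        orders = set(tote)
        best = min(
            range(n_stations),
            key=lambda s: (-len(orders & station_orders[s]), station_load[s]),
        )
        station_orders[best] |= orders
        station_load[best] += len(tote)
        assignment.append(best)
    return assignment, station_load


if __name__ == "__main__":
    totes = [["A", "A", "B"], ["B", "C"], ["C", "C", "D"], ["A", "D"], ["E"]]
    print(assign_totes(totes, n_stations=2))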

    The CAT metaheuristic for the design of deterministic and stochastic logistics networks

    Nowadays, companies here and elsewhere face ever fiercer global competition. In order to survive and develop competitive advantages, they must source and sell their products on world markets. They must also simultaneously offer their customers excellent-quality products at competitive prices, backed by impeccable service. As a result, procurement, production and marketing activities can no longer be planned and managed independently. In this context, large manufacturing companies must constantly reorganize and reconfigure their logistics networks to cope with financial and environmental pressures as well as with their customers' requirements. Everything must be revised and planned in an integrated way: supplier selection, investment choices, transportation planning, and the preparation of a value proposition that often includes both products and services. At the strategic level, this problem is frequently referred to as "logistics network design". An attractive approach for solving such complex decision problems consists in formulating and solving an integer mathematical programming model representing the problem. Several models have recently been proposed to deal with different categories of logistics network design decisions. However, these models are very complex and difficult to solve, and even the best solvers sometimes fail to provide a good-quality solution. The work developed in this thesis makes several contributions. First, a logistics network design model incorporating several innovations recently proposed in the literature was developed; it integrates supplier selection, the location, configuration and mission assignment of the company's facilities (plants, warehouses, etc.), strategic transportation planning, and the selection of marketing and customer value-offer policies. Innovations are proposed in the modeling of inventories as well as in the selection of transportation options. Second, a distributed solution method inspired by the multi-agent-system paradigm was developed to solve large-scale optimization problems incorporating several categories of decisions. This approach, called CAT (for collaborative agent teams), consists in dividing the problem into a set of sub-problems and assigning each of these sub-problems to an agent that must solve it. The solutions to these sub-problems are then combined by other agents in order to obtain a good-quality solution to the original problem. Efficient mechanisms are designed for dividing the problem, for solving the sub-problems, and for integrating the solutions. The CAT approach thus developed is used to solve the logistics network design problem in a deterministic setting. Finally, adaptations of CAT are proposed to solve logistics network design problems under uncertainty (stochastic setting).
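
The CAT idea, splitting the design problem into sub-problems, letting one agent handle each sub-problem and letting a coordinating agent recombine the partial solutions, can be sketched schematically as follows. The two-block split, the toy coupled cost function and the simple improvement loop are illustrative assumptions and do not reproduce the agents or protocols of the actual CAT method.

# Schematic sketch of an agent-team decomposition: each "agent" optimizes one
# block of decisions given the others, and a coordinator loops until the
# combined solution stops improving. The cost function and the two-block split
# are toy assumptions, not the CAT design agents.

def cost(x, y):
    # Stand-in for a network-design objective coupling two decision blocks.
    return (x - 3) ** 2 + (y - 5) ** 2 + 0.5 * x * y


def agent_block_x(y):
    """Agent responsible for the x sub-problem: best x for the fixed y."""
    return min(range(0, 11), key=lambda x: cost(x, y))


def agent_block_y(x):
    """Agent responsible for the y sub-problem: best y for the fixed x."""
    return min(range(0, 11), key=lambda y: cost(x, y))


def coordinator(max_rounds=20):
    """Integration agent: alternate the block agents until no improvement."""
    x, y = 0, 0
    best = cost(x, y)
    for _ in range(max_rounds):
        x = agent_block_x(y)
        y = agent_block_y(x)
        new = cost(x, y)
        if new >= best:          # no further improvement: stop
            break
        best = new
    return x, y, best


if __name__ == "__main__":
    print(coordinator())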

    Closed-loop supply chain network design: Case of durable products

    Closed-loop supply chains comprise, in addition to the conventional forward flows from suppliers to end-users, a reverse flow of products, components, and materials from end-users to the manufacturers and secondary markets. Designing a closed-loop supply chain is a strategic-level planning decision that considerably impacts the tactical and operational performance of the supply chain. It refers to the decisions taken on the location of facilities involved in the supply chain network, along with the management of the physical flows associated with forward and product recovery channels. Our problem of interest is mainly motivated by the case of durable products, including but not limited to large household appliances, computers, photocopying equipment, and aircraft engines. This category of products has a modular structure, composed of independent components. As opposed to simple-structured products, e.g., printer cartridges, that can only be recycled, each of the components in the reverse bill of materials of durable products can be recovered by a particular type of recovery process. In addition, durable products share a long life cycle, which makes designing their closed-loop supply chain (CLSC) networks more complicated. In this thesis, in keeping with the abovementioned motivation, we focus on designing closed-loop and reverse supply chains in the context of durable products returned in various quality conditions. The recovery decisions for product returns include remanufacturing, part harvesting, bulk recycling, material recycling, and landfilling/incineration. Moreover, we take into account environmental concerns regarding the harmful impacts of used products in the closed-loop supply chain planning. As closed-loop supply chains typically encounter uncertainty in the quality and quantity of the profitable return stream, we further aim to consider the impact of uncertainty in designing the recovery network. For these purposes, in the first phase, we address a closed-loop supply chain planning problem in the context of durable products with generic modular structures. The problem is formulated as a mixed-integer programming model, which is then solved by an accelerated Benders decomposition-based algorithm. The performance of the proposed decomposition approach is enhanced through algorithmic features including valid inequalities, non-dominated optimality cuts, and local branching strategies. Next, in the second phase, we propose a precise approach to model the uncertain quality status of returns, in which the availability of each component in the reverse bill of materials is modeled as discrete scenarios. We propose a two-stage stochastic programming model to address this problem setting. Then, since the cardinality of the scenario set grows exponentially with the number of involved components, we detail a scenario reduction scheme to alleviate the computational burden of the proposed model. The stochastic problem is solved by an L-shaped algorithm enhanced through valid inequalities and Pareto-optimal cuts. Finally, we investigate designing a dynamic reverse supply chain where the quantity of the return flows is uncertain. We introduce a multi-stage stochastic programming model and develop a heuristic inspired by a scenario clustering decomposition scheme as the solution method. It revolves around decomposing the scenario tree into smaller sub-trees, which yields a number of sub-models, one per sub-tree.
The resulting sub-models are then coordinated by Lagrangian penalty terms. Since each sub-model is itself a hard-to-solve problem, a Benders decomposition-based algorithm is proposed to solve the sub-models.
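
One standard way to keep a scenario set of exponential size manageable, and a rough analogue of the reduction and clustering ideas mentioned above, is to repeatedly drop the scenario that is cheapest to remove (small probability, close neighbour) and transfer its probability to its nearest kept scenario. The scenario vectors, the Euclidean distance and the target size below are assumptions for illustration, not the reduction scheme or the scenario clustering decomposition developed in the thesis.

# Backward-style scenario reduction sketch: repeatedly drop the scenario whose
# removal is cheapest and give its probability to the nearest kept scenario.
# Scenario values and the Euclidean distance are illustrative assumptions.
import math


def reduce_scenarios(scenarios, probs, keep):
    """scenarios: list of value tuples; probs: matching probabilities."""
    kept = list(range(len(scenarios)))
    probs = list(probs)

    def dist(i, j):
        return math.dist(scenarios[i], scenarios[j])

    while len(kept) > keep:
        # For every candidate, find its nearest other kept scenario.
        best_drop, best_cost, best_near = None, float("inf"), None
        for i in kept:
            near = min((j for j in kept if j != i), key=lambda j: dist(i, j))
            cost = probs[i] * dist(i, near)
            if cost < best_cost:
                best_drop, best_cost, best_near = i, cost, near
        probs[best_near] += probs[best_drop]     # transfer probability mass
        kept.remove(best_drop)
    return [(scenarios[i], probs[i]) for i in kept]


if __name__ == "__main__":
    scen = [(10.0, 2.0), (11.0, 2.1), (25.0, 7.0), (26.0, 6.5), (40.0, 1.0)]
    prob = [0.2, 0.2, 0.25, 0.25, 0.1]
    print(reduce_scenarios(scen, prob, keep=3))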

    Models, methods and algorithms for supply chain planning

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. An outline of supply chains and the differences between problem types is given. The motivation for a generic framework is discussed and explored. A conceptual model is presented along with its application to real-world situations, and from this a database model is developed. MIP and CP implementations are presented, along with alternative formulations which can be used to solve the problems. A local search solution algorithm is presented and shown to have significant benefits. Problem instances are presented which are used to validate the generic models, including a large manufacturing and distribution problem. This larger problem instance is used not only to explore the implementation of the models presented, but also to explore the practicality of using alternative formulations and solving techniques within the generic framework, and the effectiveness of such methods, including the neighbourhood search solving method. A stochastic dimension to the generic framework is explored, and solution techniques for this extension are developed, demonstrating the use of solution analysis to allow problem simplification and better solutions to be found. Finally, the local search algorithm is applied to the larger models that arise from the inclusion of scenarios, and the method is demonstrated to be powerful for finding solutions for these large models, which were insoluble using the MIP on the same hardware.
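
The flavour of the neighbourhood search reported above can be conveyed with a generic local-search loop on a toy assignment problem: move one task at a time to another plant and keep any move that lowers the objective. The cost matrix, the congestion penalty and the acceptance rule are assumptions for illustration, not the thesis's models or algorithm.

# Generic neighbourhood-search sketch: start from a random assignment and
# repeatedly apply "move one task to another plant" moves while they improve
# the objective. The cost data and the congestion penalty are toy assumptions.
import random

COST = [[4, 9, 3], [7, 2, 6], [5, 5, 8], [6, 3, 4], [2, 8, 7]]  # task x plant
PENALTY = 1.5                                                   # per squared load


def objective(assign):
    loads = [assign.count(p) for p in range(len(COST[0]))]
    return (sum(COST[t][p] for t, p in enumerate(assign))
            + PENALTY * sum(l * l for l in loads))


def local_search(seed=0):
    random.seed(seed)
    assign = [random.randrange(len(COST[0])) for _ in COST]     # random start
    current = objective(assign)
    improved = True
    while improved:
        improved = False
        for t in range(len(COST)):                  # try moving each task
            for p in range(len(COST[0])):
                if p == assign[t]:
                    continue
                old = assign[t]
                assign[t] = p
                new = objective(assign)
                if new < current:                   # keep improving moves
                    current, improved = new, True
                else:
                    assign[t] = old                 # undo non-improving move
    return assign, current


if __name__ == "__main__":
    print(local_search())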

    How the Eurozone disempowers trade unions: the political economy of competitive internal devaluation

    The marginalization of trade unions was a notable feature of the sovereign debt crisis in the Eurozone periphery. However, governments have recently imposed liberalizing reforms against union protests in the Eurozone core too. We argue that organized labour loses influence across the core-periphery divide because the 'new economic governance' puts national governments under enhanced pressure to compete against each other on wage and labour market flexibility - a process known as competitive internal devaluation. The article illustrates this argument through comparative quantitative indicators of liberalization and qualitative process-tracing in three core countries. Whereas Germany's outstanding competitiveness position allowed its unions to extract significant concessions, their counterparts in France and Finland faced unprecedented defeats from governments aiming to restore economic growth by closing the competitiveness gap with Germany. Our findings highlight the class power implications of the Eurozone's reliance on the labour market as the main economic adjustment variable.