
    Dynamic resource constrained multi-project scheduling problem with weighted earliness/tardiness costs

    In this study, a conceptual framework is given for the dynamic resource constrained multi-project scheduling problem with weighted earliness/tardiness costs (DRCMPSPWET), and a mathematical programming formulation of the problem is provided. In DRCMPSPWET, a project arrives on top of an existing project portfolio, and a due date has to be quoted for the new project while minimizing the costs of schedule changes. The objective function consists of the weighted earliness/tardiness costs of the activities of the existing projects in the current baseline schedule, plus a term that increases linearly with the anticipated completion time of the new project. An iterated local search based approach is developed for large instances of this problem. In order to analyze the performance and behavior of the proposed method, a new multi-project data set is created by controlling the total number of activities, the due date tightness, the due date range, the number of resource types, and the completion time factor in an instance. A series of computational experiments is carried out to test the performance of the local search approach. Exact solutions are provided for the small instances. The results indicate that the local search heuristic performs well in terms of both solution quality and solution time.
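The weighted earliness/tardiness term of the objective function described above can be sketched as follows; the activity tuples and weights are illustrative, not taken from the paper's data set or formulation.

```python
# Illustrative sketch of a weighted earliness/tardiness cost term.
# Each activity is (completion_time, due_date, earliness_weight, tardiness_weight);
# all values here are invented for the example.

def weighted_et_cost(activities):
    """Sum of weighted earliness plus weighted tardiness over all activities."""
    total = 0.0
    for completion, due, w_e, w_t in activities:
        earliness = max(0.0, due - completion)   # finished before the due date
        tardiness = max(0.0, completion - due)   # finished after the due date
        total += w_e * earliness + w_t * tardiness
    return total

# One activity a unit early (cost 2*1) and one two units late (cost 3*2).
cost = weighted_et_cost([(9, 10, 2.0, 3.0), (12, 10, 2.0, 3.0)])
print(cost)  # 8.0
```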

    A generic framework for video understanding applied to group behavior recognition

    This paper presents an approach to detect and track groups of people in video-surveillance applications, and to automatically recognize their behavior. The method keeps track of individuals moving together by maintaining spatial and temporal group coherence. First, people are individually detected and tracked. Second, their trajectories are analyzed over a temporal window and clustered using the Mean-Shift algorithm. A coherence value describes how well a set of people can be described as a group. Furthermore, we propose a formal event description language. The group event recognition approach is successfully validated on 4 camera views from 3 datasets: an airport, a subway, a shopping center corridor and an entrance hall.
    Comment: 20/03/2012
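A minimal, self-contained sketch of the trajectory-clustering step described above: a flat-kernel Mean-Shift over per-person mean positions. The feature choice (mean x, y over the temporal window), the bandwidth, and the mode-merging rule are assumptions for illustration; the paper's actual implementation may differ.

```python
import math

def mean_shift(points, bandwidth, iters=20):
    """Tiny flat-kernel Mean-Shift: shift each point toward the mean of its
    neighbours within `bandwidth`, then group points whose modes coincide."""
    modes = [list(p) for p in points]
    for _ in range(iters):
        for i, m in enumerate(modes):
            neigh = [p for p in points if math.dist(p, m) <= bandwidth]
            modes[i] = [sum(c) / len(neigh) for c in zip(*neigh)]
    # Modes within half a bandwidth of each other share a cluster label.
    labels, centers = [], []
    for m in modes:
        for k, c in enumerate(centers):
            if math.dist(m, c) <= bandwidth / 2:
                labels.append(k)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels

# Mean (x, y) positions of five people: three walking together, two apart.
pts = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1), (8.0, 8.0), (8.1, 7.9)]
print(mean_shift(pts, bandwidth=2.0))  # [0, 0, 0, 1, 1]
```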

    The Evolution of Shopping Center Research: A Review and Analysis

    Retail research has evolved over the past sixty years. Christaller's early work on central place theory, with its simplistic combination of range and threshold, has been advanced to include complex consumer shopping patterns and retailer behavior in agglomerated retail centers. Hotelling's seminal research on competition in a spatial duopoly has been realized in the form of comparison shopping in regional shopping centers. The research that has followed Christaller and Hotelling has been as wide as it has been deep, including literature in geography, economics, finance, marketing, and real estate. In combination, the many extensions of central place theory and retail agglomeration economics have clearly enhanced the understanding of both retailer and consumer behavior. In addition to these two broad areas of shopping center research, two more narrowly focused areas of research have emerged. The most recent focus in the literature has been on the positive effects large anchor tenants have on smaller non-anchor tenant sales. These positive effects are referred to as retail demand externalities. Exploring the theoretical basis for the valuation of shopping centers has been another area of interest to researchers. The primary focus of this literature is the valuation of current and expected lease contracts.

    Sensitivity of multi-product two-stage economic lotsizing models and their dependency on change-over and product cost ratios

    This study considers the production and inventory management problem of a two-stage semi-process production system. When both production stages are physically connected, materials are forced to flow. The economic lotsize then depends on the holding cost of the end-product and the combined change-over cost of both production stages. On the other hand, this 'flow shop' is forced to produce at the speed of the slowest stage. The benefit of this approach is the low amount of work-in-process inventory. When, on the other hand, the involved stages are physically disconnected, a stock of intermediates acts as a decoupling point. Typical of the semi-process industry are high change-over costs for the process-oriented first stage, which result in large lotsize differences between the production stages. Using the stock of intermediates as a decoupling point avoids the complexity of synchronising operations, but is an additional reason to augment the intermediate stock position. The disadvantage of this model is the high amount of work-in-process inventory. This paper proposes the 'synchronised planning model', realising a global optimum instead of the combination of two locally optimised settings. The mathematical model proves (for a two-stage single-product setting) that the optimal two-stage production frequency corresponds to the single EOQ solution for the first stage. A sensitivity study reveals, within these two-stage lotsizing models, the economic cost dependency on product and change-over cost ratios. The purpose of this paper is to understand under which conditions the 'joined setup' or the 'two-stage individual EOQ model' remains close to the optimal model. Numerical examples prove that the conclusions about the optimal settings remain valid when extending the model to a two-stage multi-product setting.
The research reveals that two-stage individually optimised EOQ lotsizing should only be used when the end-product stage has a high added value and small change-over costs compared to the first stage. Physically connected operations should be used when the end-product stage has a small added value and low change-over costs, or high added value and large change-over costs, compared to the first production stage. The paper concludes by suggesting a practical common cycle approach to tackle a two-stage multi-product production and inventory management problem. The common cycle approach brings the benefit of a repetitive and predictable production schedule.
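The single-stage EOQ solution that the synchronised model reduces to (in the two-stage single-product case) follows the classic formula Q* = sqrt(2DS/H). The demand, change-over, and holding cost figures below are illustrative, not from the paper.

```python
import math

def eoq(demand_rate, setup_cost, holding_cost):
    """Economic order quantity: Q* = sqrt(2 * D * S / H).

    D = demand per period, S = cost per change-over, H = holding cost
    per unit per period. All figures below are invented for the example.
    """
    return math.sqrt(2 * demand_rate * setup_cost / holding_cost)

# E.g. D = 1000 units/year, S = 50 per change-over, H = 4 per unit per year.
q = eoq(1000, 50, 4)
print(round(q, 1))  # 158.1
```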

    On the use of biased-randomized algorithms for solving non-smooth optimization problems

    Soft constraints are quite common in real-life applications. For example, in freight transportation, the fleet size can be enlarged by outsourcing part of the distribution service, and some deliveries to customers can be postponed as well; in inventory management, it is possible to consider stock-outs generated by unexpected demands; and in manufacturing processes and project management, it is frequent that some deadlines cannot be met due to delays in critical steps of the supply chain. However, capacity-, size-, and time-related limitations are included in many optimization problems as hard constraints, while it would usually be more realistic to consider them as soft ones, i.e., constraints that can be violated to some extent by incurring a penalty cost. Most of the time, this penalty cost is nonlinear and even noncontinuous, which might transform the objective function into a non-smooth one. Despite their many practical applications, non-smooth optimization problems are quite challenging, especially when the underlying optimization problem is NP-hard in nature. In this paper, we propose the use of biased-randomized algorithms as an effective methodology to cope with NP-hard and non-smooth optimization problems in many practical applications. Biased-randomized algorithms extend constructive heuristics by introducing a nonuniform randomization pattern into them. Hence, they can be used to explore promising areas of the solution space without the limitations of gradient-based approaches, which assume the existence of smooth objective functions. Moreover, biased-randomized algorithms can be easily parallelized, thus employing short computing times while exploring a large number of promising regions. This paper discusses these concepts in detail, reviews existing work in different application areas, and highlights current trends and open research lines.
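A common way to realise the nonuniform randomization pattern mentioned above is to pick from a cost-sorted candidate list using a skewed distribution, e.g. a truncated geometric law, so that better candidates are favoured but not always chosen. The sketch below is illustrative; the `beta` parameter and the candidate names are assumptions, not the paper's specification.

```python
import math
import random

def biased_pick(sorted_candidates, beta=0.3, rng=random):
    """Pick one candidate from a best-first sorted list.

    The index is drawn from a geometric distribution with parameter `beta`
    (folded back into range with a modulo), so index 0 is most likely.
    """
    n = len(sorted_candidates)
    idx = int(math.log(rng.random()) / math.log(1.0 - beta)) % n
    return sorted_candidates[idx]

random.seed(1)
picks = [biased_pick(["best", "good", "fair", "poor"]) for _ in range(10000)]
# The best candidate is picked far more often than the worst, but not always.
print(picks.count("best") > picks.count("poor"))  # True
```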

    Using Big Data to Enhance the Bosch Production Line Performance: A Kaggle Challenge

    This paper describes our approach to the Bosch production line performance challenge run by Kaggle.com. Maximizing production yield is at the heart of the manufacturing industry. At the Bosch assembly line, data are recorded for products as they progress through each stage. Data science methods are applied to this huge data repository, consisting of records of tests and measurements made for each component along the assembly line, to predict internal failures. We found that it is possible to train a model that predicts which parts are most likely to fail. Thus a smarter failure detection system can be built, and the parts tagged as likely to fail can be salvaged to decrease operating costs and increase profit margins.
    Comment: IEEE Big Data 2016 Conference
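As a toy illustration of the prediction idea (not the authors' actual model), one can score parts by the historical failure rates of the stations they pass through. The records and the simple averaging rule below are invented for the example.

```python
from collections import defaultdict

# Hypothetical history: (stations a part visited, did it fail?).
records = [
    (["S1", "S2"], 0), (["S1", "S3"], 1),
    (["S2", "S3"], 1), (["S1", "S2"], 0),
]

# Tally per-station visit and failure counts from the history.
fails = defaultdict(int)
seen = defaultdict(int)
for stations, failed in records:
    for s in stations:
        seen[s] += 1
        fails[s] += failed

def risk(stations):
    """Average historical failure rate of the stations a part visits."""
    return sum(fails[s] / seen[s] for s in stations) / len(stations)

# Parts routed through the failure-prone station S3 score higher.
print(risk(["S1", "S3"]) > risk(["S1", "S2"]))  # True
```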
