13 research outputs found

    A multi-stage stochastic programming for lot-sizing and scheduling under demand uncertainty

    Get PDF
    A stochastic lot-sizing and scheduling problem with demand uncertainty is studied in this paper. Lot-sizing determines the batch size for each product and scheduling decides the sequence of production. A multi-stage stochastic programming model is developed to minimize overall system costs, including production, setup, inventory and backlog costs. We aim to find the optimal production sequence and resource allocation decisions. Demand uncertainty is represented by scenario trees generated with a moment matching technique, and scenario reduction is used to select the scenarios that best represent the original set. A case study based on a manufacturing company is conducted to illustrate and verify the model, and the two-stage stochastic programming model is compared with the multi-stage model. The major motivation for adopting multi-stage stochastic programming is that it extends the two-stage approach by allowing decisions to be revised at each period based on previous realizations of uncertainty as well as previous decisions. A stability test and a weak out-of-sample test are applied to find an appropriate scenario sample size. By using the multi-stage stochastic programming model, we improve the solution quality by 10–13%.

    Scenario trees and policy selection for multistage stochastic programming using machine learning

    Full text link
    We propose a hybrid algorithmic strategy for complex stochastic optimization problems, which combines the use of scenario trees from multistage stochastic programming with machine learning techniques for learning a policy in the form of a statistical model, in the context of constrained vector-valued decisions. Such a policy allows one to run out-of-sample simulations over a large number of independent scenarios, and obtain a signal on the quality of the approximation scheme used to solve the multistage stochastic program. We propose to apply this fast simulation technique to choose the best tree from a set of scenario trees. A solution scheme is introduced, where several scenario trees with random branching structure are solved in parallel, and where the tree from which the best policy for the true problem could be learned is ultimately retained. Numerical tests show that excellent trade-offs can be achieved between run times and solution quality.
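The "learn a policy from a solved tree" idea can be sketched in a hypothetical one-dimensional form: harvest (observation, optimal decision) pairs from the tree nodes, fit a simple statistical model, and replay it on fresh scenarios. The data values and linear model below are assumptions, not the paper's setup.

```python
# Hedged sketch: fit decision = a + b * observation by ordinary least
# squares from pairs read off a solved scenario tree, then use the fitted
# rule as a fast out-of-sample simulation policy.

def fit_linear(pairs):
    """Ordinary least squares for y = a + b * x on (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical (observed demand, optimal order) pairs from tree nodes.
tree_solution = [(80, 85), (100, 105), (120, 125)]
a, b = fit_linear(tree_solution)

# The learned policy can now be evaluated cheaply on many fresh scenarios,
# giving a signal on the quality of the tree it was learned from.
policy = lambda demand: a + b * demand
print(round(policy(90), 1))
```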

    Optimal Capacity Conversion for Product Transitions Under High Service Requirements

    Get PDF
    We consider the capacity planning problem during a product transition in which demand for a new-generation product gradually replaces that for the old product. Capacity for the new product can be acquired both by purchasing new production lines and by converting existing production lines for the old product. Furthermore, in either case, the new product capacity is “retrofitted” to be flexible, i.e., to be able to also produce the old product. This capacity planning problem arises regularly at Intel, which served as the motivating context for this research. We formulate a two-product capacity planning model to determine the equipment purchase and conversion schedule, considering (i) time-varying and uncertain demand, (ii) dedicated and flexible capacity, (iii) inventory and equipment costs, and (iv) a chance-constrained service-level requirement. We develop a solution approach that accounts for the risk-pooling benefit of flexible capacity (a closed-loop planning approach) and compare it with a solution that is similar to Intel's current practice (an open-loop planning approach). We evaluate both approaches with a realistic but disguised example and show that the closed-loop planning solution leads to savings in both equipment and inventory costs and matches the service-level targets for the two products more closely. Our numerical experiments illuminate the cost trade-offs between purchasing new capacity and converting old capacity, and between a level capacity plan and a chase capacity plan. Semiconductor Research Corporation (Grant 2215.001).
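A chance-constrained service-level requirement of the kind listed in (iv) can be sketched on scenario data: choose the smallest capacity that covers demand in at least a fraction alpha of equiprobable scenarios. The demand numbers and the 95% target below are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch of a chance constraint P(demand <= capacity) >= alpha
# enforced empirically via the scenario quantile.
import math

def min_capacity(demands, alpha):
    """Smallest capacity c with P(demand <= c) >= alpha over equiprobable scenarios."""
    ordered = sorted(demands)
    k = math.ceil(alpha * len(ordered))   # number of scenarios that must be covered
    return ordered[k - 1]

demands = [70, 85, 90, 95, 100, 105, 110, 120, 135, 160]
print(min_capacity(demands, 0.95), min_capacity(demands, 0.9))
```

Note how tightening the target from 90% to 95% forces capacity up to the worst scenario, a small illustration of why high service requirements make the equipment-versus-inventory trade-off acute.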

    Adaptive Two-stage Stochastic Programming with an Application to Capacity Expansion Planning

    Full text link
    Multi-stage stochastic programming is a well-established framework for sequential decision making under uncertainty that seeks policies fully adapted to the uncertainty. Often such flexible policies are not desirable, and the decision maker may need to commit to a set of actions for a number of planning periods. Two-stage stochastic programming might be better suited to such settings, where the decisions for all periods are made here-and-now and do not adapt to the uncertainty realized. In this paper, we propose a novel alternative approach, where the stages are not predetermined but part of the optimization problem. Each component of the decision policy has an associated revision point, a period prior to which the decision is predetermined and after which it is revised to adjust to the uncertainty realized thus far. We motivate this setting using the multi-period newsvendor problem by deriving an optimal adaptive policy. We label the proposed approach adaptive two-stage stochastic programming and provide a generic mixed-integer programming formulation for finite stochastic processes. We show that adaptive two-stage stochastic programming is NP-hard in general. Next, we derive bounds on the value of adaptive two-stage programming in comparison to the two-stage and multi-stage approaches for a specific problem structure inspired by the capacity expansion planning problem. Since directly solving the mixed-integer linear program associated with the adaptive two-stage approach might be very costly for large instances, we propose several heuristic solution algorithms based on the bound analysis. We provide approximation guarantees for these heuristics. Finally, we present an extensive computational study on an electricity generation capacity expansion planning problem and demonstrate the computational and practical impacts of the proposed approach from various perspectives.
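The value of a revision point can be seen in a toy two-period newsvendor-style example (all numbers assumed, and much simpler than the paper's formulation): demand is persistent across periods, so revising the second order after observing the first period's demand eliminates the second period's cost.

```python
# Toy comparison: fix both orders here-and-now ("two-stage") versus revise
# the period-2 order after period-1 demand is observed ("adaptive").
# Demand: 50 or 100 with equal probability; period 2 repeats period 1.
from itertools import product

SCENARIOS = [50, 100]      # equiprobable period-1 demands; d2 == d1
HOLD, BACK = 1.0, 3.0      # assumed holding and backlog cost rates

def stage_cost(q, d):
    return HOLD * max(q - d, 0) + BACK * max(d - q, 0)

# Two-stage: one (q1, q2) pair applied in every scenario.
two_stage = min(
    sum(0.5 * (stage_cost(q1, d) + stage_cost(q2, d)) for d in SCENARIOS)
    for q1, q2 in product(SCENARIOS, repeat=2)
)

# Adaptive: q1 here-and-now, q2 chosen per scenario after d1 is seen.
adaptive = min(
    sum(0.5 * (stage_cost(q1, d) + min(stage_cost(q2, d) for q2 in SCENARIOS))
        for d in SCENARIOS)
    for q1 in SCENARIOS
)
print(two_stage, adaptive)
```

Here the expected cost halves once the second decision is allowed to adapt, which is exactly the kind of gap the paper's bounds quantify.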

    Decision making under uncertainties for renewable energy and precision agriculture

    Get PDF
    In this dissertation, mathematical programming models and statistical analysis tools are formulated to study optimal strategies for allocating resources and managing risk in renewable energy and precision agriculture. The dissertation, which consists of four papers, lies at the interface of optimization, simulation, and statistical analysis, with a focus on decision making under uncertainty for biofuel process design, renewable energy supply chain management and precision agriculture. Bio-oil gasification, which integrates fast pyrolysis and gasification processes, is a relatively new conversion technology, and this integrated biofuel production pathway has been promoted to take advantage of economies of scale and logistic efficiency. The design of supply chain networks, especially under uncertainty, is one of the most important decisions faced by the biofuel industry. In the first paper, we propose a two-stage stochastic programming framework for the biofuel supply chain optimization problem considering uncertainties including biomass supply availability, technology advancement, and biofuel market price. The results show that the stochastic factors have significant impacts on the decision on fast pyrolysis plant locations, especially when there is insufficient biomass. Also, farmers' participation can have a significant impact on the profitability and robustness of this supply chain design. Another major challenge faced by the cellulosic biofuel industry is that investors are hesitant to take the risk of constructing commercial-scale production facilities. Techno-economic analysis (TEA) has been widely adopted to overcome this challenge. Optimal facility locations and capacities, as well as logistic flow decisions for biomass supply and biofuel distribution, should be incorporated into the techno-economic analysis.
In the second paper, the author aims to provide a new method that integrates supply chain design into the techno-economic analysis by evaluating the economic feasibility of an integrated pathway combining biomass pyrolysis and bio-oil gasification. The results indicate that the hybrid fast pyrolysis and bio-oil gasification pathway is more suitable for a decentralized supply chain structure, while the biomass gasification pathway is more suitable for a single centralized facility. Feeding the millions of people throughout the world who face hunger every day is a formidable challenge. Precision agriculture has attracted increasing attention in the farmland management community. Farmland management involves a sequence of planning and decision-making processes, including seed selection and irrigation scheduling. In the third paper, a mixed integer programming model is proposed to provide decision support on seed selection and irrigation water allocation for customized precision farmland management. The results show that a significant increase in farmers' annual profit can be achieved by carefully choosing the irrigation schedule and type of seed. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture. The effect of limited water on corn grain yield is significant, and management decisions are essential to optimize farmers' profits, particularly in a stochastic environment. The fourth paper takes uncertainties such as crop price, irrigation water availability and precipitation amount into consideration. A multi-stage stochastic programming model is formulated to evaluate the effect of the structure of the decision-making process on farmers' income. The case study results indicate that multi-stage stochastic programming is a promising approach for farmland management under uncertainty and can increase farmers' income significantly.
To enhance data utilization and the interpretation of results, statistical methods such as Monte Carlo simulation considering parameter interactions, linear regression analysis, and the moment matching method for scenario generation are also applied. The overarching goal of this dissertation is to quantify and manage the uncertainties along the modeling process and provide mechanisms that lead to optimal decisions. The outcomes of the research have the potential to accelerate the commercialization of second-generation biofuels and lead to sustainable utilization of water resources. The insights derived from the research contribute to decision making under uncertainty.
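The moment matching idea mentioned for scenario generation can be shown in its simplest form (a sketch, with invented target moments): a two-point distribution placed at mu ± sigma with equal probability reproduces a target mean and variance exactly.

```python
# Hedged sketch of moment matching for scenario generation: build discrete
# scenarios whose first two moments equal given targets, then verify.

def two_point_scenarios(mu, sigma):
    """Equiprobable scenario pair matching the first two moments."""
    return [(0.5, mu - sigma), (0.5, mu + sigma)]

def moments(scenarios):
    mean = sum(p * x for p, x in scenarios)
    var = sum(p * (x - mean) ** 2 for p, x in scenarios)
    return mean, var

scen = two_point_scenarios(mu=200.0, sigma=30.0)
print(moments(scen))   # -> (200.0, 900.0), i.e. the targets are matched
```

Practical moment matching also fits skewness and kurtosis with more scenario points, typically by solving a small nonlinear system; the two-point case is just the smallest instance of the idea.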

    Strategic Technology Maturation and Insertion (STMI): a requirements guided, technology development optimization process

    Get PDF
    This research presents a Decision Support System (DSS) process solution to a problem faced by Program Managers (PMs) early in a system lifecycle, when potential technologies are evaluated for placement within a system design. The proposed process for evaluating and selecting technologies incorporates computer-based Operational Research techniques that automate and optimize key portions of the decision process. This computerized process allows the PM to rapidly form the basis of a Strategic Technology Plan (STP) designed to manage, mature and insert the technologies into the system design baseline and to identify potential follow-on incremental system improvements. This process is designated Strategic Technology Maturation and Insertion (STMI). Traditionally, to build this STP, the PM must juggle system performance, schedule and cost issues and strike a balance of new and old technologies that can be fielded to meet the requirements of the customer. To complicate this juggling act, the PM is typically confronted with a short time frame in which to evaluate hundreds of potential technology solutions with thousands of potential interacting combinations within the system design. Picking the best combination of new and established technologies, plus selecting the critical technologies needing maturation investment, is a significant challenge. These early lifecycle decisions drive the entire system design, cost and schedule well into production. The STMI process explores a formalized and repeatable DSS that allows PMs to systematically tackle the problems of technology evaluation, selection and maturation. It gives PMs a tool to compare and evaluate the entire design space of candidate technology performance, incorporate lifecycle costs as an optimizer for a best-value system design, and generate input for a strategic plan to mature critical technologies.
Four enabling concepts are described and brought together to form the basis of STMI: Requirements Engineering (RE), Value Engineering (VE), system optimization and Strategic Technology Planning (STP). STMI is then executed in three distinct stages: pre-process preparation, process operation and optimization, and post-process analysis. A demonstration case study prepares and implements the proposed STMI process in a multi-system (macro) concept down-select and a specific single-system (micro) design that ties into the macro-level design decision.

    Long term capacity planning with products' renewal

    Get PDF
    Long Term Capacity Planning (LTCP) consists of deciding the type and amount of capacity of production systems for multiple periods in a long term planning horizon. It involves decisions related to strategic planning, such as buying or selling production technology and outsourcing, as well as tactical decisions regarding capacity level and configuration. Making these kinds of decisions correctly is highly important for three reasons. Firstly, they usually involve a high investment; secondly, once such a decision is taken, it cannot be changed easily (i.e. it is highly irreversible); thirdly, they affect the performance of the entire system and the decisions that will be possible at a tactical level. If capacity is suboptimal, there will be lost demand (in the present and possibly in the future); if the system is oversized, there will be unused resources, which may represent an economic loss. Long term decisions are typically made with non-formalized procedures, such as generating and comparing solutions, which do not guarantee an optimal solution. In addition, the characteristics of the long term capacity planning problem make it very difficult to solve, especially when products have a short life cycle. One of the most relevant characteristics is the uncertainty inherent to strategic problems. In this case, uncertainty affects parameters such as demand, product life cycle, available production technology and the economic parameters involved (e.g. prices, costs, interest rates, etc.). The selection of production technology depends on the products offered by the company, along with factors such as costs and productivity. When a product is renewed, the existing production technology may not be capable of producing it; or, if it can, the productivity and/or quality may be poor. Furthermore, renewing a product will affect its demand (cannibalization), as well as the demand for and value of the old products.
Hence, it is very important to accurately decide the correct time for product renewal. This thesis aims to design a model for solving a long term capacity planning problem with the following main characteristics: (1) short life cycle products and their renewal, with demand interactions (complementary and competitive products) considered; (2) different capacity options (such as acquisition, renewal, updating, outsourcing and reduction); and (3) tactical decisions (including the integration of strategic and tactical decisions).

    Facets for Continuous Multi-Mixing Set and Its Generalizations: Strong Cuts for Multi-Module Capacitated Lot-Sizing Problem

    Get PDF
    The research objective of this dissertation is to develop new facet-defining valid inequalities for several new multi-parameter, multi-constraint mixed integer sets. These valid inequalities result in cutting planes that significantly improve the efficiency of algorithms for solving mixed integer programming (MIP) problems involving multi-module capacity constraints. These MIPs arise in many classical and modern applications ranging from production planning to cloud computing. The research in this dissertation generalizes cut-generating methods such as mixed integer rounding (MIR), mixed MIR, continuous mixing, n-step MIR, mixed n-step MIR, mingling, and n-step mingling, along with various well-known families of cuts for problems such as the multi-module capacitated lot-sizing (MMLS), multi-module capacitated facility location (MMFL), and multi-module capacitated network design (MMND) problems. More specifically, in the first step, we introduce a new generalization of the continuous mixing set, referred to as the continuous multi-mixing set, where the coefficients satisfy certain conditions. For each n′ ∈ {1, …, n}, we develop a class of valid inequalities for this set, referred to as the n′-step cycle inequalities, and present their facet-defining properties. We also present a compact extended formulation for this set and an exact separation algorithm to separate over the set of all n′-step cycle inequalities for a given n′ ∈ {1, …, n}. In the next step, we extend the results of the first step to the case where the conditions on the coefficients of the continuous multi-mixing set are relaxed. This leads to an extended formulation and a generalization of the n-step cycle inequalities, n ∈ ℕ, for the continuous multi-mixing set with general coefficients. We also show that these inequalities are facet-defining in many cases.
In the third step, we further generalize the continuous multi-mixing set (where no conditions are imposed on the coefficients) by incorporating upper bounds on the integer variables. We introduce a compact extended formulation and new families of multi-row cuts for this set, referred to as the mingled n-step cycle inequalities (n ∈ ℕ), through a generalization of n-step mingling. We also provide an exact separation algorithm to separate over the set of all these inequalities. Furthermore, we present the conditions under which a subset of the mingled n-step cycle inequalities is facet-defining for this set. Finally, in the fourth step, we utilize the results of the first step to introduce new families of valid inequalities for the MMLS, MMFL, and MMND problems. Our computational results show that the developed cuts are very effective in solving MMLS instances with two capacity modules, resulting in a considerable reduction in the integrality gap, the number of nodes, and total solution time.
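The base case that the n-step and mingled cuts generalize is the basic mixed integer rounding (MIR) inequality: from y + x ≥ b with y ≥ 0 continuous and x integer, the cut y ≥ f · (⌈b⌉ − x), where f = b − ⌊b⌋, is valid. A brute-force check on a small grid (illustrative numbers, not from the dissertation) confirms no feasible point violates it:

```python
# Sketch of the basic MIR cut, validated by enumeration over a grid of
# feasible mixed integer points.
import math

def mir_holds(b, x, y):
    """True if (x, y) also satisfies the MIR cut y >= f * (ceil(b) - x)."""
    f = b - math.floor(b)
    return y >= f * (math.ceil(b) - x) - 1e-9

b = 2.6
violations = [
    (x, y)
    for x in range(-3, 7)                    # integer variable values
    for y in (k / 4 for k in range(0, 41))   # y in {0, 0.25, ..., 10}
    if y + x >= b and not mir_holds(b, x, y)
]
print(len(violations))
```

The cut is tight at the mixed integer points (⌈b⌉, 0) and (⌊b⌋, f), which is what makes it facet-defining for the simple mixing set; the dissertation's n-step and mingled families extend this recursion to multiple coefficients and bounded integer variables.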

    Predictive and Prescriptive Analytics for Managing the Impact of Hazards on Power Systems

    Full text link
    Natural hazards and extreme weather events have the potential to cause significant disruptions to the electric power grid. The resulting damage is, in some cases, very expensive and time-consuming to repair, and it places substantial burdens on both utilities and customers. The frequency of such events has been increasing over the last 30 years, and several studies show that both the number and intensity of severe weather events will increase due to global warming and climate change. An important part of managing weather-induced power outages is being properly prepared for them, which ties in with the broader goal of enhancing power system resilience. Inspired by these challenges, this thesis focuses on developing data-driven frameworks under uncertainty for predictive and prescriptive analytics to address the resiliency challenges of power systems. In particular, the primary aims of this dissertation are to: 1. Develop a series of predictive models that can accurately estimate the probability distribution of power outages in advance of a storm. 2. Develop a crew coordination planning model to allocate repair crews to areas affected by hazards in response to the uncertain predicted outages. The first chapter introduces storm outage management and explains the main objectives of this thesis in detail. In the second chapter, I develop a novel two-stage predictive modeling framework to overcome the zero-inflation issue seen in most outage-related data. The proposed model estimates customer interruptions in terms of probability distributions to better address the inherent stochasticity in predictions. In the next chapter, I develop a new adaptive statistical learning approach based on Bayesian model averaging to formulate model uncertainty and to develop a model that can adapt to changing conditions and data over time.
The fourth chapter uses a Bayesian belief network to model the stochastic interconnection between various meteorological factors and physical damage to different power system assets. Finally, in chapter five, I develop a new multi-stage stochastic programming model to allocate and relocate repair crews in impacted areas during an extreme weather event to restore power as quickly as possible at minimum cost. This research was conducted in collaboration with multiple power utility companies, and some of the models and algorithms developed in this thesis are already implemented at those companies and used by their employees. Based on actual data from these companies, I provide evidence that significant improvements have been achieved by my models.
PhD, Industrial & Operations Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
http://deepblue.lib.umich.edu/bitstream/2027.42/168024/1/ekabir_1.pd
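The two-stage treatment of zero-inflated outage data can be sketched as a hurdle model (a hedged simplification of the chapter's framework, with invented counts): one stage estimates the probability that any outage occurs, the other models the count given that one does.

```python
# Hypothetical hurdle-model sketch for zero-inflated outage counts.
# The training counts below are invented, not utility data.

outages = [0, 0, 0, 0, 0, 0, 3, 1, 0, 0, 5, 0, 2, 0, 0, 0]

# Stage 1: empirical probability of a nonzero-outage event.
p_event = sum(1 for o in outages if o > 0) / len(outages)

# Stage 2: mean count conditional on an event occurring.
positives = [o for o in outages if o > 0]
mean_given_event = sum(positives) / len(positives)

# Unconditional expectation combines the two stages.
print(p_event, mean_given_event, p_event * mean_given_event)
```

Modeling the two stages separately keeps the many zeros from dragging down the conditional count model, which is the core motivation for two-stage frameworks on zero-inflated data.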