Betting on Death and Capital Markets in Retirement: A Shortfall Risk Analysis of Life Annuities versus Phased Withdrawal Plans
How might retirees consider deploying the retirement assets accumulated in a defined contribution pension plan? One possibility would be to purchase an immediate annuity. Another approach, called the "phased withdrawal" strategy in the literature, would have the retiree invest his funds and then withdraw some portion of the account annually. Under this second tactic, the withdrawal rate might be determined according to a fixed benefit level payable until the retiree dies or the funds run out, or it could be set using a variable formula, where the retiree withdraws funds according to a rule linked to life expectancy. Using a range of data consistent with the German experience, we evaluate several alternative designs for phased withdrawal strategies, allowing for endogenous asset allocation patterns, and also allowing the worker to decide both when to retire and when to switch to an annuity. We show that one particular phased withdrawal rule is appealing since it offers relatively low expected shortfall risk, good expected payouts for the retiree during his life, and some bequest potential for the heirs. We also find that unisex mortality tables, if used for annuity pricing, can make women's expected shortfalls higher, expected benefits higher, and bequests lower under a phased withdrawal program. Finally, we show that delayed annuitization can be appealing since it provides higher expected benefits with lower expected shortfalls, at the cost of somewhat lower anticipated bequests.
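The life-expectancy-linked variable rule mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's calibrated rule: the "withdraw the fraction 1/remaining life expectancy" form, the 4% gross return, and all account figures below are illustrative assumptions.

```python
# Hedged sketch of a variable phased-withdrawal rule of the "1 / life
# expectancy" type. All numbers are illustrative assumptions, not the
# paper's German-calibrated parameters.

def phased_withdrawal(balance, remaining_life_expectancy):
    """Withdraw the fraction 1 / (remaining life expectancy) of the account."""
    if remaining_life_expectancy <= 1:
        return balance  # final year: withdraw everything that is left
    return balance / remaining_life_expectancy

def simulate(balance, life_expectancies, gross_return=1.04):
    """Simulate year-by-year withdrawals; returns (withdrawals, final balance)."""
    withdrawals = []
    for e in life_expectancies:
        w = phased_withdrawal(balance, e)
        withdrawals.append(w)
        balance = (balance - w) * gross_return  # invest the remainder
    return withdrawals, balance

# Example: 100,000 at retirement, life expectancy falling from 20 to 16 years.
ws, final = simulate(100_000, [20, 19, 18, 17, 16])
```

Because the withdrawal is a fraction of the current balance, the account is never exhausted early, which is one reason rules of this family show low shortfall risk while leaving a bequest.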
Generating robust and stable machine schedules from a proactive standpoint
Ankara: The Department of Industrial Engineering and the Institute of Engineering and Science of Bilkent University, 2009. Thesis (Ph.D.) -- Bilkent University, 2009. Includes bibliographical references (leaves 117-121).
In practice, scheduling systems are subject to considerable uncertainty in highly
dynamic operating environments. The ability to cope with uncertainty in the
scheduling process is becoming an increasingly important issue. In this thesis we take
a proactive approach to generate robust and stable schedules for the environments
with two sources of uncertainty: processing time variability and machine breakdowns.
The information about the uncertainty is modeled using cumulative distribution
functions and probability theory is utilized to derive inferences.
We first focus on the single machine environment. We define two robustness
(expected total flow time and expected total tardiness) and three stability (the sum of
the squared and absolute differences of the job completion times and the sum of the
variances of the realized completion times) measures. We identify special cases for
which the measures can be optimized without much difficulty. We develop a
dominance rule and two lower bounds for one of the robustness measures, which are
employed in a branch-and-bound algorithm to solve the problem exactly. We also
propose a beam-search heuristic to solve large problems for all five measures. We
provide extensive discussion of our numerical results.
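Two of the measures named above can be estimated for a fixed job sequence by Monte Carlo simulation. This is a minimal sketch, not the thesis's method: the normal processing-time distribution, its parameters, and the sample size are illustrative assumptions.

```python
import random

# Hedged sketch: estimate one robustness measure (expected total flow time)
# and one stability measure (expected sum of squared deviations of realized
# completion times from the planned ones) for a fixed single-machine
# sequence. Distributions and numbers are illustrative assumptions.

def completion_times(proc_times):
    c, out = 0.0, []
    for p in proc_times:
        c += p
        out.append(c)
    return out

def estimate_measures(mean_times, sigma=0.5, runs=10_000, seed=42):
    rng = random.Random(seed)
    planned = completion_times(mean_times)          # schedule built from means
    flow_sum, stab_sum = 0.0, 0.0
    for _ in range(runs):
        realized = completion_times(
            [max(0.0, rng.gauss(m, sigma)) for m in mean_times])
        flow_sum += sum(realized)                   # total flow time
        stab_sum += sum((r - q) ** 2 for r, q in zip(realized, planned))
    return flow_sum / runs, stab_sum / runs         # robustness, stability

rob, stab = estimate_measures([2.0, 3.0, 1.0])
```

For these inputs the expected total flow time is about 13 (the deterministic value), while the stability estimate approaches the sum of the completion-time variances, about 1.5.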
Next, we study the problem of optimizing both robustness and stability
simultaneously. We generate the set of all Pareto optimal points via the ε-constraint
method. We formulate the sub-problems required by the method and establish their
computational complexity status. Two variants of the method that work with only a
single type of sub-problem are also considered. A dominance rule and alternative ways to enforce the rule to strengthen one of these versions are discussed. The
performance of the proposed technique is evaluated with an experimental study. An
approach to limit the total number of generated points while keeping their spread
uniform is also proposed.
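The ε-constraint idea used above can be sketched on a toy bi-objective instance: minimize one objective subject to a bound ε on the other, then tighten ε past each point found. This is not the thesis's formulation; the two objectives (total completion time versus maximum lateness), the brute-force sub-problem solver, and the integer data are illustrative assumptions.

```python
from itertools import permutations

# Hedged sketch of the epsilon-constraint method on a toy bi-objective
# single-machine instance. Objectives and data are illustrative assumptions.

def total_completion_time(seq, p):
    c, total = 0, 0
    for j in seq:
        c += p[j]
        total += c
    return total

def max_lateness(seq, p, d):
    c, worst = 0, float("-inf")
    for j in seq:
        c += p[j]
        worst = max(worst, c - d[j])
    return worst

def epsilon_constraint(p, d):
    jobs = range(len(p))
    pareto, eps = [], float("inf")
    while True:
        # Sub-problem: minimize f1 subject to f2 <= eps (brute force here).
        feasible = [s for s in permutations(jobs) if max_lateness(s, p, d) <= eps]
        if not feasible:
            break
        best = min(feasible, key=lambda s: total_completion_time(s, p))
        point = (total_completion_time(best, p), max_lateness(best, p, d))
        pareto.append(point)
        eps = point[1] - 1          # integer data: step past the last point
    return pareto

front = epsilon_constraint(p=[3, 1, 2], d=[2, 6, 4])
```

Each iteration yields one point of the trade-off frontier; the bound then forces the next sub-problem to improve on the constrained objective.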
Finally, we consider the problem of generating stable schedules in a job shop
environment with processing time variability and random machine breakdowns. The
stability measure under consideration is the sum of the variances of the realized
completion times. We show that the problem is not in the class NP. Hence, a
surrogate stability measure is developed to manage the problem. This version of the
problem is proven to be NP-hard even without machine breakdowns. Two branch-and-bound
algorithms are developed for this case. Two heuristic algorithms, one based on beam
search and one on tabu search, are developed to handle realistic-size problems with
machine breakdowns. The results of extensive computational experiments are also
provided.
Gören, Selçuk. Ph.D.
Algorithms for Scheduling Problems
This edited book presents new results in the area of algorithm development for different types of scheduling problems. In eleven chapters, algorithms are given for single machine problems, flow-shop and job-shop scheduling problems (including their hybrid (flexible) variants), the resource-constrained project scheduling problem, scheduling problems in complex manufacturing systems and supply chains, and workflow scheduling problems. The chapters address such subjects as insertion heuristics for energy-efficient scheduling, the re-scheduling of train traffic in real time, control algorithms for short-term scheduling in manufacturing systems, bi-objective optimization of tortilla production, scheduling problems with uncertain (interval) processing times, workflow scheduling for digital signal processor (DSP) clusters, and many more.
Optimization of schedule robustness and stability under random machine breakdowns and processing time variability
In practice, scheduling systems are subject to considerable uncertainty in highly dynamic operating environments. The ability to cope with uncertainty in the scheduling process is becoming an increasingly important issue. This paper takes a proactive scheduling approach to study scheduling problems with two sources of uncertainty: processing time variability and machine breakdowns. Two robustness (expected total flow time and expected total tardiness) and three stability (the sum of the squared and absolute differences of the job completion times and the sum of the variances of the realized completion times) measures are defined. Special cases for which the measures can be easily optimized are identified. A dominance rule and two lower bounds for one of the robustness measures are developed and subsequently used in a branch-and-bound algorithm to solve the problem exactly. A beam search heuristic is also proposed to solve large problems for all five measures. The computational results show that the beam search heuristic is capable of generating robust schedules with little average deviation from the optimal objective function value (obtained via the branch-and-bound algorithm) and it performs significantly better than a number of heuristics available in the literature for all five measures. © 2010 IIE
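The mechanics of a beam search for sequencing can be shown in a minimal sketch. This is not the paper's heuristic: for clarity the objective below is plain deterministic total flow time, an illustrative stand-in for the robustness measures above, and the beam width is an arbitrary assumption.

```python
# Hedged sketch of beam search for single-machine sequencing. At each level,
# every partial schedule in the beam is extended by one unscheduled job and
# only the best `beam_width` children are kept.

def beam_search(proc_times, beam_width=2):
    n = len(proc_times)
    # Each node: (partial objective, current completion time, sequence tuple)
    beam = [(0, 0, ())]
    for _ in range(n):
        children = []
        for obj, c, seq in beam:
            for j in range(n):
                if j in seq:
                    continue
                nc = c + proc_times[j]
                children.append((obj + nc, nc, seq + (j,)))
        beam = sorted(children)[:beam_width]   # keep the best partial schedules
    return beam[0][2], beam[0][0]              # best sequence and its flow time

seq, flow = beam_search([4, 1, 3, 2], beam_width=2)
```

On this instance the search recovers the shortest-processing-time order, which is optimal for total flow time; with a stochastic measure, only the node-evaluation function would change.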
Bio-energy from Mountain Pine Beetle Timber and Forest Residuals: The Economics Story
In light of the large volumes of pine killed in the Interior forests of British Columbia by the mountain pine beetle, many are keen to employ forest biomass as an energy source. To assess the feasibility of a wood biomass-fired power plant in the BC Interior, it is necessary to know not only how much physical biomass might be available over the life of a plant, but also its location, because transportation costs are likely to be a major operating cost for any facility. To address these issues, we construct a mathematical programming model of fiber flows in the Quesnel Timber Supply Area of BC over a 25-year time horizon. The focus of the model is on minimizing the cost of supplying feedstock throughout space and time. Results indicate that over the life of the project feedstock costs will more than double, increasing from 0.039/kWh to 0.083/kWh.
Keywords: forest economics, biomass and bio-energy, forest pests
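The core cost-minimization logic can be illustrated in a drastically simplified form. This is not the study's programming model: with a single plant and independent supply areas, meeting demand at least delivered cost reduces to filling demand from the cheapest sources first; all volumes and costs below are invented illustrative numbers.

```python
# Hedged sketch: least-cost feedstock supply for a single plant. With one
# destination, cheapest-source-first is optimal. Data are illustrative
# assumptions, not the Quesnel TSA figures.

def cheapest_supply(demand, sources):
    """sources: list of (unit_cost, capacity); returns (total cost, plan)."""
    plan, total = [], 0.0
    for cost, cap in sorted(sources):               # cheapest first
        take = min(cap, demand)
        if take > 0:
            plan.append((cost, take))
            total += cost * take
            demand -= take
    if demand > 0:
        raise ValueError("insufficient feedstock")
    return total, plan

# Three supply areas with rising haul costs (per m3) and capacities (m3/yr).
total, plan = cheapest_supply(
    100_000, [(12.0, 60_000), (9.0, 30_000), (20.0, 50_000)])
```

The study's multi-period model adds the time dimension the sketch omits: cheap beetle-killed fiber is depleted early, so the marginal source, and hence the feedstock cost, rises over the project's life.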
Minimizing value-at-risk in single-machine scheduling
The vast majority of the machine scheduling literature focuses on deterministic problems in which all data is known with certainty a priori. In practice, this assumption implies that the random parameters in the problem are represented by their point estimates in the scheduling model. The resulting schedules may perform well if the variability in the problem parameters is low. However, as variability increases, accounting for this randomness explicitly in the model becomes crucial in order to counteract the ill effects of the variability on the system performance. In this paper, we consider single-machine scheduling problems in the presence of uncertain parameters. We impose a probabilistic constraint on the random performance measure of interest, such as the total weighted completion time or the total weighted tardiness, and introduce a generic risk-averse stochastic programming model. In particular, the objective of the proposed model is to find a non-preemptive static job processing sequence that minimizes the value-at-risk (VaR) of the random performance measure at a specified confidence level. We propose a Lagrangian relaxation-based scenario decomposition method to obtain lower bounds on the optimal VaR and provide a stabilized cut generation algorithm to solve the Lagrangian dual problem. Furthermore, we identify promising schedules for the original problem by a simple primal heuristic. An extensive computational study on two selected performance measures is presented to demonstrate the value of the proposed model and the effectiveness of our solution method.
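The VaR criterion itself is easy to evaluate for a fixed sequence under a finite scenario set: it is the α-quantile of the performance measure across scenarios. This is only the evaluation step, not the paper's decomposition method; the scenario set, weights, and quantile convention below are illustrative assumptions.

```python
import math

# Hedged sketch: value-at-risk of the total weighted completion time of a
# fixed sequence over equally likely processing-time scenarios. Data are
# illustrative assumptions.

def total_weighted_completion(seq, proc, weights):
    c, total = 0.0, 0.0
    for j in seq:
        c += proc[j]
        total += weights[j] * c
    return total

def value_at_risk(seq, scenarios, weights, alpha=0.95):
    """Smallest cost v such that at least alpha of the scenarios cost <= v."""
    costs = sorted(total_weighted_completion(seq, p, weights) for p in scenarios)
    k = max(0, math.ceil(alpha * len(costs)) - 1)   # index of the alpha-quantile
    return costs[k]

scenarios = [[2, 3, 1], [4, 3, 1], [2, 5, 1], [2, 3, 6]]  # equally likely
weights = [1.0, 1.0, 1.0]
var75 = value_at_risk((2, 0, 1), scenarios, weights, alpha=0.75)
```

Minimizing this quantity over all sequences is the hard part, which is what motivates the Lagrangian scenario decomposition in the paper.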
Population-based algorithms for improved history matching and uncertainty quantification of Petroleum reservoirs
In modern field management practices, there are two important steps that shed light on a multimillion dollar investment. The first step is history matching where the simulation model is calibrated to reproduce the historical observations from the field. In this inverse problem, different geological and petrophysical properties may provide equally good history matches. Such diverse models are likely to show different production behaviors in future. This ties the history matching with the second step, uncertainty quantification of predictions. Multiple history matched models are essential for a realistic uncertainty estimate of the future field behavior. These two steps facilitate decision making and have a direct impact on technical and financial performance of oil and gas companies.
Population-based optimization algorithms have recently enjoyed growing popularity for solving engineering problems. Population-based systems work with a group of individuals that cooperate and communicate to accomplish a task that is normally beyond the capabilities of each individual. These individuals are deployed with the aim of solving the problem with maximum efficiency.
This thesis introduces the application of two novel population-based algorithms for history matching and uncertainty quantification of petroleum reservoir models. Ant colony optimization and differential evolution algorithms are used to search the space of parameters to find multiple history matched models and, using a Bayesian framework, the posterior probabilities of the models are evaluated for prediction of reservoir performance.
It is demonstrated that by bringing in the latest developments in computer science, such as ant colony optimization, differential evolution, and multiobjective optimization, we can improve the history matching and uncertainty quantification frameworks. This thesis provides insights into the performance of these algorithms in history matching and prediction and develops an understanding of their tuning parameters. The research also presents a comparative study of these methods against a benchmark technique, the Neighbourhood Algorithm. This comparison reveals the superiority of the proposed methodologies in areas such as computational efficiency and match quality.
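The differential-evolution search used in the thesis follows a standard mutate-crossover-select loop, sketched minimally below. The quadratic "misfit" is a toy stand-in for the mismatch between simulated and observed production data, and the population size, F, and CR values are illustrative assumptions, not the thesis's tuned settings.

```python
import random

# Hedged sketch of a DE/rand/1-style search minimizing a toy misfit.
# In history matching, misfit(x) would run a reservoir simulation and
# compare its output with field observations.

def misfit(x, target=(0.3, 0.7)):
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def differential_evolution(dim=2, pop_size=20, F=0.6, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([p for k, p in enumerate(pop) if k != i], 3)
            # Per-dimension crossover between the mutant a + F*(b - c)
            # and the current individual.
            trial = [ai + F * (bi - ci) if rng.random() < CR else xi
                     for xi, ai, bi, ci in zip(pop[i], a, b, c)]
            if misfit(trial) <= misfit(pop[i]):     # greedy selection
                pop[i] = trial
    return min(pop, key=misfit)

best = differential_evolution()
```

Because the surviving population clusters in low-misfit regions, its members double as the multiple history-matched models needed for Bayesian uncertainty quantification.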