
    Economic analyses for the evaluation of IS projects

    Information system projects usually involve numerous uncertainties and risk conditions that make their economic evaluation a challenging task. Each year, several information system projects are cancelled before completion as a result of budget overruns, at a cost of several billions of dollars to industry. Although engineering economic analysis offers tools and techniques for evaluating risky projects, these tools are not enough to place information system projects on a safe budgeting/selection track. There is a need for an integrative economic analysis model that accounts for the uncertainties in estimating the costs, benefits, and useful lives of uncertain and risky projects. Fuzzy set theory can represent vague data and allows mathematical operators and programming to be applied to the fuzzy domain; the theory is primarily concerned with quantifying the vagueness in human thoughts and perceptions. In this article, the economic evaluation of information system projects using fuzzy present value and fuzzy B/C ratio is analyzed. A numerical illustration is included to demonstrate the effectiveness of the proposed methods.
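    The fuzzy present value and fuzzy B/C ratio mentioned in the abstract can be sketched with triangular fuzzy numbers (a, b, c), where a is the pessimistic, b the most likely, and c the optimistic value. This is a minimal illustration under standard triangular fuzzy arithmetic, not the article's own implementation; all cash-flow figures are invented.

```python
# Triangular fuzzy number (TFN) arithmetic for fuzzy present value and
# fuzzy B/C ratio. A TFN is a tuple (a, b, c) with a <= b <= c.

def tfn_scale(tfn, k):
    """Multiply a TFN by a positive crisp scalar (e.g. a discount factor)."""
    a, b, c = tfn
    return (a * k, b * k, c * k)

def tfn_add(x, y):
    """Component-wise addition of two TFNs."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def fuzzy_present_value(cash_flows, rate):
    """Discount a list of fuzzy annual cash flows at a crisp interest rate."""
    pv = (0.0, 0.0, 0.0)
    for t, cf in enumerate(cash_flows, start=1):
        pv = tfn_add(pv, tfn_scale(cf, (1 + rate) ** -t))
    return pv

def fuzzy_bc_ratio(benefits, costs):
    """B/C ratio of two strictly positive TFNs (standard approximation)."""
    return (benefits[0] / costs[2], benefits[1] / costs[1], benefits[2] / costs[0])

# Three years of uncertain net cash flows, discounted at 10%.
flows = [(30.0, 40.0, 50.0)] * 3
pv = fuzzy_present_value(flows, 0.10)
bc = fuzzy_bc_ratio((80.0, 100.0, 120.0), (40.0, 50.0, 60.0))
```

    A project can then be screened by, for example, requiring the most-likely B/C component to exceed 1 and the pessimistic PV component to stay non-negative.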

    Multilevel decision-making: A survey

    © 2016 Elsevier Inc. All rights reserved. Multilevel decision-making techniques aim to deal with decentralized management problems that feature interactive decision entities distributed throughout a multiple-level hierarchy. Significant efforts have been devoted to understanding the fundamental concepts and developing diverse solution algorithms associated with multilevel decision-making by researchers in both mathematics/computer science and business. Researchers have emphasized the importance of developing a range of multilevel decision-making techniques to handle a wide variety of management and optimization problems in real-world applications, and have successfully gained experience in this area. It is thus vital that a high-quality, instructive review of current trends be conducted, covering not only theoretical research results but also practical developments in multilevel decision-making in business. This paper systematically reviews up-to-date multilevel decision-making techniques and clusters related developments into four main categories: bi-level decision-making (including multi-objective and multi-follower situations), tri-level decision-making, fuzzy multilevel decision-making, and the applications of these techniques in different domains. By providing state-of-the-art knowledge, this survey directly supports researchers and practitioners in understanding developments in theoretical research and applications relating to multilevel decision-making techniques.
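    The bi-level case surveyed above has a characteristic structure: a leader commits to a decision, a follower then optimizes its own objective given that decision, and the leader must anticipate this rational reaction. A minimal sketch by exhaustive enumeration over small discrete decision sets (the objectives below are invented for illustration, not taken from the survey):

```python
# Bi-level decision-making by enumeration: the leader fixes x, the follower
# then optimally chooses y, and the leader anticipates that reaction.

def follower_best_response(x, ys):
    """Follower minimizes its own cost (y - x)**2 + y given the leader's x."""
    return min(ys, key=lambda y: (y - x) ** 2 + y)

def solve_bilevel(xs, ys):
    """Leader maximizes x * y, anticipating the follower's rational reaction."""
    x_star = max(xs, key=lambda x: x * follower_best_response(x, ys))
    return x_star, follower_best_response(x_star, ys)

xs = range(0, 6)   # leader's discrete decision set
ys = range(0, 6)   # follower's discrete decision set
x_star, y_star = solve_bilevel(xs, ys)
```

    Enumeration only works for tiny instances; the solution algorithms the survey reviews (vertex enumeration, KKT reformulations, evolutionary methods) exist precisely because real bi-level problems are NP-hard.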

    Role of Optimal Production Plan at the Focal Firm in Optimization of the Supply Chain

    Supply chain management and optimization is a critical aspect of modern enterprises and an expanding area of research. Modeling and optimization are the traditional tools of supply chain management, and these techniques have been used by many companies for planning, manufacturing, and other decision areas in supply chains. The current study is motivated by the fact that optimization studies in supply chain management have mostly considered network optimization. Supply chain management, however, requires alignment between the supply chain partners at the tactical level. As a first step towards achieving this goal, the current study presents a model that incorporates activity-level planning at the focal firm in a supply chain. This paper presents a new mixed integer programming model that optimizes production planning at the focal firm while optimizing the strategic alignment of the supply chain entities. The model represents a four-stage, multi-echelon supply chain comprising supplier, warehouse, manufacturer, and retailer; the manufacturer in this network represents the focal firm. The model is an attempt to integrate production planning decisions into the network optimization decisions.
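    To make the flavor of such a mixed integer model concrete, here is a toy sketch with one binary decision (which warehouses to open) coupled to a continuous flow decision (routing the focal firm's production to meet retailer demand), solved by brute-force enumeration. All names, costs, and capacities are invented; the paper's actual model is far larger and solved with a MIP solver.

```python
# Toy mixed integer decision: open/close warehouses (binary variables) and
# route the focal manufacturer's production to the retailer at minimum cost.
from itertools import product

DEMAND = 50
WAREHOUSES = {"W1": {"fixed": 100.0, "unit": 2.0, "cap": 60},
              "W2": {"fixed": 60.0, "unit": 3.0, "cap": 40}}
PROD_UNIT_COST = 5.0  # focal firm's cost per unit produced

def solve():
    best = None
    # Enumerate every open/closed pattern over the warehouses.
    for pattern in product([0, 1], repeat=len(WAREHOUSES)):
        open_ws = [w for w, o in zip(WAREHOUSES, pattern) if o]
        if sum(WAREHOUSES[w]["cap"] for w in open_ws) < DEMAND:
            continue  # infeasible: cannot route enough product to the retailer
        # Ship greedily through the cheapest open warehouses first.
        remaining, ship_cost = DEMAND, 0.0
        for w in sorted(open_ws, key=lambda w: WAREHOUSES[w]["unit"]):
            q = min(remaining, WAREHOUSES[w]["cap"])
            ship_cost += q * WAREHOUSES[w]["unit"]
            remaining -= q
        total = (sum(WAREHOUSES[w]["fixed"] for w in open_ws)
                 + ship_cost + DEMAND * PROD_UNIT_COST)
        if best is None or total < best[0]:
            best = (total, open_ws)
    return best

cost, opened = solve()
```

    The coupling is the essential point: the binary open/close choices constrain the feasible flows, which is what makes the model mixed integer rather than a plain linear program.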

    Multiobjective strategies for New Product Development in the pharmaceutical industry

    New Product Development (NPD) constitutes a challenging problem in the pharmaceutical industry due to the characteristics of the development pipeline. Formally, the NPD problem can be stated as follows: select a set of R&D projects from a pool of candidate projects in order to satisfy several criteria (economic profitability, time to market) while coping with the uncertain nature of the projects. More precisely, the recurrent key issues are to determine which projects to develop once target molecules have been identified, their order, and the level of resources to assign. The proposed approach combines discrete event stochastic simulation (a Monte Carlo approach) with multiobjective genetic algorithms (NSGA-II, Non-dominated Sorting Genetic Algorithm II) to optimize the highly combinatorial portfolio management problem. Genetic Algorithms (GAs) are particularly attractive for this kind of problem because of their ability to lead directly to the Pareto front and to account for the combinatorial aspect. The work is illustrated with a case study involving nine interdependent new product candidates targeting three diseases. An analysis is performed on this test bench for the different pairs of criteria in both bi- and tricriteria optimization: large portfolios cause resource queues and delay time to launch, and are eliminated by the bi- and tricriteria optimization strategies. The optimization strategy is thus useful for identifying candidate development sequences. Time is an important criterion to consider simultaneously with the NPV and risk criteria, and the order in which drugs are released into the pipeline is of great importance, as in scheduling problems.
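    The core multiobjective idea behind NSGA-II style selection is non-dominated sorting: a portfolio survives only if no other portfolio is at least as good on every criterion and strictly better on one. A minimal sketch of extracting the Pareto front for a bicriteria case (NPV to maximize, risk to minimize), with made-up portfolio values rather than the paper's case-study data:

```python
# Pareto-front extraction, the selection kernel of NSGA-II style algorithms.
# Each point is (npv, risk): NPV is maximized, risk is minimized.

def dominates(a, b):
    """True if a dominates b: no worse on both objectives, better on one."""
    npv_a, risk_a = a
    npv_b, risk_b = b
    return (npv_a >= npv_b and risk_a <= risk_b
            and (npv_a > npv_b or risk_a < risk_b))

def pareto_front(points):
    """Keep only the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

portfolios = [(100, 0.9), (80, 0.3), (120, 0.95), (60, 0.2), (70, 0.5)]
front = pareto_front(portfolios)
```

    In the full algorithm this sorting is applied repeatedly, layer by layer, with a crowding-distance measure to spread solutions along the front; a third criterion such as time to launch extends `dominates` to three components in the same way.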

    Market and Economic Modelling of the Intelligent Grid: End of Year Report 2009

    The overall goal of Project 2 has been to provide a comprehensive understanding of the impacts of distributed generation (DG) on the Australian electricity system. The research team at the UQ Energy Economics and Management Group (EEMG) has constructed a variety of sophisticated models to analyse the various impacts of significant increases in DG. These models stress that the spatial configuration of the grid really matters; this has tended to be neglected in economic discussions of the costs of DG relative to conventional, centralized power generation. The modelling also makes it clear that efficient storage systems will often be critical in solving transient stability problems on the grid as we move to the greater provision of renewable DG. We show that DG can help to defer transmission investments under certain conditions. The existing grid structure was constructed with different priorities in mind, and we show that its replacement can come at a prohibitive cost unless the capability of the local grid to accommodate DG is assessed very carefully.
    Keywords: Distributed Generation, Energy Economics, Electricity Markets, Renewable Energy

    Application of Simple Smart Logic for Waterflooding Reservoir Management

    A simple smart logic for controlling inflow control valves (ICVs) in waterflooding reservoir management is implemented and analyzed, with the final objective of improving the long-term financial return of a petroleum reservoir. The control is based on a simple reactive logic that responds to the watercut measured at the ICV: when the watercut increases, the ICV is set to close proportionally. For comparison purposes, four strategies are presented: a base case scenario with conventional control, the best completion configuration found by trial and error, the reactive control, and a deterministic optimal control based on a nonlinear gradient method with an adjoint-gradient formulation. Finally, all four strategies are tested again on different reservoir realizations in order to mimic the geological uncertainties. Two synthetic reservoir models were studied. The first is a simple cube with a five-spot well configuration, in which the permeability field has a horizontal pattern defined by lognormal distributions. The second is a benchmark proposed by the Dutch university TU Delft, with 101 channelized permeability fields representing river patterns. For the first model, no significant relative gain is found with either the reactive control or the optimal control, mainly because of the high homogeneity of the reservoir model; therefore, no intelligent completion is recommended. For the second and more complex case, on the other hand, the results indicate a substantial relative gain from the simple reactive logic, and this type of control achieves results nearly as good as the optimal control. The tests on different realizations, however, show that reservoir characterization is still a key part of any attempt to improve production. Although the reactive control is semi-independent, with action taken based on measurements, some parameters still require an a priori model for tuning.
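    The proportional-closing rule described above can be sketched as a single mapping from measured watercut to valve opening. The thresholds and the linear shape below are illustrative assumptions; these are exactly the kind of parameters the abstract notes must be tuned against an a priori reservoir model.

```python
# Reactive ICV rule: as the measured watercut rises, close the valve
# proportionally. Thresholds are assumed values, not from the study.

def icv_opening(watercut, wc_start=0.2, wc_full=0.9):
    """Return a valve opening fraction in [0, 1] for a measured watercut.

    Fully open below wc_start, fully closed above wc_full, and closed
    linearly in proportion to the watercut in between.
    """
    if watercut <= wc_start:
        return 1.0
    if watercut >= wc_full:
        return 0.0
    return 1.0 - (watercut - wc_start) / (wc_full - wc_start)
```

    Applied at each control step per valve, this rule needs only local measurements, which is what makes it "semi-independent" compared with the adjoint-gradient optimal control that requires a full reservoir model.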