
    Multiobjective strategies for New Product Development in the pharmaceutical industry

    New Product Development (NPD) is a challenging problem in the pharmaceutical industry, due to the characteristics of the development pipeline. Formally, the NPD problem can be stated as follows: select a set of R&D projects from a pool of candidate projects so as to satisfy several criteria (economic profitability, time to market) while coping with the uncertain nature of the projects. More precisely, the recurrent key issues are to determine which projects to develop once target molecules have been identified, in what order, and what level of resources to assign to them. The proposed approach combines discrete event stochastic simulation (a Monte Carlo approach) with multiobjective genetic algorithms (of the NSGA-II type, Non-dominated Sorting Genetic Algorithm II) to optimize this highly combinatorial portfolio management problem. Genetic Algorithms (GAs) are particularly attractive for this kind of problem because they lead directly to the Pareto front and handle the combinatorial aspect. The work is illustrated with a case study involving nine interdependent new product candidates targeting three diseases. For this test bench, an analysis is performed on the different pairs of criteria for both bi- and tricriteria optimization: large portfolios cause resource queues, delay time to launch, and are eliminated by the bi- and tricriteria optimization strategies. The optimization strategy is thus useful for identifying candidate development sequences. Time is an important criterion to consider simultaneously with the NPV and risk criteria, and the order in which drugs enter the pipeline is of great importance, as in scheduling problems.
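
    To make the simulation-plus-optimization loop concrete, the sketch below evaluates hypothetical drug-candidate portfolios by Monte Carlo (expected NPV versus expected time to last launch, with projects queuing for a limited number of development slots) and then extracts the Pareto front. All figures and the capacity model are invented for illustration; with only nine candidates the portfolio space can be enumerated, so a plain non-dominated filter over enumerated project subsets stands in for the NSGA-II search used in the paper, and project ordering is taken as given rather than optimized.

        import itertools
        import random

        # Illustrative candidate data (NOT from the paper): NPV if the project
        # succeeds (arbitrary money units), probability of success, duration (months).
        CANDIDATES = [
            (120, 0.30, 48), (80, 0.50, 36), (150, 0.20, 60),
            (60, 0.60, 30), (100, 0.40, 42), (90, 0.45, 40),
            (140, 0.25, 54), (70, 0.55, 34), (110, 0.35, 46),
        ]
        CAPACITY = 3       # assumed number of projects the pipeline can run in parallel
        N_SAMPLES = 500    # Monte Carlo samples per portfolio

        def simulate(portfolio, rng):
            """One Monte Carlo sample: total NPV and time of the last launch,
            with projects queuing for CAPACITY development slots in the given order."""
            total_npv, launch_times, busy = 0.0, [0.0], [0.0] * CAPACITY
            for idx in portfolio:
                npv, p_success, duration = CANDIDATES[idx]
                slot = min(range(CAPACITY), key=busy.__getitem__)  # first free slot
                busy[slot] += duration                             # queueing delays launch
                if rng.random() < p_success:                       # development succeeds?
                    total_npv += npv
                    launch_times.append(busy[slot])
            return total_npv, max(launch_times)

        def evaluate(portfolio, rng):
            """Expected NPV (to maximise) and expected time to last launch (to minimise)."""
            samples = [simulate(portfolio, rng) for _ in range(N_SAMPLES)]
            return (sum(s[0] for s in samples) / N_SAMPLES,
                    sum(s[1] for s in samples) / N_SAMPLES)

        def pareto_front(points):
            """Keep the portfolios that are non-dominated on (max NPV, min time)."""
            return [(p, obj) for p, obj in points
                    if not any(o[0] >= obj[0] and o[1] <= obj[1] and o != obj
                               for _, o in points)]

        rng = random.Random(42)
        all_portfolios = [c for r in range(1, len(CANDIDATES) + 1)
                          for c in itertools.combinations(range(len(CANDIDATES)), r)]
        points = [(p, evaluate(p, rng)) for p in all_portfolios]
        for portfolio, (npv, time_to_launch) in sorted(pareto_front(points), key=lambda x: x[1]):
            print(portfolio, round(npv, 1), round(time_to_launch, 1))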


    Integration of cost-risk assessment of denial of service within an intelligent maintenance system

    As organisations become richer in data, the asset management function will increasingly have to use intelligent systems to control condition monitoring systems and organise maintenance. In the future, the UK rail industry anticipates having to optimise capacity by running trains closer to each other. In this situation maintenance becomes extremely problematic: within such a high-performance network, a relatively minor fault will impact more trains and passengers, and such denial of service causes reputational damage for the industry and leads to fines being levied against the infrastructure owner, Network Rail. Intelligent systems used to control condition monitoring systems will need to optimise for several factors, and minimising denial of service will be one of them. With schedules anticipated to become increasingly complicated, detailed estimation methods will be extremely difficult to implement. Cost prediction of maintenance activities tends to be expert driven and to require extensive detail, making automation of such an activity difficult. A stochastic approach is therefore needed to predict the denial of service arising from any required maintenance, and good uncertainty modelling will help to increase confidence in the estimates. This paper details the challenges the UK railway industry faces with regard to cost modelling of maintenance activities and outlines an example of a suitable cost model for quantifying cost uncertainty. The proposed uncertainty quantification is based on historical cost data and the interpretation of its statistical distributions. These estimates are then integrated into a cost model, and Monte Carlo simulation is used to obtain uncertainty measurements of the outputs. An additional criterion for the model was that it be suitable for integration into an existing prototype integrated intelligent maintenance system. It is anticipated that applying an integrated maintenance management system will put significant downward pressure on maintenance budgets and reduce denial of service, so accurate cost estimation is of great importance if the anticipated cost efficiencies are to be achieved. While the rail industry has been the focus of this work, other industries have been considered, and it is anticipated that the approach will be applicable to many other organisations across several asset-management-intensive industries.
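
    To make the uncertainty-quantification idea concrete, the sketch below propagates a few hypothetical maintenance cost drivers, each given an assumed statistical distribution of the kind one would fit to historical cost records, through a simple additive cost model by Monte Carlo sampling and reports percentiles rather than a point estimate. The distribution families, parameter values, and the penalty rate for denial of service are assumptions for illustration, not figures from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000  # number of Monte Carlo samples

        # Hypothetical cost drivers; in practice the distribution families and their
        # parameters would be fitted to historical maintenance cost records.
        labour_hours  = rng.lognormal(mean=np.log(12), sigma=0.4, size=N)       # hours per job
        labour_rate   = rng.normal(loc=55.0, scale=5.0, size=N)                 # GBP per hour
        spares_cost   = rng.triangular(left=200, mode=450, right=1500, size=N)  # GBP per job
        delay_minutes = rng.gamma(shape=2.0, scale=30.0, size=N)                # denial of service
        fine_per_min  = 70.0                                                    # assumed penalty rate, GBP

        # Simple additive cost model evaluated once per Monte Carlo sample.
        total_cost = labour_hours * labour_rate + spares_cost + delay_minutes * fine_per_min

        # Report the distribution of the output instead of a single point estimate.
        p5, p50, p95 = np.percentile(total_cost, [5, 50, 95])
        print(f"median cost {p50:,.0f} GBP, 90% interval [{p5:,.0f}, {p95:,.0f}] GBP")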

    The safety case and the lessons learned for the reliability and maintainability case

    This paper examines the safety case and the lessons learned for the reliability and maintainability case.

    A review of tools, models and techniques for long-term assessment of distribution systems using OpenDSS and parallel computing

    Many distribution system studies require long-term evaluations (e.g. for one year or more): energy loss minimization, reliability assessment, or optimal rating of distributed energy resources should be based on long-term simulations of the distribution system. This paper summarizes the work carried out by the authors to perform long-term studies of large distribution systems using an OpenDSS-MATLAB environment and parallel computing. The paper details the tools, models, and procedures used by the authors for optimal allocation of distributed resources, reliability assessment of distribution systems with and without distributed generation, optimal rating of energy storage systems, and impact analysis of the solid state transformer. Since in most cases the developed procedures were implemented for a multicore installation, a summary of the capabilities required for parallel computing applications is also included. The approaches chosen for those studies used the traditional Monte Carlo method, clustering techniques, or genetic algorithms. Custom-made models for use with OpenDSS were required in some studies; a summary of the characteristics of those models and their implementation is also included.
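
    A year-long study of a large feeder decomposes naturally into many daily or snapshot runs, which is what makes multicore execution attractive. The sketch below shows only this parallel pattern: each worker would drive an OpenDSS instance (for example through the COM interface from MATLAB, as in the authors' environment), but the actual power-flow call is abstracted behind a placeholder function, and treating days as independent is a simplification that would not hold for studies with inter-day state such as storage.

        from multiprocessing import Pool

        def simulate_day(day):
            """Placeholder for one daily, hourly-resolved power-flow run.
            A real worker would load the circuit into OpenDSS, set the load
            shapes for `day`, solve, and read back losses and voltages; here
            it returns dummy values so the parallel pattern is runnable as-is."""
            losses_kwh = 0.0  # would come from the solved OpenDSS circuit
            return {"day": day, "losses_kwh": losses_kwh}

        if __name__ == "__main__":
            days = range(365)                    # one simulated year, day by day
            with Pool(processes=8) as pool:      # one worker per available core
                results = pool.map(simulate_day, days)
            annual_losses = sum(r["losses_kwh"] for r in results)
            print(f"annual losses: {annual_losses:.1f} kWh")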

    Review of Researches on Techno-Economic Analysis and Environmental Impact of Hybrid Energy Systems

    Hybrid energy systems, which combine two or more renewable and non-renewable energy sources, have been identified as a viable way to address the limitations of a single renewable energy source used for electricity generation. In view of this, several research works have been carried out to determine the optimal mix of the different renewable and non-renewable energy resources used for electricity generation. This paper presents a comprehensive review of the optimization approaches proposed and adopted in the literature for optimal sizing of hybrid energy systems. It is observed that the objective functions considered by a large proportion of researchers when optimizing the sizing of hybrid energy systems are minimization of the cost of the generated electricity, enhancement of system reliability, and reduction of environmental pollution. Other factors covered in the literature are also discussed in this article, as are the simulation and optimization software packages used for the same purpose. In essence, the main aim of this paper is to provide an overview of the work that has been carried out on hybrid energy systems for electricity generation, with a view to informing researchers and the public alike about trends in the methods applied to optimal sizing of hybrid energy systems. The information provided in this paper is intended to help advance research in the field.
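
    Two of the objective families mentioned above can be written down compactly; the sketch below computes an annualized cost using the standard capital recovery factor and a loss-of-power-supply probability (LPSP) as unmet energy over total demand. The numbers are placeholders, and a real sizing study would evaluate these objectives over a full year of load and resource data for each candidate system size.

        # Illustrative sizing-objective calculations (assumed formulas and numbers,
        # not taken from any specific paper covered by the review).
        def annualized_cost(capital, lifetime_years, discount_rate, annual_om):
            """Capital recovery factor times capital cost, plus annual O&M."""
            crf = discount_rate * (1 + discount_rate) ** lifetime_years / \
                  ((1 + discount_rate) ** lifetime_years - 1)
            return capital * crf + annual_om

        def lpsp(load_kwh, served_kwh):
            """Loss of Power Supply Probability: unmet energy over total demand."""
            unserved = [max(l - s, 0.0) for l, s in zip(load_kwh, served_kwh)]
            return sum(unserved) / sum(load_kwh)

        # Example: a hypothetical PV + battery system over four periods.
        cost = annualized_cost(capital=250_000, lifetime_years=20,
                               discount_rate=0.06, annual_om=5_000)
        reliability = lpsp(load_kwh=[100, 120, 90, 110], served_kwh=[100, 100, 90, 110])
        print(f"annualized cost: {cost:,.0f} per year, LPSP: {reliability:.3f}")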

    An Evolutionary Computational Approach for the Problem of Unit Commitment and Economic Dispatch in Microgrids under Several Operation Modes

    In recent decades, new types of generation technologies have emerged and been gradually integrated into existing power systems, moving their classical architectures towards distributed systems. Despite the positive features associated with this paradigm, new problems arise, such as coordination and uncertainty. In this framework, microgrids constitute an effective solution for the coordination and operation of these distributed energy resources. This paper proposes a Genetic Algorithm (GA) to address the combined problem of Unit Commitment (UC) and Economic Dispatch (ED). To this end, a model of a microgrid is introduced together with all the control variables and physical constraints. To operate the microgrid optimally, three operation modes are introduced: the first two optimize economic and environmental factors, while the last accounts for the errors induced by uncertainties in the demand forecast, yielding a robust design that guarantees the power supply for different confidence levels. Finally, the algorithm was applied to an example scenario to illustrate its performance, and the simulation results demonstrate the validity of the proposed approach.
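
    As a rough illustration of how a GA can tackle the combined UC/ED problem, the sketch below encodes an on/off commitment decision per unit and per period, resolves the dispatch of committed units by merit order inside the fitness function, and penalizes unserved demand. The unit data, demand profile, and the simple truncation-selection GA are assumptions for illustration only and do not reproduce the paper's microgrid model or its three operation modes.

        import random

        # Illustrative microgrid data (assumed, not the paper's case study).
        # Each unit: (min power kW, max power kW, cost per kWh).
        UNITS = [(10, 60, 0.09), (5, 40, 0.12), (0, 30, 0.20)]
        DEMAND = [55, 70, 90, 65, 50, 45]      # forecast demand per period (kW)
        HOURS, N_UNITS = len(DEMAND), len(UNITS)
        PENALTY = 1e3                          # cost per kW of unserved demand

        def dispatch_cost(commitment):
            """Economic dispatch for a given on/off schedule: load committed units
            cheapest-first within their limits and penalise any unserved demand."""
            cost = 0.0
            for t, demand in enumerate(DEMAND):
                remaining = demand
                for u in sorted(range(N_UNITS), key=lambda i: UNITS[i][2]):  # merit order
                    if not commitment[t][u]:
                        continue
                    pmin, pmax, c = UNITS[u]
                    p = min(max(remaining, pmin), pmax)   # respect unit limits
                    cost += p * c
                    remaining -= p
                cost += max(remaining, 0.0) * PENALTY     # unmet demand penalty
            return cost

        def random_individual():
            return [[random.random() < 0.7 for _ in range(N_UNITS)] for _ in range(HOURS)]

        def crossover(a, b):
            cut = random.randrange(1, HOURS)              # one-point crossover over periods
            return a[:cut] + b[cut:]

        def mutate(ind, rate=0.05):
            return [[(not g) if random.random() < rate else g for g in hour] for hour in ind]

        def ga(pop_size=40, generations=100):
            pop = [random_individual() for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=dispatch_cost)               # cheapest schedules first
                survivors = pop[: pop_size // 2]          # elitist truncation selection
                children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                            for _ in range(pop_size - len(survivors))]
                pop = survivors + children
            return min(pop, key=dispatch_cost)

        best = ga()
        print("best schedule cost:", round(dispatch_cost(best), 2))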