
    Multiobjective strategies for New Product Development in the pharmaceutical industry

    New Product Development (NPD) constitutes a challenging problem in the pharmaceutical industry due to the characteristics of the development pipeline. Formally, the NPD problem can be stated as follows: select a set of R&D projects from a pool of candidates so as to satisfy several criteria (economic profitability, time to market) while coping with the uncertain nature of the projects. More precisely, the recurrent key issues are to determine which projects to develop once target molecules have been identified, in what order, and with what level of resources. The proposed approach combines discrete-event stochastic simulation (a Monte Carlo approach) with multiobjective genetic algorithms (of the NSGA-II type, Non-dominated Sorting Genetic Algorithm II) to optimize this highly combinatorial portfolio management problem. Genetic algorithms (GAs) are particularly attractive for this kind of problem because they lead directly to the Pareto front and handle the combinatorial aspect. The work is illustrated with a case study involving nine interdependent new product candidates targeting three diseases. For this test bench, the different pairs of criteria are analyzed in both bi- and tricriteria optimization: large portfolios cause resource queues, delay time to launch, and are eliminated by the bi- and tricriteria optimization strategies. The optimization strategy is thus useful for identifying candidate sequences. Time is an important criterion to consider simultaneously with NPV and risk, and the order in which drugs enter the pipeline is of great importance, as in scheduling problems.
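    A minimal sketch of the Pareto-dominance filter that NSGA-II builds on, assuming each candidate portfolio has already been scored by the Monte Carlo simulation on two minimization objectives (illustrative scores, not the paper's data):

```python
def pareto_front(points):
    """Return the non-dominated points for bi-objective minimization.

    A point q dominates p when q is no worse in both objectives and
    differs from p (hence strictly better in at least one).
    """
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1]
                       for q in points)]

# Hypothetical (time_to_market, -NPV) scores for four portfolios:
scores = [(3, -100), (2, -80), (4, -120), (3, -90)]
front = pareto_front(scores)  # (3, -90) drops out: dominated by (3, -100)
```

NSGA-II extends this idea with iterated non-dominated sorting and a crowding-distance measure that keeps the front well spread.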


    Production planning of biopharmaceutical manufacture.

    Multiproduct manufacturing facilities running on a campaign basis are increasingly becoming the norm for biopharmaceuticals, owing to high risks of clinical failure, regulatory pressures and the increasing number of therapeutics in clinical evaluation. The need for such flexible plants and cost-effective manufacture poses significant challenges for planning and scheduling, which are compounded by long production lead times, intermediate product stability issues and the high-cost, low-volume nature of biopharmaceutical manufacture. Scheduling and planning decisions are often made in the presence of variable product titres, campaign durations, contamination rates and product demands. Hence this thesis applies mathematical programming techniques to the planning of biopharmaceutical manufacture in order to identify better production plans under different manufacturing scenarios. A deterministic mixed-integer linear programming (MILP) medium-term planning model which explicitly accounts for upstream and downstream processing is presented. A multiscenario MILP model for the medium-term planning of biopharmaceutical manufacture under uncertainty is presented and solved using an iterative solution procedure. An alternative stochastic formulation for the medium-term planning of biomanufacture under uncertainty, based on the principles of chance-constrained programming, is also presented. To help manage the risks of long-term capacity planning in the biopharmaceutical industry, a goal programming extension is presented which accounts for multiple objectives including cost, risk and customer service level satisfaction. The model is applied to long-term capacity analysis of a mix of contract and owned biopharmaceutical manufacturing facilities. The final sections of this thesis present an example of a commercial application of this work, followed by a discussion of related validation issues in the biopharmaceutical industry.
The work in this thesis highlights the benefits of applying mathematical programming techniques to the production planning of biopharmaceutical manufacturing facilities, so as to enhance the biopharmaceutical industry's strategic and operational decision-making towards more cost-effective manufacture.
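    The chance-constrained idea can be illustrated with its classic deterministic equivalent: if batch titres are modeled as i.i.d. normal variables, the constraint P(total yield >= demand) >= service level reduces to an algebraic condition on the batch count. A toy sketch under those assumptions (not the thesis formulation):

```python
from math import sqrt
from statistics import NormalDist

def min_batches(demand, mu, sigma, service_level=0.95):
    """Smallest batch count b with P(total yield >= demand) >= service_level,
    where the total yield of b i.i.d. normal batches is N(b*mu, sqrt(b)*sigma)."""
    z = NormalDist().inv_cdf(1.0 - service_level)  # negative lower quantile
    b = 1
    while b * mu + z * sqrt(b) * sigma < demand:
        b += 1
    return b
```

Raising the service level (or the titre variability sigma) pushes the deterministic equivalent to demand more batches than the mean-value plan would.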

    Economic and environmental strategies for process design

    This paper first addresses the definition of the various objectives involved in eco-efficient processes, taking into account ecological and economic considerations simultaneously. The environmental aspect at the preliminary design phase of chemical processes is quantified using a set of metrics or indicators following published sustainability guidelines. The resulting multiobjective problem is solved by a genetic algorithm, following an improved variant of the NSGA-II algorithm. A key point for evaluating environmental burdens is the use of the ARIANEℱ package, a decision support tool dedicated to the management of plant utilities (steam, electricity, hot water, etc.) and pollutants (CO2, SO2, NO, etc.), implemented here both to compute the primary energy requirements of the process and to quantify its pollutant emissions. The well-known benchmark process for hydrodealkylation (HDA) of toluene to produce benzene, revisited here in a multiobjective optimization framework, illustrates the approach for finding eco-friendly and cost-effective designs. Preliminary biobjective studies are carried out to eliminate redundant environmental objectives. The trade-off between economic and environmental objectives is illustrated through Pareto curves. To aid decision making among the alternatives generated after this step, a synthetic evaluation method based on the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) was first used. A simpler procedure named FUCA was also implemented and demonstrated its efficiency relative to TOPSIS. Two scenarios are studied: in the first, the goal is to find the best trade-off between economic and ecological aspects, while the second aims at the best compromise between the economic objective and a stricter environmental impact criterion.
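    TOPSIS itself is standard: each alternative is scored by its relative closeness to an ideal point built from the best value of every criterion. A compact sketch with an illustrative (profit, emissions) decision matrix rather than the HDA results:

```python
import math

def topsis(matrix, weights, benefit):
    """Score alternatives (rows) by closeness to the ideal solution.

    benefit[j] is True when criterion j should be maximized.
    """
    m, n = len(matrix), len(matrix[0])
    # Vector normalization, then weighting
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [(max if benefit[j] else min)(V[i][j] for i in range(m)) for j in range(n)]
    worst = [(min if benefit[j] else max)(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((V[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((V[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three designs scored on (profit, CO2 emissions); emissions are a cost criterion.
scores = topsis([[10, 1], [5, 5], [1, 10]], [0.5, 0.5], [True, False])
```

FUCA, by contrast, simply ranks each alternative on every criterion and sums the ranks, avoiding TOPSIS's normalization step entirely.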

    Modeling, optimization, and sensitivity analysis of a continuous multi-segment crystallizer for production of active pharmaceutical ingredients

    We have investigated the simulation-based, steady-state optimization of a new type of crystallizer for the production of pharmaceuticals. The multi-segment, multi-addition plug-flow crystallizer (MSMA-PFC) offers better control over supersaturation in one dimension than a batch or stirred-tank crystallizer. Using a population balance framework, we have written the governing population and mass balance equations on the crystallizer segments and solved them by either the method of moments or the finite volume method. The goal was to optimize the performance of the crystallizer with respect to quantities such as maximizing the mean crystal size, minimizing the coefficient of variation, or minimizing the sum of squared errors when attempting to hit a target distribution. These optimizations are all highly nonconvex, necessitating the use of a genetic algorithm. Our results for the optimization of a process for crystallizing flufenamic acid showed improvement in crystal size over prior literature results. Through a novel simultaneous design and control (SDC) methodology, we have further optimized the flowrates and crystallizer geometry in tandem. We have also investigated the robustness of this process and observe significant sensitivity to error in antisolvent flowrate, as well as in the kinetic parameters of crystallization. Lastly, we performed a parametric study on the use of the MSMA-PFC for in-situ dissolution of fine crystals back into solution. Fine crystals are a known processing difficulty in drug manufacture, motivating the development of a process that can eliminate them efficiently. Prior results for cooling crystallization indicated this to be possible.
However, our results show little to no dissolution after optimizing the crystallizer, indicating that the negative impact of adding pure solvent to the process (reduced concentration via dilution and decreased residence time) outweighs the benefit of dissolving fines. The prior cooling-crystallization results did not possess this coupling between flowrate, residence time, and concentration, which made fines dissolution significantly more beneficial for that process. We conclude that the success observed in hitting the target distribution has more to do with using multiple segments and finer control over supersaturation than with the ability to go below solubility. When nucleation is excessive, it still overwhelms the capacity of the MSMA-PFC for in-situ fines dissolution.
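    The method of moments reduces the population balance equation to a small ODE system for the leading moments of the crystal size distribution. A simplified single-segment sketch with constant nucleation rate B and growth rate G (the actual model couples both to supersaturation through the mass balance):

```python
def moments(B, G, t_end, n_steps=1000, n_moments=4):
    """Forward-Euler integration of the moment equations
        dm0/dt = B,    dmj/dt = j * G * m[j-1]   (j >= 1)
    for size-independent growth G and constant nucleation rate B."""
    dt = t_end / n_steps
    m = [0.0] * n_moments
    for _ in range(n_steps):
        dm = [B] + [j * G * m[j - 1] for j in range(1, n_moments)]
        m = [mj + dt * dmj for mj, dmj in zip(m, dm)]
    return m

m = moments(B=1e6, G=1e-8, t_end=1.0)
mean_size = m[1] / m[0]  # number-mean crystal size, ~ G * t / 2 in this toy case
```

The optimizer then manipulates quantities derived from these moments (mean size, coefficient of variation) as objectives, segment by segment.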

    Supplier selection under disaster uncertainty with joint procurement

    Master of Science, Department of Industrial & Manufacturing Systems Engineering, Jessica L. Heier Stamm. Health care organizations must have enough supplies and equipment on hand to respond adequately to events such as terrorist attacks, infectious disease outbreaks, and natural disasters. This is achieved through a robust supply chain system. Nationwide, states are assessing their current supply chains to identify gaps that may present issues during disaster preparedness and response. During an assessment of the Kansas health care supply chain, a number of vulnerabilities were identified, one of which was supplier consolidation. Through mergers and acquisitions, the number of suppliers within the health care field has been decreasing over the years. This can pose problems during disaster response, when there is a surge in demand and multiple organizations rely on the same suppliers for equipment and supplies. This thesis explores the potential for joint procurement agreements to encourage supplier diversity by splitting purchasing among multiple suppliers. In joint procurement, two or more customers combine their purchases into one large order so that they can receive quantity discounts from a supplier. This research makes three contributions to supplier selection under disaster uncertainty. The first is a scenario-based supplier selection model under uncertainty with joint procurement; this optimization model can be used to observe customer purchasing decisions in various scenarios while considering the probability of disaster occurrence. Second, the model is applied to a set of experiments to analyze the results when supplier diversity is increased and when joint procurement is introduced.
This leads to the third and final contribution: a set of recommendations for health care decision makers on ways to increase supplier diversity and decrease the risk of disruption associated with disasters.
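    The joint-procurement mechanism can be made concrete with a toy all-units quantity discount and a scenario-weighted expected cost; the numbers and cost structure below are illustrative, not the thesis model:

```python
def purchase_cost(qty, tiers):
    """All-units discount: pay the unit price of the highest tier reached.
    tiers = [(min_qty, unit_price), ...] sorted by ascending min_qty."""
    price = tiers[0][1]
    for min_q, p in tiers:
        if qty >= min_q:
            price = p
    return qty * price

def expected_cost(qty, tiers, scenarios, penalty):
    """Scenario-weighted cost; each scenario is (probability, fill_fraction)
    and undelivered units incur a per-unit shortage penalty."""
    return sum(prob * (purchase_cost(qty * f, tiers) + penalty * qty * (1 - f))
               for prob, f in scenarios)

tiers = [(0, 10.0), (100, 8.0)]          # unit price drops at 100 units
separate = purchase_cost(60, tiers) * 2  # two buyers ordering alone
joint = purchase_cost(120, tiers)        # one combined order reaches the discount
```

Combining the two orders reaches the discount tier that neither buyer reaches alone, which is the incentive the thesis exploits to spread purchases across more suppliers.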

    A robust R&D project portfolio optimization model for pharmaceutical contract research organizations

    Pharmaceutical drug research and development (R&D) outsourcing to contract research organizations (CROs) has grown significantly in recent decades, and the trend is expected to continue. A key question for CROs and firms in similar environments is which projects should be included in the firm's portfolio. As a distinctive contribution to the literature, this paper develops and evaluates a business support tool to help a CRO decide on clinical R&D project opportunities and revise its portfolio of R&D projects given existing constraints and financial and resource capabilities. A new mathematical programming model, in the form of a capital budgeting problem, is developed to help revise and reschedule the project portfolio. The uncertainty of pharmaceutical R&D cost estimates in the drug development stages is captured to give a more realistic representation of pharmaceutical R&D projects, and a robust optimization approach is used to tackle the uncertain formulation. An illustrative example demonstrates the proposed approach.
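    The robust treatment of cost uncertainty can be sketched in miniature with box uncertainty: charge each selected project its worst-case cost (nominal plus maximum deviation) and keep the highest-value portfolio that still fits the budget. The paper's model is a full capital-budgeting formulation with rescheduling; this brute-force toy only shows the robustness idea:

```python
from itertools import combinations

def robust_portfolio(projects, budget):
    """projects: name -> (nominal_cost, max_deviation, value).
    Enumerate all portfolios; feasibility uses worst-case (box) costs."""
    best, best_value = set(), 0
    names = list(projects)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            worst_cost = sum(projects[n][0] + projects[n][1] for n in combo)
            value = sum(projects[n][2] for n in combo)
            if worst_cost <= budget and value > best_value:
                best, best_value = set(combo), value
    return best, best_value

projects = {"A": (4, 1, 10), "B": (3, 2, 8), "C": (5, 0, 7)}
chosen, value = robust_portfolio(projects, budget=10)  # {"A", "B"}, value 18
```

A portfolio feasible at worst-case costs stays feasible for every cost realization inside the uncertainty box, which is the protection robust optimization buys at the price of some nominal value.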

    An optimization framework to combine operable space maximization with design of experiments.

    The introduction of Quality by Design in the pharmaceutical industry stimulates practitioners to better understand the relationships among materials, processes and products. One way to achieve this is through targeted experimentation. In this study, an optimization framework for designing experiments that effectively leverage parameterized process models is presented; it maximizes the space covered in the output variables while also obtaining an orthogonal bracketing study in the process input factors. The framework considers both multi-objective and bilevel optimization methods for relating the two maximization objectives. Results are presented for two case studies (a spray coating process and a continuously stirred reactor cascade), demonstrating the ability to generate and identify efficient designs with fit-for-purpose trade-offs between bracketed orthogonality in the input factors and volume explored in the process output space. The proposed approach allows a more complete understanding of the process to emerge from a small set of experiments.
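    The orthogonality objective has a simple concrete reading: in a coded ±1 design, the largest absolute pairwise correlation between factor columns is 0 for a fully orthogonal bracketing design. A small sketch of that metric (a hypothetical helper, not the paper's formulation):

```python
def column_correlation(design, j, k):
    """Pearson correlation between factor columns j and k of a design matrix."""
    n = len(design)
    xj = [row[j] for row in design]
    xk = [row[k] for row in design]
    mj, mk = sum(xj) / n, sum(xk) / n
    cov = sum((a - mj) * (b - mk) for a, b in zip(xj, xk))
    var = sum((a - mj) ** 2 for a in xj) * sum((b - mk) ** 2 for b in xk)
    return cov / var ** 0.5

def max_abs_correlation(design):
    """0.0 for a fully orthogonal design; approaches 1.0 as columns alias."""
    p = len(design[0])
    return max(abs(column_correlation(design, j, k))
               for j in range(p) for k in range(j + 1, p))

full_factorial = [[-1, -1], [-1, 1], [1, -1], [1, 1]]  # orthogonal: metric 0.0
```

The framework trades a metric of this kind against the volume spanned by the model-predicted outputs, rather than optimizing either alone.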