    Economic analysis of royalactin production under uncertainty: Evaluating the effect of parameter optimization.

    Royalactin is a protein with several potential uses in humans. Research in insects and in mammalian cells has shown that it can accelerate cell division and prevent apoptosis, acting through the epidermal growth factor receptor, which is present in humans. Potential applications in humans include lowering blood cholesterol levels and eliciting effects similar to those seen in bees, e.g., increased lifespan. Mass production of royalactin has not been accomplished, though a recent article presented a Pichia pastoris fermentation with recovery by aqueous two-phase systems at laboratory scale as a possible basis for production. Economic modelling is a useful tool with which to compare possible outcomes for the production of such a molecule and, in particular, to locate areas where additional research is needed and optimization may be required. This study uses the BioSolve software to perform an economic analysis of the scale-up of the putative process for royalactin. The key parameters affecting the cost of production were identified via a sensitivity analysis and then evaluated by Monte Carlo analysis. Results show that if titer is not optimized, the strategy to maintain a low cost of goods is process-oriented; after optimization of this parameter, the strategy becomes product-oriented and the target output becomes the critical parameter determining the cost of goods. This study provides a framework for evaluating strategies for future production of royalactin by analyzing the factors that influence its cost of manufacture. © 2015 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 2015
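
    To make the modelling approach concrete, the sketch below shows the kind of Monte Carlo evaluation of cost of goods described above. It does not reproduce the BioSolve model: the cost structure, parameter ranges and distributions are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def cost_of_goods(titre_g_per_l, target_output_kg, dsp_yield,
                  batch_volume_l=1_000, fixed_cost_per_batch=50_000.0,
                  variable_cost_per_l=20.0):
    """Illustrative cost-of-goods model (USD/g): fixed and volume-driven
    batch costs spread over the product recovered across all batches."""
    product_per_batch_g = titre_g_per_l * batch_volume_l * dsp_yield
    n_batches = np.ceil(target_output_kg * 1_000 / product_per_batch_g)
    total_cost = n_batches * (fixed_cost_per_batch
                              + variable_cost_per_l * batch_volume_l)
    return total_cost / (target_output_kg * 1_000)

# Sample the uncertain inputs and observe the resulting COG distribution.
titres = rng.triangular(0.1, 0.5, 1.5, size=10_000)    # g/L: low, mode, high
yields = rng.uniform(0.5, 0.85, size=10_000)           # overall DSP yield
cog = [cost_of_goods(t, 1.0, y) for t, y in zip(titres, yields)]
print(f"median COG {np.median(cog):.0f} USD/g; "
      f"90% interval {np.percentile(cog, 5):.0f}-{np.percentile(cog, 95):.0f} USD/g")
```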

    Integrated economic and experimental framework for screening of primary recovery technologies for high cell density CHO cultures

    Increases in mammalian cell culture titres and densities have placed significant demands on primary recovery performance. This article presents a methodology for rapidly screening and evaluating primary recovery technologies for their scope for technically feasible and cost-effective operation in the context of high cell density mammalian cell cultures. It was applied to assess the performance of current (centrifugation and depth filtration) and alternative (tangential flow filtration (TFF)) primary recovery strategies. Cell culture test materials (CCTM) were generated to simulate the most demanding cell culture conditions as a screening challenge for the technologies. The performance of the technology options was assessed using lab-scale and ultra scale-down (USD) mimics requiring 25–110 mL volumes for the centrifugation, depth filtration and TFF screening experiments. A centrifugation and depth filtration combination, as well as both of the alternative technologies, met the performance selection criteria. A detailed process economics evaluation was carried out at three scales of manufacturing (2,000 L, 10,000 L, 20,000 L), where the alternative options were shown to offer a potentially more cost-effective primary recovery process in the future. This assessment process and the study results can aid technology selection to identify the most effective option for a specific scenario.
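
    Scale translation between USD centrifugation mimics and manufacturing-scale equipment is conventionally based on Sigma-theory equivalence; the sketch below illustrates that calculation. The correction factors and Sigma values are assumed for illustration and are not taken from the study.

```python
def equivalent_flow_l_h(q_usd_ml_min, sigma_usd_m2, sigma_plant_m2,
                        c_usd=1.0, c_plant=0.4):
    """Sigma-theory scale translation for centrifugal clarification: hold
    Q/(c*Sigma) constant between the ultra scale-down device and the plant
    disc-stack; c discounts non-ideal flow (shear, discharge) at each scale."""
    q_m3_s = q_usd_ml_min * 1e-6 / 60                  # mL/min -> m3/s
    settling_velocity = q_m3_s / (c_usd * sigma_usd_m2)
    return settling_velocity * c_plant * sigma_plant_m2 * 3.6e6   # m3/s -> L/h

# Illustrative values: a 0.5 mL/min USD run (Sigma ~ 0.02 m2) translated to a
# disc-stack centrifuge with Sigma ~ 10,000 m2.
print(f"{equivalent_flow_l_h(0.5, 0.02, 1e4):.0f} L/h")
```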

    Economic analysis of Uricase production under uncertainty: Contrast of chromatographic purification and aqueous two-phase extraction (with and without PEG recycle)

    Uricase is the enzyme responsible for the breakdown of uric acid, the key molecule leading to gout in humans, into allantoin; the enzyme is absent in humans. It has been produced as a PEGylated pharmaceutical in which purification is performed through three sequential chromatographic columns. More recently, an aqueous two-phase system (ATPS) was reported that could recover uricase with high yield and purity. Although the use of ATPS can decrease cost and time, it also generates a large amount of waste, so the ability to recycle key components of the ATPS is of interest. Economic modelling is a powerful tool that allows the bioprocess engineer to compare possible outcomes and find areas where further research or optimization might be required, without recourse to extensive experiments and time. This research provides an economic analysis, using the commercial software BioSolve, of two strategies for uricase production, chromatographic and ATPS, and includes a third bioprocess that utilises material recycling. The parameters that affect the process most were identified via a sensitivity analysis and evaluated with a Monte Carlo analysis. Results show that ATPS is far less expensive than chromatography, although there is a region where the costs of production of the two bioprocesses overlap; furthermore, recycling does not impact the cost of production. This study provides a framework for the economic analysis of uricase production using alternative techniques. This article is protected by copyright. All rights reserved.
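
    The overlap between the two cost distributions can be quantified directly from Monte Carlo samples, as in the sketch below. The lognormal shapes and cost parameters are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative COG distributions (USD/g) for the two routes; the lognormal
# shapes and parameters are assumptions, not values from the study.
cog_chrom = rng.lognormal(mean=np.log(300.0), sigma=0.25, size=20_000)
cog_atps = rng.lognormal(mean=np.log(120.0), sigma=0.40, size=20_000)

# Probability that a randomly drawn ATPS scenario costs more than a randomly
# drawn chromatography scenario -- one simple measure of the overlap region.
overlap = np.mean(cog_atps > cog_chrom)
print(f"P(ATPS COG > chromatography COG) = {overlap:.1%}")
```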

    Optimisation-based Framework for Resin Selection Strategies in Biopharmaceutical Purification Process Development

    This work addresses rapid resin selection for integrated chromatographic separations conducted as part of a high-throughput screening (HTS) exercise during the early stages of purification process development. An optimisation-based decision support framework is proposed to process the data generated from microscale experiments in order to identify the best resins to maximise key performance metrics for a biopharmaceutical manufacturing process, such as yield and purity. A multiobjective mixed integer nonlinear programming (MINLP) model is developed and solved using the ε-constraint method, with Dinkelbach's algorithm used to solve the resulting mixed integer linear fractional programming (MILFP) model. The proposed framework is successfully applied to an industrial case study of a process to purify a recombinant Fc fusion protein from low molecular weight and high molecular weight product-related impurities, involving two chromatographic steps with 8 and 3 candidate resins, respectively. The computational results show the advantage of the proposed framework in terms of computational efficiency and flexibility. This article is protected by copyright. All rights reserved.
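
    Dinkelbach's algorithm reduces a fractional objective to a sequence of parametric subproblems. The sketch below shows the idea on a toy enumerable candidate set; in the paper the inner problem is a MILP solved within the ε-constraint loop, and the yield/cost data here are randomly generated placeholders.

```python
import itertools
import random

def dinkelbach(candidates, numer, denom, tol=1e-9, max_iter=100):
    """Dinkelbach's algorithm: maximise numer(x)/denom(x) over a finite set by
    repeatedly solving the parametric problem max numer(x) - lam*denom(x) and
    updating lam with the ratio at the incumbent, until the parametric
    optimum reaches zero."""
    lam = 0.0
    best = None
    for _ in range(max_iter):
        best = max(candidates, key=lambda x: numer(x) - lam * denom(x))
        if abs(numer(best) - lam * denom(best)) < tol:
            break
        lam = numer(best) / denom(best)
    return best, lam

# Toy resin-pair selection: maximise yield/cost over 8 x 3 combinations.
random.seed(0)
pairs = list(itertools.product(range(8), range(3)))
yields = {p: random.uniform(0.5, 0.95) for p in pairs}
costs = {p: random.uniform(1.0, 3.0) for p in pairs}
pair, ratio = dinkelbach(pairs, yields.get, costs.get)
print(pair, round(ratio, 3))
```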

    Effects of bed compression on protein separation on gel filtration chromatography at bench and pilot scale

    BACKGROUND: Poorly packed chromatography columns are known to reduce column efficiency drastically and to produce broader peaks. Controlled bed compression has been suggested as a useful approach for solving this problem. Here the relationship between column efficiency and the resolution of protein separation is examined when preparative chromatography media are compressed using mechanical and hydrodynamic methods. Sepharose CL-6B, an agarose-based size exclusion medium, was examined at bench and pilot scale. The asymmetry and height equivalent to a theoretical plate (HETP) were determined using 2% v/v acetone, whereas the void volume and intraparticle porosity (εp) were estimated using blue dextran. A protein mixture of ovalbumin (chicken), bovine serum albumin (BSA) and γ-globulin (bovine), with molecular weights of 44, 67 and 158 kDa, respectively, was used as a 'model' separation challenge. RESULTS: Mechanical compression achieved a reduction in plate height with a concomitant improvement in asymmetry. The theoretical plate height decreased significantly with mechanical compression, resulting in a 40% improvement in purity compared with uncompressed columns at the most extreme compression conditions used. CONCLUSION: The results suggest that mechanical bed compression of Sepharose CL-6B can be used to improve the resolution of protein separation.
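
    The efficiency metrics used here are standard: plate count from the half-height width of the acetone pulse and asymmetry from the widths at 10% of peak height. Below is a minimal sketch, assuming a discretised pulse trace; the synthetic peak is illustrative only.

```python
import numpy as np

def column_efficiency(t, c, bed_height_cm):
    """Plate count from the half-height width, N = 5.54*(tR/w_half)^2, with
    HETP = L/N, and asymmetry As = b/a from the widths at 10% peak height."""
    i = int(np.argmax(c))
    t_r, c_max = t[i], c[i]
    rising_t, rising_c = t[:i + 1], c[:i + 1]          # monotonic up
    falling_t, falling_c = t[i:][::-1], c[i:][::-1]    # reversed: monotonic up
    w_half = (np.interp(c_max / 2, falling_c, falling_t)
              - np.interp(c_max / 2, rising_c, rising_t))
    n_plates = 5.54 * (t_r / w_half) ** 2
    a = t_r - np.interp(0.1 * c_max, rising_c, rising_t)
    b = np.interp(0.1 * c_max, falling_c, falling_t) - t_r
    return bed_height_cm / n_plates, b / a

# Synthetic, slightly tailing acetone pulse as a usage example.
t = np.linspace(0.0, 40.0, 2001)
c = (np.exp(-0.5 * ((t - 20.0) / 1.5) ** 2)
     + 0.15 * np.exp(-0.5 * ((t - 23.0) / 2.5) ** 2))
hetp_cm, asym = column_efficiency(t, c, bed_height_cm=30.0)
print(f"HETP = {hetp_cm:.3f} cm, As = {asym:.2f}")
```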

    A Framework for Assessing the Solutions in Chromatographic Process Design and Operation for Large Scale Manufacture

    Chromatographic separation of biopharmaceuticals is complex, and tools for predicting performance and the trade-offs necessary for efficient operation are limited and time-consuming. This complexity is due to the large number of possible column aspect ratios that satisfy process and economic needs. This paper demonstrates a framework for the design and analysis of chromatographic steps. The functionalities are illustrated by application to a Protein A separation in which the effects of column diameter, bed length and linear flow rate on cost of goods (COG/g) and productivity (g/h) are investigated so as to identify the optimal operating strategy. Results are presented as a series of ‘windows of operation’ to address key design and operating decisions. The tool allows the designer to customise limiting constraints based on product- and process-specific knowledge. Results indicate the significant impact of column over-sizing on COG/g and how this can be balanced by increased levels of productivity.
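
    A window of operation can be generated by scanning candidate geometries and keeping those that satisfy both the COG/g and productivity constraints. The sketch below illustrates the mechanics with a deliberately simplified cost model; all parameter values are assumptions, not figures from the paper.

```python
import numpy as np

def window_of_operation(diameters_cm, bed_heights_cm, mass_g=5_000.0,
                        capacity_g_per_l=30.0, linear_vel_cm_h=300.0,
                        resin_cost_per_l=10_000.0, resin_lifetime_batches=50,
                        max_cog=5.0, min_prod_g_h=100.0):
    """Scan candidate geometries, compute a simplified resin-driven COG/g and
    a productivity estimate, and keep the feasible 'window'."""
    window = []
    for d in diameters_cm:
        for h in bed_heights_cm:
            vol_l = np.pi * (d / 2.0) ** 2 * h / 1_000.0
            cycles = np.ceil(mass_g / (capacity_g_per_l * vol_l))
            cycle_h = 5.0 * h / linear_vel_cm_h   # load+wash+elute+regen, assumed
            cog = resin_cost_per_l * vol_l / (mass_g * resin_lifetime_batches)
            prod = mass_g / (cycles * cycle_h)
            if cog <= max_cog and prod >= min_prod_g_h:
                window.append((d, h, round(cog, 2), round(prod, 0)))
    return window

for row in window_of_operation([30, 60, 100], [10, 20, 30]):
    print(row)   # (diameter cm, bed height cm, COG/g, g/h)
```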

    The Simplex Algorithm for the Rapid Identification of Operating Conditions During Early Bioprocess Development: Case Studies in FAb' Precipitation and Multimodal Chromatography

    This study describes a data-driven algorithm as a rapid alternative to conventional Design of Experiments (DoE) approaches for identifying feasible operating conditions during early bioprocess development. In general, DoE methods involve fitting regression models to experimental data, but if model fitness is inadequate then further experimentation is required to gain more confidence in the location of an optimum. This can be undesirable during very early process development, when feedstock is in limited supply, especially if a significant percentage of the tested conditions are ultimately found to be sub-optimal. An alternative approach involves focusing solely upon the feasible regions by using the knowledge gained from each condition to direct the choice of subsequent test locations that lead towards an optimum. To illustrate the principle, this study describes the application of the Simplex algorithm, which uses accumulated knowledge from previous test points to direct the choice of successive conditions towards better regions. The method is illustrated by two case studies: a two-variable precipitation example investigating how salt concentration and pH affect FAb' recovery from E. coli homogenate, and a three-variable chromatography example identifying the optimal pH and concentrations of two salts in an elution buffer used to recover ovine antibody bound to a multimodal cation exchange matrix. Two-level and face-centered central composite regression models were constructed for each study, and statistical analysis showed that they provided a poor fit to the data, necessitating additional experimentation to confirm the robust regions of the search space. By comparison, the Simplex algorithm identified a good operating point using 50% and 70% fewer conditions for the precipitation and chromatography studies, respectively. Hence, data-driven approaches have significant potential for early process development when material supply is at a premium.
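
    The core simplex move is a reflection of the worst-performing vertex through the centroid of the remaining ones. The sketch below shows a reflection-only variant against a mock response surface; the quadratic 'recovery' function stands in for real experiments and is not data from the study.

```python
import numpy as np

def simplex_search(respond, start_pts, n_steps=15):
    """Basic sequential-simplex search (reflection only): after measuring the
    response at each vertex, reflect the worst vertex through the centroid of
    the others to choose the next experimental condition."""
    pts = [np.asarray(p, dtype=float) for p in start_pts]
    scores = [respond(p) for p in pts]
    for _ in range(n_steps):
        worst = int(np.argmin(scores))
        centroid = np.mean([p for i, p in enumerate(pts) if i != worst], axis=0)
        candidate = centroid + (centroid - pts[worst])   # reflection
        s = respond(candidate)
        if s > scores[worst]:
            pts[worst], scores[worst] = candidate, s
        else:
            break   # simple stop rule; practical variants shrink instead
    best = int(np.argmax(scores))
    return pts[best], scores[best]

# Mock 'experiment': FAb' recovery as a function of (salt M, pH), peak at (0.8, 6).
recovery = lambda x: 100 - 40 * (x[0] - 0.8) ** 2 - 5 * (x[1] - 6.0) ** 2
cond, best = simplex_search(recovery, [(0.2, 4.5), (0.4, 4.5), (0.3, 5.5)])
print(np.round(cond, 2), round(best, 1))
```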

    Predicting performance of constant flow depth filtration using constant pressure filtration data

    This paper describes a method of predicting constant flow filtration capacities using constant pressure datasets collected during the purification of several monoclonal antibodies by depth filtration. The method required characterisation of the fouling mechanism occurring during constant pressure filtration by evaluating the best fit of each of the classic and combined theoretical fouling models. The optimised coefficients of the various models were correlated with the corresponding capacities achieved during constant flow operation at the same pressures as those applied during constant pressure operation for each centrate. Of the classic and combined fouling models investigated, the Cake-Adsorption model was found to best describe the fouling mechanisms observed for each centrate at the various pressures investigated. A linear regression model was generated with these coefficients and was shown to accurately predict the capacities during constant flow operation at each pressure. This model was subsequently validated using an additional centrate and accurately predicted the constant flow capacities at three different pressures (0.69, 1.03 and 1.38 bar). The model used the optimised Cake-Adsorption model coefficients that best described the flux decline during constant pressure operation. The proposed method of predicting depth filtration performance proved faster than the traditional approach whilst requiring significantly less material, making it particularly attractive for early process development activities.
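
    The fitting step can be illustrated with a classic single-mechanism model, as below; the paper's combined Cake-Adsorption model would take the place of the model function. All data here are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def cake_t_over_v(v, k_c, q0):
    """Classic constant-pressure cake filtration in linearised t/V form:
    t/V = (k_c/2)*V + 1/q0. The paper's combined Cake-Adsorption model
    would replace this single-mechanism form."""
    return (k_c / 2.0) * v + 1.0 / q0

# Synthetic constant-pressure run: cumulative volume V (L/m2) vs. time t (h).
v = np.linspace(5.0, 100.0, 20)
t = cake_t_over_v(v, k_c=2e-3, q0=60.0) * v
t += np.random.default_rng(3).normal(0.0, 0.02, t.size)   # measurement noise

(k_c, q0), _ = curve_fit(cake_t_over_v, v, t / v, p0=[1e-3, 50.0])
print(f"fitted k_c = {k_c:.2e}, initial flux q0 = {q0:.1f}")
# Coefficients fitted at several pressures would then feed the linear
# regression against the constant-flow capacities measured at those pressures.
```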

    Integration of host strain bioengineering and bioprocess development using ultra-scale down studies to select the optimum combination: An antibody fragment primary recovery case study.

    An ultra scale-down primary recovery sequence was established for a platform E. coli Fab production process and used to evaluate the process robustness of various bioengineered strains. Centrifugal discharge in the initial dewatering stage was determined to be the major cause of cell breakage. The ability of cells to resist breakage was dependent on a combination of factors including host strain, vector and fermentation strategy. Periplasmic extraction studies were conducted in shake flasks, and it was demonstrated that key performance parameters such as Fab titre and nucleic acid concentrations were mimicked. The shake flask system also captured the particle aggregation effects seen in a large scale stirred vessel, reproducing the fine particle size distribution that impacts the final centrifugal clarification stage. Scale-down primary recovery sequences can thus be used to screen a larger number of engineered strains, leading to closer integration with, and better feedback between, strain development, fermentation development and primary recovery studies. Biotechnol. Bioeng. © 2014 Wiley Periodicals, Inc.
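
    A common USD measure of centrifugal cell breakage is the fractional release of an intracellular marker relative to a fully lysed control; the helper below is a hypothetical illustration of that calculation, not the study's exact protocol.

```python
def fractional_release(pre_stage, post_stage, full_lysate):
    """Hypothetical breakage proxy: fraction of an intracellular marker (e.g.
    nucleic acid, g/L) released into the supernatant by a recovery stage,
    normalised to a fully lysed (sonicated/homogenised) control."""
    return (post_stage - pre_stage) / (full_lysate - pre_stage)

# Illustrative values: supernatant nucleic acid before and after centrifugal
# discharge, against a fully lysed control of the same culture.
print(f"{fractional_release(0.05, 0.38, 1.20):.0%} breakage")
```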