
    Managing transfer and scale-up of a process with atypical impact of dissolved oxygen concentration on productivity and product quality

    Dissolved oxygen (DO) is a routinely measured and controlled process parameter in mammalian cell cultures for monoclonal antibody production in stirred tank bioreactors. For typical Chinese hamster ovary (CHO) cell lines, DO is controlled around a specific set-point, but growth, productivity, and product quality are relatively independent of DO over a range that is wide relative to controller capability. Thus, DO control is primarily used to ensure that sufficient oxygen is provided to the cells to support their metabolism during growth and antibody production. Such processes can be transferred from one facility or scale to another with limited concern for detailed analysis of potential DO gradients within the bioreactor or differences in probe handling and pressure compensation methods. This paper describes challenges associated with the impact of DO on productivity and product quality in a low-density CHO fed-batch process executed at 15 mL, 2 L, 12 kL, and 20 kL bioreactor scales. The work was initially motivated by unexpectedly low productivity at the 20 kL scale. Due to gradients within the 20 kL bioreactor and differences in pressure compensation strategies, the actual DO concentration during the run was up to 175% of the concentration at the 2 L process development scale. Subsequent experiments at the 15 mL and 2 L scales showed an inverse correlation between titer and DO set-point over the range of 10% to 60% air saturation. For the second run at the 20 kL scale, the set-point was lowered and the pressure compensation methods were adjusted, resulting in a significantly higher titer. The lower effective DO concentration was also applied at a second manufacturing facility, where a higher titer was again achieved. While product quality was acceptable for the large-scale runs with lower DO, process characterization studies demonstrated that DO set-point was correlated with the charge heterogeneity profile (Figure 1). The DO range favorable for higher productivity was also associated with a higher likelihood of a charge heterogeneity profile outside the target performance range. This presentation describes how statistical models generated from process characterization data, together with considerations of bioreactor configuration, mixing, and gassing strategies, can be applied to develop a manufacturing process that simultaneously delivers acceptable product quality and meets productivity requirements.
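    As a rough illustration of the pressure compensation issue described above, the sketch below (with hypothetical numbers, not values from the study) shows how a "% air saturation" set-point measured by a probe calibrated at atmospheric pressure can translate into a higher effective dissolved-oxygen concentration at large scale, where headspace overpressure and hydrostatic head raise the absolute pressure at the probe. The function name, solubility constant, and pressure values are all illustrative assumptions.

    ```python
    # Hypothetical illustration (not the paper's numbers): how local pressure can
    # shift the effective dissolved-oxygen concentration implied by a "% air
    # saturation" set-point when the probe is calibrated at atmospheric pressure.

    P_CAL_BAR = 1.013        # absolute pressure during probe calibration (assumed 1 atm)
    C_SAT_UMOL_L = 210.0     # approx. O2 solubility at 100% air saturation, 37 C, 1 atm

    def effective_do_umol_per_l(setpoint_pct, headspace_gauge_bar, liquid_head_bar):
        """Effective DO concentration if the % reading is not compensated for the
        higher absolute pressure at the probe location (headspace + hydrostatic head)."""
        p_local = P_CAL_BAR + headspace_gauge_bar + liquid_head_bar
        return (setpoint_pct / 100.0) * C_SAT_UMOL_L * (p_local / P_CAL_BAR)

    # Small-scale reference: 2 L bioreactor, negligible overpressure and head.
    small = effective_do_umol_per_l(40, headspace_gauge_bar=0.0, liquid_head_bar=0.0)

    # Large-scale case: hypothetical 0.5 bar headspace overpressure and ~0.3 bar
    # of hydrostatic head above a bottom-mounted probe.
    large = effective_do_umol_per_l(40, headspace_gauge_bar=0.5, liquid_head_bar=0.3)

    print(f"2 L effective DO:   {small:.0f} umol/L")
    print(f"20 kL effective DO: {large:.0f} umol/L ({100 * large / small:.0f}% of small scale)")
    ```

    Under these assumed pressures the same nominal set-point corresponds to roughly 1.8 times the small-scale concentration, which is the kind of discrepancy that motivates adjusting set-points and compensation strategy during transfer.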

    More accurate process understanding from process characterization studies using Monte Carlo simulation, regularized regression, and classification models

    Establishment of an appropriate control strategy with defined operating ranges (OR) predicted to meet a target product profile is a critical component of commercializing new biologics under the Quality by Design (QbD) approach. Process characterization (PC) studies are performed to expand process understanding by achieving two main goals: 1) determining which process parameters have significant effects on quality attributes and 2) establishing models describing the relationships between these critical process parameters (CPP) and critical quality attributes (CQA). Risk assessment and design of experiments (DOE) techniques are effectively deployed in the industry to identify parameters to study and build process understanding. However, the true value of the data produced by these studies can be compromised by inherent flaws in traditional data analysis techniques. In particular, p-value based methods such as stepwise regression are prone to generating false positives and overestimated parameter coefficients. Many of the deficiencies of traditional stepwise regression can be alleviated by applying Monte Carlo cross-validation (MCCV) and simulations to stepwise algorithms. These methods can greatly enhance process understanding and assist in the selection of CPPs. Regularized regression methods such as LASSO, ridge, and elastic net are also designed to overcome many of the issues inherent in techniques based on ordinary least squares. A superior strategy, however, is to build multiple models using a variety of techniques and use the insights gained from each to establish the relationships between CPPs and CQAs. Use of complementary methods during data analysis allows more informed decisions to be made during model construction.
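    To make the analysis approach concrete, the sketch below combines Monte Carlo cross-validation (repeated random train/test splits) with a regularized elastic net fit. It is a minimal illustration, not the authors' code: the parameter names, data, and split settings are hypothetical placeholders, and the selection-frequency summary is just one common way to judge how stable a parameter's effect is across resamples.

    ```python
    # Minimal sketch: Monte Carlo cross-validation with regularized regression
    # for screening process parameters. Data and column names are hypothetical.
    import numpy as np
    from sklearn.model_selection import ShuffleSplit
    from sklearn.linear_model import ElasticNetCV
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    params = ["DO_setpoint", "pH", "temperature", "feed_rate", "seeding_density"]
    X = rng.normal(size=(40, len(params)))                               # stand-in DOE settings
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=40)   # stand-in CQA response

    X = StandardScaler().fit_transform(X)

    # Monte Carlo cross-validation: many random splits; refit the regularized
    # model each time and count how often each parameter keeps a non-zero effect.
    selected = np.zeros(len(params))
    splitter = ShuffleSplit(n_splits=100, test_size=0.25, random_state=0)
    for train, test in splitter.split(X):
        model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X[train], y[train])
        selected += np.abs(model.coef_) > 1e-6

    for name, count in zip(params, selected):
        print(f"{name}: selected in {count:.0f}/100 resamples")
    ```

    Parameters retained in most resamples are stronger CPP candidates, while parameters selected only occasionally are more likely to be the false positives that single-pass stepwise regression tends to report.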

    Nanoparticles for Applications in Cellular Imaging

    In the following review, we discuss several types of nanoparticles (such as TiO2, quantum dots, and gold nanoparticles) and their impact on the ability to image biological components in fixed cells. The review also discusses factors influencing nanoparticle imaging and uptake in live cells in vitro. Due to their unique size-dependent properties, nanoparticles offer numerous advantages over traditional dyes and proteins. For example, the photostability, narrow emission peak, and ability to rationally modify both the size and surface chemistry of quantum dots allow for simultaneous analyses of multiple targets within the same cell. On the other hand, the surface characteristics of nanometer-sized TiO2 allow efficient conjugation to nucleic acids, which enables their retention in specific subcellular compartments. We discuss cellular uptake mechanisms for the internalization of nanoparticles and studies showing how nanoparticle size, charge, and the targeted cell type influence uptake. The predominant nanoparticle uptake mechanisms include clathrin-dependent mechanisms, macropinocytosis, and phagocytosis.