    Antisense mediated dystrophin reading frame restoration

    Exon skipping using antisense oligonucleotides (AONs) has successfully been used to reframe the mRNA in various DMD (Duchenne muscular dystrophy) patients carrying deletions and in the mdx mouse model. This study can be divided into two parts: in the first part we tested the feasibility of the exon skipping approach for patients with small mutations in in-frame exons, while in the second part we address a quantitative comparison of techniques for revealing exon skipping. We first identified 55 novel disease-causing point mutations. We selected 5 patients with nonsense or frameshifting mutations in exons 10, 16, 26, 33 and 34. Wild type and mutation specific 2'OMePS AONs were tested in cell-free splicing assays and in cultured cells derived from the selected patients. The results obtained confirm the cell-free splicing assay as an alternative system to test exon skipping propensity when patients' cells are unavailable. In myogenic cells, similar levels of exon skipping were observed for wild type and mutation specific AONs for exons 16, 26 and 33, while for exon 10 and exon 34 the efficiency of the AONs was significantly different. Interestingly, in some cases the skipping efficiencies for mutated exons were quite dissimilar to those previously reported for the respective wild type exons. This behaviour may be related to the effect of the mutations on exon skipping propensity and highlights the complexity of identifying optimal AONs for skipping exons with small mutations. In the second part we compared different techniques to reveal the exon skipping levels in the muscles of 7 different mdx mice. An absolute quantification of the dystrophin transcript amount was possible using a digital array. The results underline the low expression of the dystrophin gene and the transcript amount needed to correctly quantify the exon skipping percentage.
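
    As a rough illustration of the kind of absolute quantification a digital array enables, the sketch below applies the standard Poisson correction to hypothetical partition counts (assuming separate assays detect the skipped and unskipped transcripts) and computes a skipping percentage; the numbers are illustrative and not taken from the study.

        import math

        def digital_array_copies(positive_partitions, total_partitions):
            # Poisson-corrected estimate of template copies loaded onto the panel
            p = positive_partitions / total_partitions
            return -math.log(1.0 - p) * total_partitions

        def skipping_percentage(skipped, unskipped):
            # Fraction of dystrophin transcripts carrying the skipped exon junction
            return 100.0 * skipped / (skipped + unskipped)

        # Hypothetical counts from one mdx muscle sample
        skipped = digital_array_copies(positive_partitions=46, total_partitions=765)
        unskipped = digital_array_copies(positive_partitions=310, total_partitions=765)
        print(f"exon skipping: {skipping_percentage(skipped, unskipped):.1f}%")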

    High-throughput and miniaturized resin reuse studies

    The major process limitations of current antibody purification processes are posed by affinity chromatography. Protein A-based chromatography can account for more than 70% of downstream processing costs due to resin throughput, cost and complexity of scale up. A standard industrial practice to minimize resin cost has been to recycle its use over an extended period, aiming for >200 cycles. Results from a recent survey suggest that the number of Protein A cycles currently used in the industry is much lower, typically between 50 and 100 cycles (Rathore et al., 2015). Impurities that have not been removed may cause carryover from one cycle to the next, reducing the lifetime use of the resin, and therefore effective cleaning of the resin becomes an important factor in reuse lifetime. However, the screening of a large number of cleaning conditions may not be practical or economically feasible at laboratory scale due to the large amount of feedstream and resources required for each experiment (10-100 mL). To overcome this issue, techniques that can generate data with minimal resource expenditure by mimicking large scale processes can be invaluable. Typically, automating such studies at microscale poses great challenges due to the non-geometric scalability of the microscale columns and the lack of online UV monitoring. In this presentation we will show how automated high throughput resin reuse studies performed on microscale columns can preempt potential issues with cleaning, feedstream, yield and chromatographic profile prior to scale up, whilst requiring minimal human resources and material (20-2000 μL). Data from screening 50 different cleaning solutions, singly and in combination, on 600 μL microscale columns whilst using only 1.5% of laboratory scale feed material will be presented. Furthermore, a comparison to the performance of bench scale columns (4.7 mL) will be discussed. Although automated high-throughput and miniaturized chromatographic process development relying on microscale columns is widespread, we believe this to be the first report of successful miniaturization of resin cleaning and reuse studies. Rathore, A., Pathak, M., Ma, G., Bracewell, D. (2015) Re-use of protein A resin: fouling and economics. BioPharm Int. 3; 2.
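
    As a schematic of how such a reuse screen can be organised, the sketch below (hypothetical yields, not the presented data) tracks step yield across repeated load/elute/clean cycles for each candidate cleaning solution and flags candidates that decay past an assumed acceptance limit.

        # Assumed acceptance criterion: step yield must stay at or above 90% on every cycle
        YIELD_LIMIT = 0.90

        def acceptable(cycle_yields, limit=YIELD_LIMIT):
            return all(y >= limit for y in cycle_yields)

        # Hypothetical yields over four reuse cycles for three cleaning candidates
        screen = {
            "0.1 M NaOH":            [0.98, 0.97, 0.96, 0.95],
            "0.1 M NaOH + 1 M NaCl": [0.98, 0.98, 0.97, 0.97],
            "6 M guanidine HCl":     [0.97, 0.93, 0.88, 0.84],
        }

        for cleaning, yields in screen.items():
            print(f"{cleaning:25s} {'pass' if acceptable(yields) else 'fail'}")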

    Automated high-throughput and miniaturized semi-continuous chromatography

    The major process limitations of current antibody purification processes are posed by affinity chromatography, although purification platforms based on affinity chromatography are very effective. Typically, protein A-based chromatography can account for more than 70% of downstream processing costs due to resin throughput, cost and complexity of scale up. Thus, there has been increased focus by the industry on developing and implementing continuous chromatography technology to increase resin capacity, reduce buffer consumption and increase the productivity of packed bed steps. At UCB we have publicly presented a novel semi-continuous operation named SCRAM (Sequential Chromatography Recycling with Asynchronous Multiplexing) that can be operated on an unmodified chromatography skid and replicates the functionality and capacity gain of traditional continuous systems without the complexity. However, increasingly, new innovative antibody formats have required significant process platform adaptations prior to manufacture, and therefore the screening of many conditions to find a suitable window of operation may not be economically feasible at laboratory scale due to the amount of feedstream and resources required for each experiment. To overcome this issue, techniques that can generate data with minimal resource expenditure can be invaluable in early bioprocess development. Automated microscale platforms accelerate bioprocess development through the flexibility for parallel experimentation and automation while requiring microscale quantities of material. In an industry first, we will demonstrate the application of SCRAM using 600 μL microscale columns on an automated robotic platform run in parallel to explore large experimental design spaces with minimal resource expenditure. This has allowed critical bioprocess information to be obtained earlier in development, providing a better opportunity to understand process parameters and the robustness of this application. Therefore, this approach can be a viable and valuable alternative route for identifying sweet spots during screening studies in bioprocess development. Within the sector, automated high-throughput and miniaturised chromatographic process development relying on microscale columns is widespread; however, we believe this to be the first report of successful miniaturization of semi-continuous chromatography using microscale columns.
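
    The capacity gain referred to above is the usual one for multi-column or recycling schemes: breakthrough from the loading column is captured rather than lost, so each column can be loaded beyond its 10% breakthrough dynamic binding capacity. The toy comparison below uses assumed capacities purely to illustrate that arithmetic; it is not SCRAM data.

        # Illustrative capacities only (assumed values, not measured SCRAM results)
        static_capacity = 60.0   # g mAb per L resin
        dbc_10pct       = 40.0   # g/L loadable in batch mode at 10% breakthrough
        recycled_load   = 55.0   # g/L loadable when breakthrough is captured downstream

        gain = (recycled_load - dbc_10pct) / dbc_10pct
        print(f"resin utilisation: batch {dbc_10pct / static_capacity:.0%}, "
              f"recycled {recycled_load / static_capacity:.0%} (+{gain:.0%} capacity)")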

    The simplex algorithm in an automated high-throughput approach for the rapid screening of operating conditions during process understanding and development

    The screening of a large number of conditions to find a suitable window of operation may not be economically feasible at laboratory scale due to the large amount of feedstream and resources required for each experiment (10-100 mL). To overcome this issue, techniques that can generate data with minimal resource expenditure can be invaluable in early bioprocess development. Microscale platforms accelerate bioprocess development through the flexibility for parallel experimentation and automation while requiring microscale quantities of material (20-2000 μL). Critical bioprocess information can be obtained earlier in development, providing a better opportunity to understand process parameters and the robustness of each processing step. This study utilises the Simplex algorithm combined with an automated high throughput buffer preparation technique as an alternative to conventional experimental approaches for identifying viable operating conditions during early bioprocess development. In general, conventional experimental methods involve performing a large number of experiments in a given design space. This can be undesirable if a significant percentage of the conditions tested are unfavourable when feedstock and analytical resources are limited. The Simplex algorithm directs each experimental condition towards feasible regions by using the knowledge gained from each condition to direct the choice of subsequent test locations. This study describes the application of the Simplex algorithm for AEX studies performed in a 96 well membrane plate format. The effect of pH and salt concentration on clearing Host Cell Proteins (HCP) from a partially purified IgG feedstream was investigated. The Simplex algorithm results were compared to conventional screening studies with response surface modelling. These models suggested that additional experimentation was required to confirm the robust regions of the initial design space. By comparison, the Simplex algorithm identified a good operating point for HCP clearance using 70% fewer conditions. Therefore, this approach can be a viable and valuable alternative route for identifying sweet spots during screening studies in bioprocess development.
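
    To make the simplex idea concrete, the sketch below substitutes a mock response function for the actual wet-lab HCP measurement: each function evaluation corresponds to one AEX membrane-plate experiment at a given (pH, NaCl) condition, and a Nelder-Mead simplex proposes the next condition from the results obtained so far. The function form and numbers are hypothetical.

        from scipy.optimize import minimize

        def residual_hcp(x):
            # Hypothetical HCP level (ppm) remaining after AEX at pH x[0], NaCl x[1] M
            ph, salt = x
            return 50.0 + 400.0 * (ph - 7.4) ** 2 + 900.0 * (salt - 0.05) ** 2

        result = minimize(residual_hcp, x0=[6.0, 0.3], method="Nelder-Mead",
                          options={"xatol": 0.05, "fatol": 1.0, "maxfev": 20})
        print(f"suggested operating point: pH {result.x[0]:.2f}, "
              f"{result.x[1] * 1000:.0f} mM NaCl after {result.nfev} conditions")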

    Reactor Design for Continuous Monoclonal Antibody Precipitation Based Upon Micro‐mixing

    BACKGROUND: Precipitation has been applied for the processing of important therapeutics, including monoclonal antibodies (mAbs). Scale-up has proven to be a challenging task due to the complexity of the reactions and transport processes involved, and requires a good understanding of the molecular processes underpinning precipitate formation. The aim of this study was to build a micro-mixing model for the precipitation of a mAb in continuous tubular reactors using ammonium sulphate. The effect of micro-mixing on precipitate formation (with respect to size, strength, and nature) was evaluated. An ultra scale-down (USD) centrifugation methodology was applied to determine the ease of precipitate clarification. RESULTS: The results demonstrated that the final mean particle size decreased with increased micro-mixing and was reached within short residence times. Antibody yields in the tubular reactors were consistently above 90% and were shown to be independent of the mixing. Similar particle sizes between a lab and pilot-scale reactor were correlated with the average energy dissipation rate. The smaller particles obtained from improved micro-mixing had higher fractal dimensions, which correlated with minimal breakage upon exposure to turbulent shear. Precipitates were easily clarified at the USD scale (>95% clarification), but less so at pilot scale (<80% clarification). CONCLUSION: Precipitation is a rapid process where the final precipitate properties are controlled by the flow conditions. Therefore, the process can be manipulated to acquire a certain particle size range. A high-throughput precipitation process is also possible. However, further investigation into large-scale precipitate recovery is required. © 2020 Society of Chemical Industry.
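
    The scale-matching criterion mentioned above (similar particle size at similar average energy dissipation rate) can be sketched with standard pipe-flow correlations. The snippet below estimates the mean energy dissipation rate per unit mass in a tubular reactor from an assumed flow rate, diameter and length; the dimensions and property values are illustrative, not those of the study.

        import math

        def mean_energy_dissipation(flow_m3s, diameter_m, length_m, rho=1000.0, mu=1e-3):
            # Average turbulent energy dissipation rate (W/kg) estimated from the pressure drop
            area = math.pi * diameter_m ** 2 / 4
            velocity = flow_m3s / area
            reynolds = rho * velocity * diameter_m / mu
            f = 0.316 * reynolds ** -0.25                     # Blasius friction factor (turbulent flow)
            dp = f * (length_m / diameter_m) * rho * velocity ** 2 / 2
            volume = area * length_m
            return dp * flow_m3s / (rho * volume)

        # Illustrative lab- and pilot-scale tubular reactors
        print(f"lab   : {mean_energy_dissipation(5e-6, 0.002, 2.0):.1f} W/kg")
        print(f"pilot : {mean_energy_dissipation(2e-4, 0.008, 8.0):.1f} W/kg")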

    Negative Binomial mixed models estimated with the maximum likelihood method can be used for longitudinal RNAseq data

    Time-course RNAseq experiments, where tissues are repeatedly collected from the same subjects, e.g. humans or animals, over time or under several different experimental conditions, are becoming more popular due to decreasing sequencing costs. Such designs offer great potential to identify genes that change over time or progress differently in time across experimental groups. Modelling of the longitudinal gene expression in such time-course RNAseq data is complicated by the serial correlations, missing values due to subject dropout or sequencing errors, long follow up with potentially non-linear progression in time and low numbers of subjects. Negative Binomial mixed models can address all these issues. However, such models under the maximum likelihood (ML) approach are less popular for RNAseq data due to convergence issues (see, e.g. [1]). We argue in this paper that it is the use of an inaccurate numerical integration method, in combination with the typically small sample sizes, which causes such mixed models to fail for a great portion of tested genes. We show that when we use the accurate adaptive Gaussian quadrature approach to approximate the integrals over the random-effects terms, we can successfully estimate the model parameters with the maximum likelihood method. Moreover, we show that the bootstrap method can be used to preserve the type I error rate in small sample settings. We evaluate empirically the small sample properties of the test statistics and compare with state-of-the-art approaches. The method is applied to a longitudinal mouse experiment to study the dynamics in Duchenne Muscular Dystrophy. Contact: [email protected]. Tsonaka is an assistant professor at Medical Statistics, Department of Biomedical Data Sciences, Leiden University Medical Center. Her research focuses on statistical methods for longitudinal omics data. Pietro Spitali is an assistant professor at the Department of Human Genetics, Leiden University Medical Center. His research focuses on the identification of biomarkers for neuromuscular disorders. Development and application of statistical models for medical scientific research.
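
    The integration issue discussed above can be made concrete with a small sketch: the marginal likelihood of a negative binomial model with a subject-level random intercept involves an integral over the random effect, which Gauss-Hermite quadrature approximates by a weighted sum over a grid of nodes (the adaptive variant used in the paper additionally recentres and rescales the nodes around each subject's posterior mode). The data and parameter values below are simulated placeholders.

        import numpy as np
        from scipy.special import gammaln

        def nb_logpmf(y, mu, r):
            # Negative binomial log-pmf with mean mu and dispersion (size) r
            return (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
                    + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))

        def marginal_loglik(y_i, x_i, beta, sigma_b, r, n_nodes=15):
            # Integrate the random intercept out with (non-adaptive) Gauss-Hermite quadrature
            nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
            lik = 0.0
            for b, w in zip(np.sqrt(2.0) * sigma_b * nodes, weights / np.sqrt(np.pi)):
                mu = np.exp(x_i @ beta + b)
                lik += w * np.exp(nb_logpmf(y_i, mu, r).sum())
            return np.log(lik)

        # One subject with four repeated counts and a time covariate
        y = np.array([3, 5, 9, 14])
        X = np.column_stack([np.ones(4), np.arange(4)])
        print(marginal_loglik(y, X, beta=np.array([1.0, 0.5]), sigma_b=0.4, r=2.0))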

    Much-efficient and cost-effective manufacturing of antibody biotherapeutics employing integrated negative chromatography technology

    New approaches for fully connected and integrated downstream processes to reduce costs and improve efficiency are being assessed through the implementation of the NCAP Project (negative chromatography antibody purification). This project aims to resolve the manufacturing bottleneck facing modern antibody bio-therapeutics by exploring the great potential of negative chromatography technology, i.e. purifying antibodies by binding all the surrounding impurities instead of binding the target antibodies. High-throughput, miniaturised technologies have been implemented to enable the screening of multiple novel ligands based on a custom agarose backbone. The objectives are to: (1) replace the conventional expensive and fragile protein-A affinity chromatography medium with inexpensive and more robust small-ligand-based media; (2) investigate novel downstream processes incorporating as many negative chromatography steps as possible to achieve highly efficient and capacity-unlimited manufacturing of biopharmaceuticals; (3) integrate and intensify upstream and downstream processing by pushing the boundaries of the negative chromatography technology. This process is independent of the upstream expression level and should bring enormous potential cost benefits, providing a platform for truly continuous and integrated manufacturing processes, reducing hold times, enabling faster throughput and reducing the cost of raw materials.

    Penalized regression calibration: a method for the prediction of survival outcomes using complex longitudinal and high-dimensional data

    Get PDF
    Longitudinal and high-dimensional measurements have become increasingly common in biomedical research. However, methods to predict survival outcomes using covariates that are both longitudinal and high-dimensional are currently missing. In this article, we propose penalized regression calibration (PRC), a method that can be employed to predict survival in such situations. PRC comprises three modeling steps: First, the trajectories described by the longitudinal predictors are flexibly modeled through the specification of multivariate mixed effects models. Second, subject-specific summaries of the longitudinal trajectories are derived from the fitted mixed models. Third, the time to event outcome is predicted using the subject-specific summaries as covariates in a penalized Cox model. To ensure a proper internal validation of the fitted PRC models, we furthermore develop a cluster bootstrap optimism correction procedure (CBOCP) that corrects for the optimistic bias of apparent measures of predictiveness. PRC and the CBOCP are implemented in the R package pencal, available from CRAN. After studying the behavior of PRC via simulations, we conclude by illustrating an application of PRC to data from an observational study of patients affected by Duchenne muscular dystrophy, where the goal is to predict time to loss of ambulation using longitudinal blood biomarkers. Comment: The article is now published in Statistics in Medicine (with Open Access).
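
    As a schematic of the three PRC steps (the reference implementation is the R package pencal; here a per-subject least-squares trend stands in for the multivariate mixed model of step one, purely to show the data flow, and the function and column names are hypothetical):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        def subject_summaries(long_df, biomarkers):
            # Steps 1-2: per-subject intercept and slope for each longitudinal biomarker
            rows = {}
            for sid, grp in long_df.groupby("id"):
                feats = {}
                for m in biomarkers:
                    slope, intercept = np.polyfit(grp["time"], grp[m], deg=1)
                    feats[f"{m}_intercept"], feats[f"{m}_slope"] = intercept, slope
                rows[sid] = feats
            return pd.DataFrame.from_dict(rows, orient="index")

        def fit_prc(long_df, surv_df, biomarkers, penalizer=0.5):
            # Step 3: penalized (ridge) Cox model on the subject-level summaries
            X = subject_summaries(long_df, biomarkers)
            data = X.join(surv_df.set_index("id"))
            cph = CoxPHFitter(penalizer=penalizer)
            return cph.fit(data, duration_col="time_to_event", event_col="event")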