44 research outputs found

    Editorial: The European Symposium on Biochemical Engineering Sciences, Dublin 2016


    Accelerated bioprocess characterization by data enrichment in scale-down models

    Scale-down models are often used to define operating ranges in process development and validation, so data exploitation is an important task. Experimental scientists usually work to very tight timelines and consequently may only perform the required routine of cleaning and sorting the data, without more sophisticated analyses that might improve data quality. The problem is exacerbated by the rise of fully automated, miniaturized high-throughput equipment in up- and downstream process development, where automation of data processing is not optional but a must. However, cleaned, sorted and time-aligned data alone do not guarantee a sufficient representation of the process. Further data enrichment to extract relevant process parameters is often required but omitted due to time constraints. The benefit of data enrichment is true process understanding [1]: scalable rates and yields, minima, maxima, medians or derivatives calculated from measured online or offline signals, together with categorization by clone, lot, feeding strategy, seed train fitness, feed or media, can readily explain variation in the current process. Multivariate regression methods such as partial least squares regression (PLS-R) or hybrid models [2] can then be used to quantify and rank the contribution of critical process parameters (CPPs) such as pH, pO2, temperature, metabolite concentrations or metabolic rates towards particular critical quality attributes (CQAs), for instance glycoform distribution, other product quality attributes or cell growth. Knowledge of these parameters can feed directly into the generation of a statistically verified design space, which may then be used for process scale-up and validation [3,4]. In this contribution we present a methodology to automate data enrichment. The suggested concept is exemplified on upstream micro bioreactor data and comprises one of the necessary steps to characterize and ultimately qualify scale-down process models. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie SkƂodowska-Curie grant agreement No 643056.
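    To make the enrichment-plus-regression workflow above concrete, here is a minimal Python sketch assuming per-run time series of typical upstream signals. The signal names (vcd, ph, do2, lactate), the enrich() helper and the synthetic data are illustrative placeholders, not the authors' implementation.

```python
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

def make_run() -> pd.DataFrame:
    """Synthetic stand-in for one micro bioreactor run's online/offline signals."""
    t = np.arange(0.0, 96.0, 4.0)  # hours
    return pd.DataFrame({
        "time_h": t,
        "vcd": 0.5 * np.exp(rng.uniform(0.02, 0.05) * t),      # viable cell density
        "ph": 7.0 + rng.normal(0, 0.05, t.size),
        "do2": 60 + rng.normal(0, 5, t.size),                  # dissolved oxygen, %
        "lactate": np.minimum(0.05 * t, 3.0) + rng.normal(0, 0.1, t.size),
    })

def enrich(df: pd.DataFrame) -> pd.Series:
    """Reduce one run's time series to scalar, scale-independent features."""
    dt = df["time_h"].diff()
    mu = np.log(df["vcd"]).diff() / dt  # specific growth rate from d(ln VCD)/dt
    return pd.Series({
        "mu_max": mu.max(),                   # maximum specific growth rate
        "ph_median": df["ph"].median(),       # median culture pH
        "do2_min": df["do2"].min(),           # minimum dissolved oxygen
        "lactate_peak": df["lactate"].max(),  # peak metabolite level
    })

# One enriched feature row per run, then PLS regression of a CQA
# (e.g. a glycoform fraction) on the candidate CPPs.
runs = [make_run() for _ in range(20)]
features = pd.DataFrame([enrich(r) for r in runs])
cqa = 0.6 * features["mu_max"] + rng.normal(0, 0.001, len(features))  # synthetic CQA

pls = PLSRegression(n_components=2).fit(features.values, cqa.values)
ranking = pd.Series(np.abs(pls.coef_).ravel(), index=features.columns)
print(ranking.sort_values(ascending=False))  # ranked CPP contributions to the CQA
```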

    Establishing a robust two-step cloning strategy for the generation of cell lines with a high probability of monoclonality

    A regulatory requirement for the production of therapeutic proteins from mammalian cells is that the production cell line is clonal, that is, derived from a single progenitor cell. It is therefore standard procedure to include at least one cloning step during the development of a recombinant cell line for therapeutic protein production. Numerous techniques can be employed for cloning cell lines, but regardless of the method used there should be appropriate evidence that it is fit for purpose; this point is highlighted by the increasing interest from regulatory bodies in the cloning method used and the probability of monoclonality (P(monoclonality)) achieved during cell line development (CLD). FUJIFILM Diosynth Biotechnologies has thoroughly considered the cloning approach used during CLD: we discuss a two-step cloning strategy that combines the ClonePixℱ, as a cloning and screening tool, with a second cloning step using the industrially accepted method of limiting dilution cloning. A collaboration with statisticians led to a method for estimating the resulting P(monoclonality) of cell lines generated using the ClonePixℱ, and experimental data were generated to support this statistical method, ensuring that the ClonePixℱ cloning step is robust. We highlight the challenges of using the ClonePixℱ for a single round of cloning and the advantages of combining it with a second cloning step, and we demonstrate how the two-step cloning strategy achieves a minimum P(monoclonality) of 99.78% and typically a P(monoclonality) of 99.9%.
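    The abstract does not disclose the statistical method itself, so the short Python sketch below only illustrates two textbook ingredients such an estimate can build on: the Poisson argument for limiting dilution cloning, and the combination of two independent cloning rounds. The seeding density and the assumed per-round probability are illustrative placeholders, not FUJIFILM's figures.

```python
from math import exp

def p_single_cell(lam: float) -> float:
    """P(well contains exactly 1 cell | it shows growth), Poisson seeding at lam cells/well."""
    return lam * exp(-lam) / (1.0 - exp(-lam))

def combined_p_monoclonal(p1: float, p2: float) -> float:
    """P(monoclonal) after two rounds: the line is non-clonal only if both
    rounds fail, assuming the two rounds fail independently."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

p_round1 = 0.95                 # assumed P(monoclonality) of the ClonePix round
p_round2 = p_single_cell(0.3)   # limiting dilution at 0.3 cells/well
print(f"round 2 alone:     {p_round2:.4f}")
print(f"two-step strategy: {combined_p_monoclonal(p_round1, p_round2):.4f}")
```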

    Process intensification education contributes to sustainable development goals: Part 2

    Achieving the United Nations sustainable development goals requires industry and society to develop tools and processes that work at all scales, enabling the delivery of goods, services and technology to large conglomerates and remote regions alike. Process intensification (PI) is a technological advance that promises to deliver the means to reach these goals, but higher education has yet to fully embrace the programme. Here we present practical examples of how to better teach the principles of PI in the context of Bloom's taxonomy, and we summarise the current industrial use of PI and the future demands for it, as a continuation of the topics discussed in Part 1. In the appendices, we provide details on the existing PI courses around the world, as well as teaching activities showcased during these courses to aid students' lifelong learning. The increasing number of successful commercial cases of PI highlights the importance of PI education for both students in academia and industrial staff. We acknowledge the sponsors of the Lorentz workshop on "Educating in PI": the MESA+ Institute of the University of Twente, Sonics and Materials (USA) and the PIN-NL Dutch Process Intensification Network. DFR acknowledges support by The Netherlands Centre for Multiscale Catalytic Energy Conversion (MCEC), an NWO Gravitation programme funded by the Ministry of Education, Culture and Science of the government of The Netherlands. NA acknowledges the Deutsche Forschungsgemeinschaft (DFG) TRR 63 "Integrierte Chemische Prozesse in flĂŒssigen Mehrphasensystemen" (Teilprojekt A10) - 56091768. The participation of Robert Weber in the workshop and this report was supported by Laboratory Directed Research and Development funding at Pacific Northwest National Laboratory (PNNL). PNNL is a multiprogram national laboratory operated for the US Department of Energy by Battelle under contract DE-AC05-76RL0183.

    Process intensification education contributes to sustainable development goals: Part 1

    In 2015 all the United Nations (UN) member states adopted 17 sustainable development goals (UN-SDG) as part of the 2030 Agenda, a 15-year plan to meet ambitious targets to eradicate poverty, protect the environment and improve the quality of life around the world. Although the global community has made progress, the pace of implementation must accelerate to meet the UN-SDG timeline. For this to happen, professionals, institutions, companies, governments and the general public must become cognizant of the challenges that our world faces and the potential technological solutions at hand, including those provided by chemical engineering. Process intensification (PI) is a recent engineering approach with demonstrated potential to significantly improve process efficiency and safety while reducing cost, and it offers opportunities for attaining the UN-SDG goals in a cost-effective and timely manner. However, the pedagogical tools to educate undergraduate and graduate students, as well as professionals active in the field of PI, lack clarity and focus. This paper sets out the state of the art, the main discussion points and guidelines for enhanced PI teaching, deliberated by experts in PI with either an academic or industrial background, together with representatives from government and specialists in pedagogy, who gathered at the Lorentz Center (Leiden, The Netherlands) in June 2019 with the aim of uniting efforts on education in PI and producing guidelines. In this Part 1, we discuss the societal and industrial needs for an educational strategy in the framework of PI. The terminology and background information on PI, related to educational implementation in industry and academia, are provided as a preamble to Part 2, which presents practical examples to help educate on process intensification.

    Sustainability in chemical engineering curriculum


    Normalisation of Multicondition cDNA Macroarray Data

    Background. Normalisation is a critical step in obtaining meaningful information from high-dimensional DNA array data. This is particularly important when complex biological hypotheses and questions, such as functional analysis and regulatory interactions within biological systems, are investigated. A nonparametric, intensity-dependent normalisation method based on the global identification of a self-consistent set (SCS) of genes is proposed here for such systems. Results. The SCS normalisation is introduced and its behaviour demonstrated for a range of user-defined parameters affecting its performance. It is compared to a standard global normalisation method in terms of noise reduction and signal retention. Conclusions. The SCS normalisation results, using 16 macroarray data sets from a Bacillus subtilis experiment, confirm that the method is capable of reducing undesirable experimental variation whilst retaining important biological information. The ease and speed of implementation mean that the method can be easily adapted to other multicondition time/strain series single-colour array data.
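    As a minimal sketch of the idea (not the paper's algorithm, which is intensity-dependent rather than a single global offset per array), the Python fragment below alternates between estimating per-condition offsets from a candidate self-consistent gene set and re-selecting the genes whose normalised profiles vary least across conditions. The keep fraction and iteration count are arbitrary placeholders.

```python
import numpy as np

def scs_normalise(data: np.ndarray, keep: float = 0.5, n_iter: int = 10) -> np.ndarray:
    """Align conditions using an iteratively refined self-consistent gene set.

    data: genes x conditions matrix of log2 intensities.
    """
    norm = data.astype(float).copy()
    consistent = np.arange(data.shape[0])  # start from all genes
    for _ in range(n_iter):
        # Per-condition offsets estimated on the current consistent set only.
        offsets = np.median(data[consistent], axis=0)
        norm = data - offsets
        # Re-select the genes whose profiles vary least across conditions;
        # these behave self-consistently and anchor the next iteration.
        spread = norm.std(axis=1)
        n_keep = max(1, int(keep * data.shape[0]))
        consistent = np.argsort(spread)[:n_keep]
    return norm

# Toy usage: 200 genes x 16 conditions with condition-specific biases.
rng = np.random.default_rng(1)
true = rng.normal(8.0, 1.0, size=(200, 1)) + rng.normal(0, 0.2, size=(200, 16))
observed = true + rng.uniform(-1.0, 1.0, size=(1, 16))    # per-array bias
print(np.round(scs_normalise(observed).mean(axis=0), 2))  # columns roughly equal after normalisation
```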