
    Scalable quantum tomography in a photonic chip

    We formulate a method of quantum tomography that scales linearly with the number of photons and involves only one optical transformation. We demonstrate it experimentally for two-photon entangled states using a special photonic chip.

    Optimisation-based Framework for Resin Selection Strategies in Biopharmaceutical Purification Process Development

    This work addresses rapid resin selection for integrated chromatographic separations conducted as part of a high-throughput screening (HTS) exercise during the early stages of purification process development. An optimisation-based decision-support framework is proposed to process the data generated from microscale experiments in order to identify the resins that maximise key performance metrics for a biopharmaceutical manufacturing process, such as yield and purity. A multiobjective mixed integer nonlinear programming (MINLP) model is developed and solved using the ε-constraint method, and Dinkelbach's algorithm is used to solve the resulting mixed integer linear fractional programming (MILFP) model. The proposed framework is successfully applied to an industrial case study of a process to purify a recombinant Fc fusion protein from low molecular weight and high molecular weight product-related impurities, involving two chromatographic steps with 8 and 3 candidate resins, respectively. The computational results show the advantage of the proposed framework in terms of computational efficiency and flexibility.
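    For readers unfamiliar with Dinkelbach's algorithm, the sketch below shows the standard parametric iteration for maximising a fractional objective N(x)/D(x). It is a minimal illustration only: `solve_parametric` stands in for the paper's mixed integer solver and is a hypothetical callable, not an interface from the paper.

```python
# Minimal sketch of Dinkelbach's parametric algorithm for maximising a
# fractional objective N(x)/D(x), the standard route to solving MILFP
# problems. `solve_parametric` is a hypothetical stand-in for a solver
# that maximises N(x) - lam * D(x) over the feasible set.
def dinkelbach(N, D, solve_parametric, x0, tol=1e-6, max_iter=100):
    x = x0
    lam = N(x) / D(x)                # requires D(x) > 0 on the feasible set
    for _ in range(max_iter):
        x = solve_parametric(lam)    # x = argmax N(x) - lam * D(x)
        gap = N(x) - lam * D(x)
        if abs(gap) < tol:           # the gap is exactly 0 at the optimum
            return x, lam
        lam = N(x) / D(x)            # update the ratio estimate
    return x, lam
```

    Each iteration replaces the fractional problem with a linear parametric one, so an off-the-shelf MILP solver can be reused inside the loop.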

    Predicting performance of constant flow depth filtration using constant pressure filtration data

    This paper describes a method of predicting constant flow filtration capacities using constant pressure datasets collected during the purification of several monoclonal antibodies through depth filtration. The method required characterising the fouling mechanism occurring in constant pressure filtration processes by evaluating the fit of each of the classic and combined theoretical fouling models. The optimised coefficients of the various models were correlated with the corresponding capacities achieved during constant flow operation at the specific pressures used during constant pressure operation for each centrate. Of the classic and combined fouling models investigated, the Cake-Adsorption fouling model best described the fouling mechanisms observed for each centrate at the various pressures investigated. A linear regression model was generated with these coefficients and was shown to accurately predict the capacities during constant flow operation at each pressure. This model was subsequently validated using an additional centrate and accurately predicted the constant flow capacities at three different pressures (0.69, 1.03, and 1.38 bar). The model used the optimised Cake-Adsorption model coefficients that best described the flux decline during constant pressure operation. The proposed method of predicting depth filtration performance proved faster than the traditional approach whilst requiring significantly less material, making it particularly attractive for early process development activities.
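    To make the two-step procedure concrete, here is a minimal sketch: fit a fouling model to one constant-pressure run, then regress constant-flow capacity on the fitted coefficient. The classical cake-filtration form is used as a stand-in for the paper's combined Cake-Adsorption expression, and all numbers are invented for illustration.

```python
# Sketch of the two-step idea: (1) fit a fouling model to constant-
# pressure flux-decline data, (2) regress constant-flow capacity on the
# fitted coefficient. The classical cake-filtration form
# t = (Kc/2) V^2 + V/Q0 is a stand-in; the paper's combined
# Cake-Adsorption model has a different expression.
import numpy as np
from scipy.optimize import curve_fit

def cake_model(V, Kc, Q0):
    """Cumulative filtration time t as a function of filtrate volume V."""
    return (Kc / 2.0) * V**2 + V / Q0

# One constant-pressure run (hypothetical units and values)
V_data = np.linspace(0.0, 50.0, 25)
t_data = cake_model(V_data, 0.004, 2.0) + np.random.normal(0.0, 0.05, 25)

(Kc_fit, Q0_fit), _ = curve_fit(cake_model, V_data, t_data, p0=[1e-3, 1.0])

# Across several centrates, pair each fitted coefficient with the
# measured constant-flow capacity and fit a line for prediction.
coeffs = np.array([Kc_fit, 0.006, 0.009])    # illustrative coefficients
capacities = np.array([120.0, 95.0, 70.0])   # illustrative capacities, L/m^2
slope, intercept = np.polyfit(coeffs, capacities, 1)
predicted = slope * 0.005 + intercept        # capacity for a new centrate
```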

    Tunable entangled photon states from a nonlinear directional coupler

    Integrated optical platforms enable the realization of complex quantum photonic circuits for a variety of applications including quantum simulations, computations, and communications. The development of on-chip integrated photon sources, providing photon quantum states with on-demand tunability, is currently an important research area. A flexible approach for on-chip generation of entangled photons is based on spontaneous nonlinear frequency conversion, with possibilities to integrate several photon-pair sources [1] and realize subsequent post-processing using thermo-optically or electro-optically controlled interference [2, 3]. However, deterministic post-processing can only provide a limited set of output states, whereas quantum gates with probabilistic operation are needed to generate arbitrary two-photon states [4].
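    As background, a lossless two-mode directional coupler implements a beamsplitter-type transformation between the waveguide mode operators. The generic textbook form is shown below purely for orientation; it is not the nonlinear device Hamiltonian of the paper.

```latex
% Textbook mode transformation for a lossless two-mode directional
% coupler of length L and coupling constant C (generic illustration,
% not the specific nonlinear device of the paper):
\[
\begin{pmatrix} a_1^{\mathrm{out}} \\ a_2^{\mathrm{out}} \end{pmatrix}
=
\begin{pmatrix} \cos(CL) & i\sin(CL) \\ i\sin(CL) & \cos(CL) \end{pmatrix}
\begin{pmatrix} a_1^{\mathrm{in}} \\ a_2^{\mathrm{in}} \end{pmatrix}
\]
```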

    Performance-based social comparisons in humans and long-tailed macaques

    Social comparisons are a fundamental feature of human thinking and affect self-evaluations and task performance. Little is known about the evolutionary origins of social comparison processes, however. Previous studies that investigated performance-based social comparisons in nonhuman primates yielded mixed results. We report three experiments that aimed (a) to explore how task type may contribute to performance in monkeys, and (b) to examine how a competitive set-up affects monkeys compared to humans. In a co-action touchscreen task, monkeys were neither influenced by nor interested in the performance of the partner. This may indicate that the experimental set-up was not sufficiently relevant to trigger social comparisons. In a novel co-action foraging task, monkeys increased their feeding speed in competitive and co-active conditions, but not in relation to the degree of competition. In an analogue of the foraging task, human participants were affected by partner performance and experimental context, indicating that the task is suitable to elicit social comparisons in humans. Our studies indicate that the specifics of task and experimental setting are relevant to drawing the monkeys’ attention to a co-actor and that, in line with previous research, a competitive element was crucial. We highlight the need to explore what constitutes “relevant” social comparison situations for monkeys, as well as for nonhuman animals in general, and point out factors that we think are crucial in this respect (e.g. task type, physical closeness, and the species’ ecology). We propose that early forms of social comparison evolved in purely competitive environments, with increasing social tolerance and cooperative motivations allowing for more fine-grained processing of social information. Competition-driven effects on task performance might constitute the foundation for the more elaborate social comparison processes found in humans, which may involve context-dependent information processing and metacognitive monitoring.

    Integration of host strain bioengineering and bioprocess development using ultra-scale down studies to select the optimum combination: An antibody fragment primary recovery case study.

    An ultra scale-down primary recovery sequence was established for a platform E. coli Fab production process and used to evaluate the process robustness of various bioengineered strains. Centrifugal discharge in the initial dewatering stage was determined to be the major cause of cell breakage. The ability of cells to resist breakage was dependent on a combination of factors including host strain, vector, and fermentation strategy. Periplasmic extraction studies conducted in shake flasks demonstrated that key performance parameters such as Fab titre and nucleic acid concentrations were mimicked. The shake flask system also captured the particle aggregation effects seen in a large scale stirred vessel, reproducing the fine particle size distribution that impacts the final centrifugal clarification stage. Scale-down primary recovery process sequences can thus be used to screen a larger number of engineered strains, leading to closer integration with, and better feedback between, strain development, fermentation development, and primary recovery studies.

    Quantitative high throughput analytics to support polysaccharide production process development.

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine, and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiments (DoE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays can measure the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume relative to current practices. The assays help build an analytical foundation to support the advent of HTPD technology for polysaccharide vaccines. It is envisaged that this will lead to an expanded use of Quality by Design (QbD) studies in vaccine process development.

    Phase-field modeling of microstructural pattern formation during directional solidification of peritectic alloys without morphological instability

    During the directional solidification of peritectic alloys, two stable solid phases (parent and peritectic) grow competitively into a metastable liquid phase of larger impurity content than either solid phase. When the parent or both solid phases are morphologically unstable, i.e., for a small temperature gradient to growth rate ratio G/v_p, one solid phase usually outgrows and covers the other phase, leading to a cellular-dendritic array structure closely analogous to the one formed during monophase solidification of a dilute binary alloy. In contrast, when G/v_p is large enough for both phases to be morphologically stable, the formation of the microstructure becomes controlled by a subtle interplay between the nucleation and growth of the two solid phases. The structures that have been observed in this regime (in small samples where convection effects are suppressed) include alternate layers (bands) of the parent and peritectic phases perpendicular to the growth direction, which are formed by alternate nucleation and lateral spreading of one phase onto the other as proposed in a recent model [R. Trivedi, Metall. Mater. Trans. A 26, 1 (1995)], as well as partially filled bands (islands), where the peritectic phase does not fully cover the parent phase, which grows continuously. We develop a phase-field model of peritectic solidification that incorporates nucleation processes in order to explore the formation of these structures. Simulations of this model shed light on the morphology transition from islands to bands, the dynamics of spreading of the peritectic phase on the parent phase following nucleation, which turns out to be characterized by a remarkably constant acceleration, and the types of growth morphology that one might expect to observe in large samples under purely diffusive growth conditions.
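    For orientation, a generic phase-field formulation evolves a non-conserved order parameter and a conserved concentration field from a common free-energy functional. The schematic equations below illustrate that structure only; the paper's peritectic model couples several order parameters and adds explicit nucleation events.

```latex
% Generic phase-field evolution equations (schematic form only; not the
% paper's multi-phase peritectic model with nucleation):
\[
\frac{\partial \phi}{\partial t} = -M_{\phi}\,\frac{\delta F}{\delta \phi},
\qquad
\frac{\partial c}{\partial t} = \nabla \cdot \Big( M_{c}\,\nabla \frac{\delta F}{\delta c} \Big),
\qquad
F = \int \Big[ f(\phi, c, T) + \tfrac{\varepsilon^{2}}{2}\,\lvert \nabla \phi \rvert^{2} \Big]\,\mathrm{d}V.
\]
```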

    High throughput process development workflow with advanced decision-support for antibody purification

    Chromatography remains the workhorse in antibody purification; however, process development and characterisation still require significant resources. The high number of operating parameters involved requires extensive experimentation, traditionally performed at small and pilot scale, leading to demands on materials and time that can be challenging. The main objective of this research was the establishment of a novel High Throughput Process Development (HTPD) workflow combining scale-down chromatography experimentation with advanced decision-support techniques in order to minimise the consumption of resources and accelerate the development timeframe. Additionally, the HTPD workflow provides a framework to rapidly manipulate large datasets in an automated fashion. The central component of the HTPD workflow is the systematic integration of a microscale chromatography experimentation strategy with an advanced chromatogram evaluation method, design of experiments (DoE), and multivariate data analysis. The outputs of this are leveraged into the screening and optimisation components of the workflow. For the screening component, a decision-support tool was developed combining different multi-criteria decision-making techniques to enable a fair comparison of a number of cation exchange (CEX) resin candidates and determine those that demonstrate superior purification performance. This provided a rational methodology for screening chromatography resins and process parameters. For the optimisation component, the workflow leverages insights provided through screening experimentation to guide subsequent DoE experiments so as to tune significant process parameters for the selected resin. The resulting empirical correlations are linked to a stochastic modelling technique so as to predict the optimal and most robust chromatographic process parameters to achieve the desired performance criteria.
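    As an illustration of the decision-support component, the sketch below applies the simplest multi-criteria step, min-max normalisation followed by a weighted sum, to rank hypothetical resins. The paper combines several more sophisticated MCDM techniques; all resin names, weights, and scores here are invented.

```python
# Minimal sketch of one common multi-criteria decision-making step:
# min-max normalise each criterion, then rank resins by a weighted sum.
# All criteria here are higher-is-better; lower-is-better criteria
# would be inverted before normalising.
import numpy as np

resins = ["Resin A", "Resin B", "Resin C"]
# columns: yield (%), purity (%), capacity (g/L) -- illustrative data
scores = np.array([[85.0, 97.5, 40.0],
                   [92.0, 95.0, 55.0],
                   [78.0, 99.0, 35.0]])
weights = np.array([0.4, 0.4, 0.2])   # chosen by the decision-maker

norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
overall = norm @ weights
ranking = [resins[i] for i in np.argsort(-overall)]
print(ranking)   # best-to-worst under these weights
```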
