
    Evaluation and development of strategies for pooling in preparative chromatography

    Computer simulation and modelling were used to emulate a real-time environment in preparative chromatography and to evaluate the performance of three different pooling-control strategies at different levels of robustness. The pooling problem was based on a separation case in which three different insulin species were to be separated, while disturbances to the modulators (potassium chloride and ethanol) and to the sample load could be imposed on the system. The simulation was created under the assumption that the only measurement available would be the UV absorbance at the process outlet. The first strategy was a time-based method in which the cut points were determined by offline optimization and then held static. The second strategy also started from offline optimization, which was used to determine the UV absorbance at the optimal cut points; these absorbance levels were then used to place the cuts in the real-time simulation. The last strategy was a predictive method that estimated the concentration profiles in the column from the gathered measurements and subsequently used these estimates to continuously make new optimized pooling decisions in the real-time simulation. The parameters investigated in the evaluation were process yield, purity, and the number of batch failures due to unmet purity requirements. The time-based strategy showed the best performance when only load disturbances were present, and the prediction-based strategy showed the best performance when only disturbances to the modulators were present. The UV-based strategy had a large percentage of batch failures in all disturbance cases and had only moderate success at the highest levels of robustness used in this thesis. The results also indicate that the type of disturbance distribution can influence which strategy performs best: the predictive strategy fared better when Latin hypercube sampling was used for the disturbance distribution, while the time-based strategy performed better under a normally distributed random disturbance.

    This thesis project examined different strategies for controlling the collection of product from chromatographic separation. The performance of these strategies was also compared with that of a newly developed strategy whose control decisions were based on predictions. Developing new pharmaceuticals is a costly process, and only 15-30% of the products developed are approved for launch. This places high demands on the cost efficiency of the rest of the production chain. Because of strict requirements on quality and purity, purification accounts for a large share of the costs in pharmaceutical production. One of the most widely used purification methods is chromatographic separation. Chromatography is used to separate different substances in a solution from one another, and the process can be likened to arranging a race between runners, cyclists and drivers on different courses. If the course is a motorway, all the drivers can be expected to finish first, followed by all the cyclists, with the runners last. If the course is very short, the drivers will not have time to build up much of a lead, and some cars may even finish after some of the runners. If the course is long, however, the separation will be very clear. This is the kind of separation one wants to achieve in a chromatographic process.
    If different substances generally finish at different times, a time interval can be found during which only one type of substance comes out, thereby giving a pure product. The choice of this time interval is called pooling, and what I investigated in my thesis project is how that interval can be controlled to maximize the amount of product obtained without affecting purity negatively. In a chromatographic system, the race between the different substances can only be observed right at the finish line, and usually one can only measure how much crosses the line, not which type it belongs to. This means that if unexpected things happen on the course, so that the competitors do not come out at the expected times, it can be difficult to decide in which time interval the product should be collected. There is thus a need for a good control system to avoid decisions that lead to poor product quality and wasted raw materials. The strategies were tested against different kinds of unexpected events to examine how well these could be handled. It turned out that no single strategy was best in all cases; different strategies handled different kinds of unexpected events with varying results. Knowing how well different strategies can determine the time interval for collecting product from a chromatographic process is valuable for maximizing the profitability of production. In my thesis project I investigated three different strategies for determining this interval. The first strategy looked at a standard finishing scenario and determined the time interval based on it. The second strategy also looked at the standard scenario, but instead of adopting its time interval directly, it noted how much of the substances was measured right at the endpoints of that interval and used these levels to determine new time intervals in the real case. The last strategy used the measurements of the amounts crossing the finish line to estimate how the race was unfolding while it was still in progress, and based on these estimates it actively tried to find the best time interval during the course of the race
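
    As a concrete illustration of the UV-based strategy, the sketch below places cut points where the outlet UV trace crosses pre-computed absorbance levels. This is not the thesis code: the function, the synthetic chromatogram and the threshold values are all hypothetical, and the offline optimization that would produce the levels is assumed to have been done elsewhere.

```python
import numpy as np

def uv_threshold_pooling(times, uv_signal, start_level, stop_level):
    """Place cut points where the outlet UV trace crosses given levels.

    start_level and stop_level are assumed to come from an offline
    optimization of the nominal (undisturbed) chromatogram.
    """
    # First index where the signal rises through the start level.
    start_idx = int(np.argmax(uv_signal >= start_level))
    # Locate the peak after the start cut, then the first index where the
    # falling edge drops below the stop level.
    peak_idx = start_idx + int(np.argmax(uv_signal[start_idx:]))
    falling = uv_signal[peak_idx:] <= stop_level
    stop_idx = peak_idx + (int(np.argmax(falling)) if falling.any()
                           else len(falling) - 1)
    return times[start_idx], times[stop_idx]

# Hypothetical outlet trace: one Gaussian product peak on a small baseline.
t = np.linspace(0.0, 60.0, 601)  # minutes
uv = 1.2 * np.exp(-0.5 * ((t - 30.0) / 4.0) ** 2) + 0.01
cut_on, cut_off = uv_threshold_pooling(t, uv, start_level=0.15, stop_level=0.10)
print(f"pool between {cut_on:.1f} and {cut_off:.1f} min")
```

    As the thesis results indicate, this kind of rule is fragile: a disturbance that shifts peak heights changes where the thresholds are crossed, which is consistent with the high batch-failure rates observed for the UV-based strategy.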

    Integrated starch- and lignocellulose-based biorefineries: Synergies and opportunities

    The transition from a reliance on fossil resources to the use of renewables for the production of energy, fuels and chemicals is essential for ensuring the sustainability of continued human development. Plant-based biomass is a renewable resource which can be transformed into all of these products. However, biomass is a heterogeneous material composed of several fractions with different chemical properties. Furthermore, the composition varies between species. In order to maximize the environmental and economic sustainability of biomass-based production, production systems that utilize all fractions of biomass to their fullest potential have to be developed. This is the goal of a biorefinery.

    The work presented in this thesis mainly revolves around biorefineries that utilize feedstocks rich in starch and lignocellulose together to produce ethanol in an integrated process. The work focuses on comparing the performance of stand-alone and integrated biorefineries by investigating the impact that feedstock blending has on parameters important for the process economy, and on identifying potential synergies from integration and opportunities for improved material utilization. It was found in this work that the integration of starch- and lignocellulose-based feedstocks could result in improved ethanol productivity and yield during hydrolysis and fermentation compared to a stand-alone lignocellulose process, without losing performance compared to a stand-alone starch-based process.

    The prospects of introducing a sequential fractionation of the lignocellulosic biomass prior to integration were investigated. It was shown that this method could be used to produce separate fractions enriched in cellulose and lignin, as well as to improve the hydrolyzability of the cellulose fraction. This kind of fractionation could facilitate the utilization of all biomass fractions in both feedstocks by creating new byproduct streams as well as decreasing negative impacts on existing byproduct streams

    Self-energy correction to the hyperfine structure splitting of hydrogenlike atoms

    A first testing ground for QED in the combined presence of a strong Coulomb field and a strong magnetic field is provided by the precise measurement of the hyperfine structure splitting of hydrogenlike ²⁰⁹Bi. We present a complete calculation of the one-loop self-energy correction to the first-order hyperfine interaction for various nuclear charges. In the low-Z regime we agree almost perfectly with the Zα expansion, but for medium and high Z there is a substantial deviation
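
    As a hedged illustration (our notation, not necessarily the paper's), one-loop self-energy corrections to the hyperfine splitting are commonly parametrized relative to the Fermi energy through a function of Zα whose low-Z limit reproduces the Schwinger term of the electron's anomalous magnetic moment:

```latex
\Delta E_{\mathrm{HFS}} = E_{\mathrm{F}} \left( 1 + \delta_{\mathrm{SE}} + \cdots \right),
\qquad
\delta_{\mathrm{SE}} = \frac{\alpha}{\pi} \, F(Z\alpha),
\qquad
F(Z\alpha) \to \tfrac{1}{2} \quad \text{as } Z\alpha \to 0
```

    In this picture, the substantial deviation at medium and high Z corresponds to F(Zα) departing strongly from the first terms of its Zα expansion.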

    Model risk quantification in option pricing

    This thesis investigates a methodology for the quantification of model risk in option pricing. A set of different pricing models is specified, and each model is assigned a probability weight based on the Akaike Information Criterion. From these probability weights it is then possible to obtain a price distribution for an exotic derivative. Two measures of model risk, inspired by the regulatory standards on prudent valuation, are proposed based on this methodology. The model risk measures are studied for different equity options which are priced using a set of stochastic volatility models, with and without jumps. The models are calibrated to vanilla call options on the S&P 500 index, as well as to synthetic option prices based on market data simulated using the Bates model. For comparable options, the model risk is higher for up-and-out barrier options than for vanilla, digital and Asian options. Moreover, the model risk measure in relative terms of option price increases quickly with strike level for call options far out of the money, while the model risk in absolute terms is lowest when the option is deep out of the money. The model risk for up-and-out barrier options tends to be higher when the barrier is closer to the spot price, although the increase in risk need not be monotonic with decreasing barrier level. The methodology is flexible and easy to implement, yielding intuitive results. However, it is sensitive to different assumptions about the structure of the pricing errors
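
    A minimal sketch of the weighting step described above, under our own assumptions (the model names, AIC values and exotic prices below are made-up placeholders, and the thesis' risk measures inspired by prudent-valuation standards may differ in detail):

```python
import numpy as np

# Hypothetical calibration results: AIC per candidate pricing model.
aic = {"Heston": 1523.4, "Bates": 1519.1, "Heston-jumps": 1525.0}

# Akaike weights: each model's probability weight from its AIC difference.
values = np.array(list(aic.values()))
delta = values - values.min()
raw = np.exp(-0.5 * delta)
weights = raw / raw.sum()

# Hypothetical prices of one exotic derivative under each calibrated model.
prices = np.array([10.42, 10.88, 10.17])

# The weights induce a price distribution; a simple dispersion-style
# model-risk measure is the weighted standard deviation across models.
mean_price = float(np.dot(weights, prices))
model_risk = float(np.sqrt(np.dot(weights, (prices - mean_price) ** 2)))
print(f"weighted price {mean_price:.2f}, model-risk proxy {model_risk:.2f}")
```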

    То що ж їсти? [So what should we eat?]

    Gingivitis and periodontitis are chronic inflammatory diseases that can lead to tooth loss. One of the causes of these diseases is the Gram-negative bacterium Porphyromonas gingivalis. This periodontal pathogen depends on two fimbriae, FimA and Mfa1, for binding to dental biofilm, salivary proteins, and host cells. These fimbriae are each composed of five proteins, but the fimbrial assembly mechanism and the ligands are unknown. Here we reveal the crystal structure of the precursor form of Mfa4, one of the accessory proteins of the Mfa1 fimbria. Mfa4 consists of two β-sandwich domains, and the first part of the structure forms two well-defined β-strands that run over both domains. This N-terminal region is cleaved by gingipains, a family of proteolytic enzymes that encompasses arginine- and lysine-specific proteases. Cleavage of the N-terminal region generates the mature form of the protein. Our structural data allow us to propose that the new N-terminus of the mature protein may function as a donor strand in the polymerization of P. gingivalis fimbriae

    Dual energy X-ray absorptiometry positioning protocols in assessing body composition: A systematic review of the literature

    OBJECTIVES: To systematically identify and assess methods and protocols used to reduce technical and biological errors in published studies that have investigated the reliability of dual energy X-ray absorptiometry (DXA) for assessing body composition. DESIGN: Systematic review. METHODS: Systematic searches of five databases were used to identify studies of DXA reliability. Two independent reviewers used a modified critical appraisal tool to assess their methodological quality. Data were extracted and synthesised using a level-of-evidence approach. Further analysis was then undertaken of the methods used to decrease DXA errors (technical and biological) and so enhance DXA reliability. RESULTS: Twelve studies met the eligibility criteria. Four of the articles were deemed high quality. The quality articles considered biological and technical errors when preparing participants for DXA scanning. The Nana positioning protocol was assessed as having a strong level of evidence. The studies providing this evidence indicated very high test–retest reliability (ICC 0.90–1.00, or less than 1% change in mean) of the Nana positioning protocol. The National Health and Nutrition Examination Survey (NHANES) positioning protocol was deemed to have a moderate level of evidence due to a lack of high quality studies; however, the available studies found that the NHANES positioning protocol also had very high test–retest reliability. Evidence is limited, and reported reliability has varied, in papers where no specific positioning protocol was used or reported. CONCLUSIONS: Given the strong level of evidence of excellent test–retest reliability supporting the Nana positioning protocol, it is recommended as the first choice for clinicians using DXA to assess body composition
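
    The reliability figures quoted above are intraclass correlation coefficients. As a hedged sketch (the reviewed studies may use different ICC forms; ICC(3,1), two-way mixed with consistency, is one common choice for test–retest designs):

```python
import numpy as np

def icc_3_1(scores):
    """Single-measure, two-way mixed, consistency ICC(3,1).

    scores: array of shape (n_subjects, k_sessions), e.g. repeated DXA
    body-composition measurements of the same participants.
    """
    n, k = scores.shape
    grand = scores.mean()
    ss_total = ((scores - grand) ** 2).sum()
    ss_subj = k * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_sess = n * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_err = ss_total - ss_subj - ss_sess
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical test-retest fat-mass readings (kg) for five participants.
scores = np.array([[20.1, 20.2], [25.3, 25.1], [18.9, 19.0],
                   [30.5, 30.4], [22.0, 22.3]])
print(f"ICC(3,1) = {icc_3_1(scores):.3f}")
```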

    Uncertainty-Aware CNNs for Depth Completion: Uncertainty from Beginning to End

    The focus in deep learning research has mostly been on pushing the limits of prediction accuracy. However, this has often been achieved at the cost of increased complexity, raising concerns about the interpretability and reliability of deep networks. Recently, increasing attention has been given to untangling the complexity of deep networks and quantifying their uncertainty for different computer vision tasks. In contrast, the task of depth completion has not received such attention, despite the inherently noisy nature of depth sensors. In this work, we therefore focus on modeling the uncertainty of depth data in depth completion, from the sparse noisy input all the way to the final prediction. We propose a novel approach to identify disturbed measurements in the input by learning an input confidence estimator in a self-supervised manner based on normalized convolutional neural networks (NCNNs). Further, we propose a probabilistic version of NCNNs that produces a statistically meaningful uncertainty measure for the final prediction. When we evaluate our approach on the KITTI dataset for depth completion, we outperform all existing Bayesian deep learning approaches in terms of prediction accuracy, quality of the uncertainty measure, and computational efficiency. Moreover, our small network with 670k parameters performs on par with conventional approaches with millions of parameters. These results give strong evidence that separating the network into parallel uncertainty and prediction streams leads to state-of-the-art performance with accurate uncertainty estimates.

    Comment: CVPR 2020 (8 pages + supplementary)
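
    A minimal sketch of the normalized-convolution idea that NCNNs build on, under our own simplifications (a single fixed-kernel layer in NumPy/SciPy rather than the paper's learned, self-supervised network):

```python
import numpy as np
from scipy.signal import convolve2d

def normalized_convolution(depth, confidence, kernel, eps=1e-8):
    """One normalized-convolution step for sparse depth.

    Missing pixels carry confidence 0 and observed ones confidence 1, so
    the filter response is averaged only over actual measurements.
    """
    num = convolve2d(depth * confidence, kernel, mode="same")
    den = convolve2d(confidence, kernel, mode="same")
    out = num / (den + eps)           # densified depth estimate
    out_conf = den / kernel.sum()     # propagated confidence in [0, 1]
    return out, out_conf

# Hypothetical sparse input: roughly 5% of pixels carry a depth reading.
rng = np.random.default_rng(0)
depth = rng.uniform(1.0, 10.0, size=(64, 64))
conf = (rng.random((64, 64)) < 0.05).astype(float)
kernel = np.ones((5, 5)) / 25.0
dense, new_conf = normalized_convolution(depth, conf, kernel)
```

    In the paper's approach, both the input confidences and the filters are learned, and a probabilistic NCNN variant turns the propagated confidence into a statistically meaningful uncertainty for the final prediction.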

    The Cost-effectiveness of Treating Diabetic Lower Extremity Ulcers with Becaplermin (Regranex): A Core Model with an Application Using Swedish Cost Data

    Objectives: The objective of this study was to develop a model capable of assessing the cost-effectiveness in Sweden of treating diabetic neuropathic lower extremity ulcers with becaplermin gel (Regranex) plus good wound care (GWC) relative to treating them with GWC alone. Methods: A Markov simulation model was developed that includes six health states: Uninfected Ulcer, Infected Ulcer, Gangrene, Healed Ulcer, Healed Ulcer-History of Amputation, and Deceased. To predict clinical outcomes, information was taken from a specially designed prospective 9-month follow-up study of 183 neuropathic patients in the US treated with GWC. Cost of treatment data were taken primarily from a study of a cohort of 314 patients in Sweden. The efficacy of becaplermin was assumed equal to that achieved in a pooled analysis of four randomized clinical trials. A model application provides expected clinical outcomes for a cohort of patients. Annual treatment costs per patient were estimated using treatment practice and unit prices from Sweden. Results: Due to a higher rate of healing and a shorter average healing time, treatment with becaplermin gel was predicted to increase the average number of months spent in the healed state over the first year following development of an ulcer by 24% relative to GWC alone. In addition, the corresponding number of amputations was 9% lower for the becaplermin-treated cohort. The average expected cost of $12,078 US for an individual treated with GWC alone declines to $11,708 US for one treated with becaplermin, in spite of $1,262 in becaplermin costs. Expenses related to topical treatment and inpatient care account for 83% of the resources conserved. Conclusion: Our results suggest that in Sweden treatment with becaplermin in conjunction with GWC consumes fewer resources and generates better outcomes than treatment with GWC alone for diabetic neuropathic ulcers. In light of the high and increasing incidence of such ulcers, the potential savings in costs and suffering may be important. Results are difficult to extrapolate internationally because they are strongly related to country-specific treatment practices and price levels
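
    A minimal sketch of a Markov cohort simulation of the kind described (the state names are from the abstract, but the monthly transition probabilities and per-state costs below are made-up placeholders, not the study's estimates):

```python
import numpy as np

# Health states, in order (from the abstract).
states = ["Uninfected Ulcer", "Infected Ulcer", "Gangrene",
          "Healed Ulcer", "Healed Ulcer-History of Amputation", "Deceased"]

# Hypothetical monthly transition matrix (each row sums to 1). The study
# estimated such probabilities from a 9-month follow-up of 183 patients.
P = np.array([
    [0.80, 0.08, 0.02, 0.09, 0.00, 0.01],
    [0.20, 0.60, 0.08, 0.10, 0.00, 0.02],
    [0.00, 0.00, 0.55, 0.00, 0.40, 0.05],
    [0.05, 0.00, 0.00, 0.94, 0.00, 0.01],
    [0.00, 0.00, 0.02, 0.00, 0.96, 0.02],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])

# Hypothetical monthly cost per state (USD).
cost = np.array([900.0, 1500.0, 4000.0, 50.0, 120.0, 0.0])

x = np.array([1.0, 0, 0, 0, 0, 0])  # cohort starts with an uninfected ulcer
total_cost, months_healed = 0.0, 0.0
for _ in range(12):                  # one-year horizon, as in the abstract
    x = x @ P                        # advance the cohort by one cycle
    total_cost += x @ cost
    months_healed += x[3] + x[4]     # time spent in either healed state

print(f"expected 12-month cost per patient: ${total_cost:,.0f}")
print(f"expected months in a healed state: {months_healed:.1f}")
```

    Comparing two such runs, one with GWC-alone parameters and one with becaplermin-adjusted healing rates, yields the kind of cost and healed-time differences reported above.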