4,957 research outputs found

    FPGA Acceleration of Mean Variance Framework for Optimal Asset Allocation

    Get PDF
    Asset classes respond differently to shifts in financial markets, so an investor can minimize the risk of loss and maximize the return of a portfolio by diversifying its assets. Increasing the number of diversified assets in a financial portfolio significantly improves the optimal allocation of the different assets, giving better investment opportunities. However, a large number of assets requires a significant amount of computation that only high-performance computing can currently provide. Because of the highly parallel nature of Markowitz's mean-variance framework (the most popular approximation approach for optimal asset allocation), an FPGA implementation of the framework can also provide the performance necessary to compute the optimal asset allocation for a large number of assets. In this work, we propose an FPGA implementation of Markowitz's mean-variance framework and show that it has a potential performance ratio of 221 times over a software implementation.
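
    As a concrete illustration of the mean-variance framework itself (independent of the FPGA implementation described above), the following minimal NumPy sketch computes the closed-form Markowitz weights that minimise portfolio variance for a given target return; the expected returns and covariance matrix are hypothetical example values.

```python
import numpy as np

def mean_variance_weights(mu, sigma, target_return):
    """Closed-form Markowitz weights minimising portfolio variance
    subject to a target expected return and full investment."""
    ones = np.ones(len(mu))
    inv = np.linalg.inv(sigma)
    a = ones @ inv @ ones
    b = ones @ inv @ mu
    c = mu @ inv @ mu
    d = a * c - b ** 2
    lam = (a * target_return - b) / d
    gam = (c - b * target_return) / d
    return lam * (inv @ mu) + gam * (inv @ ones)

# Illustrative data (hypothetical expected returns and covariance matrix)
mu = np.array([0.08, 0.12, 0.10])
sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.15, 0.03],
                  [0.01, 0.03, 0.12]])
w = mean_variance_weights(mu, sigma, target_return=0.10)
print(w, w.sum())  # the weights sum to 1
```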

    Distributed model predictive control of steam/water loop in large scale ships

    Get PDF
    In modern steam power plants, the ever-increasing complexity requires great reliability and flexibility of the control system. Hence, this paper studies the feasibility of a distributed model predictive control (DiMPC) strategy within an extended prediction self-adaptive control (EPSAC) framework, in which multiple controllers allow each sub-loop to have its own requirement flexibility. Meanwhile, model predictive control can guarantee good performance for the system under constraints. The performance is compared against a decentralized model predictive control (DeMPC) and a centralized model predictive control (CMPC). In order to improve the computing speed, a multiple-objective model predictive control (MOMPC) is proposed. For the stability of the control system, the convergence of the DiMPC is discussed. Simulation tests are performed on the five different sub-loops of the steam/water loop. The results indicate that the DiMPC may achieve performance similar to CMPC while outperforming the DeMPC method.
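
    The abstract refers to model predictive control of the individual sub-loops; the sketch below is a generic, unconstrained receding-horizon MPC step for a linear SISO model, not the EPSAC/DiMPC formulation of the paper, and all model matrices are illustrative values.

```python
import numpy as np

def mpc_step(A, B, C, x0, setpoint, horizon=10, u_weight=0.1):
    """One receding-horizon step for a SISO linear model x+ = A x + B u, y = C x.
    Unconstrained least-squares solution; only the first input of the optimal
    sequence is applied, then the horizon recedes."""
    # Free response: y_free[i] = C A^(i+1) x0
    F = np.array([(C @ np.linalg.matrix_power(A, i + 1) @ x0).item()
                  for i in range(horizon)])
    # Forced response matrix: G[i, j] = C A^(i-j) B for j <= i
    G = np.zeros((horizon, horizon))
    for i in range(horizon):
        for j in range(i + 1):
            G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()
    ref = np.full(horizon, setpoint)
    # Regularised least squares over the input sequence
    H = G.T @ G + u_weight * np.eye(horizon)
    u_seq = np.linalg.solve(H, G.T @ (ref - F))
    return u_seq[0]

# Toy second-order sub-loop model (hypothetical numbers)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
print(mpc_step(A, B, C, x0=np.array([0.0, 0.0]), setpoint=1.0))
```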

    Towards the rational use of antibiotics: Utilising pharmacometric approaches to improve meropenem and piperacillin treatment in critically ill patients

    Get PDF
    In 1909 the discovery of the antibiotic arsphenamine marked the beginning of a new era in treating potentially deadly bacterial infections. In the following decades, the discovery of various new antibiotic drugs substantially contributed to a rise in life expectancy from 47.0 to 78.8 years in the United States of America. Despite this considerable progress in treating infectious diseases, bacterial infections remain a major threat to public health. Especially vulnerable patient populations, such as critically ill patients, continue to suffer from mortality rates of up to 60%. Worryingly, the described achievements are threatened by two alarming developments: while no truly novel antibiotic classes have been discovered and developed in the last three decades, the emergence and spread of antimicrobial resistance - accelerated by the inappropriate use of antibiotic drugs - steadily reduces the efficacy of currently available drugs. As a response to this challenge, several national and international action plans call not only for a determined search for new antimicrobial drugs, but also for a more rational use of existing antibiotics. One vital component of rational antibiotic drug therapy is adequate drug exposure at the site of infection, facilitated by the selection of suitable antibiotic drug(s) in combination with an appropriate dosing regimen. The antibiotic drug administered to the patient should be selected based on its efficacy against the pathogen causing the infection. Unfortunately, the pathogen causing the infection is often unknown at the start of antibiotic therapy. As a consequence, broad-spectrum antibiotics - like meropenem and piperacillin/tazobactam - are frequently administered to increase the likelihood of an effective therapy. The selection of an appropriate dosing regimen can be complicated and is especially challenging in critically ill patients: the broad range of pathophysiological changes observed in this patient population leads to high pharmacokinetic (PK) variability, which results in substantial differences in drug exposure between patients receiving the same antibiotic drug and dosing regimen. Under the concept of model-informed precision dosing (MIPD), population pharmacokinetic/pharmacodynamic models and patient-specific data (e.g. patient characteristics, drug measurement(s)) can be leveraged to inform and improve dosing decisions in this vulnerable patient population. The objective of the presented thesis was the development, implementation and evaluation of MIPD tools for antibiotic drugs in critically ill patients. To enable the successful integration of MIPD into clinical practice, an iterative, integrative and translational approach was followed. The initial and central question, 'Is the current antibiotic dosing appropriate?', was addressed iteratively, integrating expertise from a diverse interprofessional team of healthcare professionals, and can be segmented into four intermediate steps, all vital to the main objective. First, and as a prerequisite both for model development/evaluation and dosing adaptation, the establishment of a reliable and frequent antibiotic concentration measurement program was required. Second, the collected data were analysed employing pharmacometric and statistical methodology to characterise the population PK/pharmacodynamics (PD) and local factors influencing antibiotic therapy (e.g. local pathogen susceptibility).
Third, the gained scientific knowledge was translated into easy-to-use, model-informed dosing tools and comprehensive dosing strategies optimised for clinical practice. Fourth, the developed model-informed dosing tools were implemented in clinical routine and subsequently evaluated and optimised. This thesis focused on meropenem and the fixed drug combination piperacillin/tazobactam and addressed one or more of these four steps in three different projects. In Project I, a possible adsorption of the antibiotic meropenem at the cytokine adsorber CytoSorb®, its effect on meropenem exposure and the possible consequences for adequate meropenem dosing were investigated. Despite the absence of clear evidence for a beneficial effect on patient outcomes, the CytoSorb® filter is increasingly used to reduce circulating cytokines in patients experiencing sepsis. Due to its unspecific binding, and therefore elimination, of molecules with a molar mass of up to 55 kDa, concerns have been raised that the CytoSorb® filter unintentionally adsorbs various drugs, including meropenem. To investigate whether meropenem dosing needs to be increased during CytoSorb® treatment, a nonlinear mixed-effects (NLME) modelling and simulation approach was employed: a population pharmacokinetic model was developed and three distinct approaches were applied to assess whether meropenem clearance differed with or without CytoSorb® treatment: (i) quantification of a possible proportional increase in clearance during CytoSorb® treatment, (ii) investigation of (non-)saturable adsorption at the CytoSorb® filter using different adsorption submodels, and (iii) re-estimation of model parameters excluding samples collected during CytoSorb® treatment followed by evaluation of the predictive performance for meropenem concentrations during CytoSorb® treatment. In contrast to the expectation of meropenem being adsorbed at the CytoSorb® filter, no significant (p<0.05) or relevant effect of CytoSorb® treatment on meropenem exposure was observed. Consequently, neither additional dosing nor more frequent drug concentration monitoring of meropenem is necessary during CytoSorb® therapy. Project II focused on improving meropenem and piperacillin/tazobactam treatment for critically ill patients at Charité-Universitätsmedizin Berlin. For this purpose, a three-stage clinical study was initiated as a coordinated intervention. In stage I, a frequent and reliable concentration measurement program was implemented to evaluate the current antibiotic therapy. The assessment of the current antibiotic therapy provided insights into local pathogen susceptibility, while highlighting the need for dose individualisation based on patient characteristics: the majority (>90%) of observed pathogens were susceptible to the two administered antibiotic drugs, but target range attainment (minimum antibiotic drug concentrations between 1 and 5 times the minimum inhibitory concentration (MIC) of the pathogen) was low for the observed drug concentrations (meropenem: 35.7%, piperacillin: 50.5%) and highly variable between patients with different renal functions. To improve initial meropenem dosing (i.e. prior to the first concentration measurement) and to exploit the newly gained information about the local pathogen susceptibility, a tabular model-informed dosing tool was developed and implemented in stage II of the study.
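
    For readers unfamiliar with the modelling machinery used in Project I, the following sketch simulates a simple one-compartment infusion model with a hypothetical proportional covariate effect of CytoSorb® treatment on clearance (the quantity that approach (i) above would estimate). It is a simplified stand-in, not the NLME model developed in the thesis, and all parameter values are illustrative.

```python
import numpy as np

def one_compartment_infusion(t, dose, t_inf, cl, v):
    """Concentration-time profile of a one-compartment model with
    zero-order infusion and first-order elimination (analytical solution)."""
    k = cl / v
    rate = dose / t_inf
    during = (rate / cl) * (1 - np.exp(-k * t))
    c_end = (rate / cl) * (1 - np.exp(-k * t_inf))
    after = c_end * np.exp(-k * (t - t_inf))
    return np.where(t <= t_inf, during, after)

# Hypothetical typical values; theta_cyto is the proportional change in
# clearance during CytoSorb treatment (set to 0, i.e. no effect, as found).
t = np.linspace(0, 8, 200)                   # time after dose start [h]
cl_typ, v_typ, theta_cyto = 10.0, 25.0, 0.0  # clearance [L/h], volume [L]
c_no_cyto = one_compartment_infusion(t, dose=1000, t_inf=0.5, cl=cl_typ, v=v_typ)
c_cyto = one_compartment_infusion(t, dose=1000, t_inf=0.5,
                                  cl=cl_typ * (1 + theta_cyto), v=v_typ)
```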
For the development of the tool, an appropriate meropenem PK model was selected from the literature and successfully evaluated using the local clinical data. The PK model was then used to conduct stochastic simulations investigating clinically relevant dosing regimens, possible clinical scenarios and the probability of the dosing regimens to achieve adequate drug exposures. To inform dosing prior to pathogen identification, the local pathogen-independent mean fraction of response (LPIFR) was introduced: the LPIFR characterises the probability of a dosing regimen to reach a defined target, e.g. time above the MIC, if only the underlying MIC distribution at a hospital, and not the individual MIC of the pathogen causing the infection, is known. To inform dosing after MIC value determination, probability of target attainment (PTA) analyses were performed. Dosing recommendations achieving PTA>90% or LPIFR>90% for patients with different creatinine clearances (10.0-300 mL/min) were derived and summarised in one concise and clear table. To assess the potential of the newly developed model-informed dosing tool prior to implementation, the total daily dose of the dosing regimens recommended by the dosing tool for the local study population was compared to the total daily dose of the actually administered dosing regimens. For 77% of the patients with meropenem concentrations outside the target range, the dosing tool suggested a change in daily dose, highlighting the potential of the tool to optimise dosing regimens. To integrate patient-individual antibiotic drug measurements and allow for more user flexibility, an interactive model-informed dosing software termed 'DoseCalculator' was developed for stage III of the study. In addition to the meropenem PK model already evaluated for stage II of the study, different piperacillin/tazobactam models were extracted from the literature and evaluated using the local clinical data collected in stage I of the study, and the best-performing model was implemented into the tool. Based on the available knowledge about the infection, three possibilities to calculate the probability of a dosing regimen to reach adequate antibiotic exposures were integrated into the tool: (i) the LPIFR if neither the pathogen nor the MIC is available, (ii) the cumulative fraction of response (CFR) based on the MIC distribution of a specific pathogen if the pathogen is available, and (iii) the PTA if the MIC is available. Furthermore, employing a maximum a posteriori (MAP) estimation approach, the observed antibiotic drug measurement(s) of a patient can be used in the DoseCalculator to derive patient-individual parameter estimates. If drug measurement(s) of a patient are supplied, all analyses and the resulting recommended dosing regimen are based on the individual parameter estimates of the patient. Compared to the observed dosing in stage I, the recommendations of the DoseCalculator led to a substantial relative increase in predicted target attainment (322% meropenem, 505% piperacillin) while reducing the daily dose (median reduction: 77.8% meropenem, 83.4% piperacillin). In Project III, the MeroRisk Calculator, an easy-to-use Excel tool previously developed at our department to determine the risk of meropenem target non-attainment after standard dosing, was evaluated using clinical routine data.
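
    To make the PTA and LPIFR concepts concrete, the following sketch estimates the probability that a steady-state trough falls within the 1 to 5 times MIC target window by Monte Carlo simulation with log-normal between-patient variability, and weights the PTA over a hypothetical local MIC distribution to obtain an LPIFR-style summary; the PK parameters, dosing regimen and MIC distribution are assumed example values, not those of the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def steady_state_trough(dose, t_inf, tau, cl, v):
    """Steady-state trough of a one-compartment model with intermittent infusion."""
    k = cl / v
    rate = dose / t_inf
    return ((rate / cl) * (1 - np.exp(-k * t_inf))
            * np.exp(-k * (tau - t_inf)) / (1 - np.exp(-k * tau)))

def pta(dose, tau, mic, n=10_000, cl_typ=10.0, v_typ=25.0, omega=0.3):
    """Probability that the trough lies in the 1-5 x MIC target window,
    with log-normal between-patient variability on CL and V (hypothetical values)."""
    cl = cl_typ * np.exp(omega * rng.standard_normal(n))
    v = v_typ * np.exp(omega * rng.standard_normal(n))
    trough = steady_state_trough(dose, 0.5, tau, cl, v)
    return np.mean((trough >= mic) & (trough <= 5 * mic))

# LPIFR-style summary: PTA weighted by a hypothetical local MIC distribution
mic_values = np.array([0.25, 1.0, 2.0, 8.0])   # mg/L
mic_freq   = np.array([0.50, 0.30, 0.15, 0.05])
lpifr = sum(f * pta(1000, 8, m) for m, f in zip(mic_values, mic_freq))
print(f"LPIFR for 1 g q8h: {lpifr:.2f}")
```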
Since the direct evaluation of the MeroRisk Calculator was not feasible with the available retrospective clinical dataset, a two-step data- and model-based evaluation was conducted: in step one, a meropenem PK model was successfully evaluated using the clinical data. In step two, the evaluated PK model was used as a benchmark for the drug concentration and risk predictions of the MeroRisk Calculator. Compared to the successfully evaluated compartmental PK model, the MeroRisk Calculator provided an equally good and reliable risk assessment (Lin's concordance correlation coefficient = 0.99) for patients with maintained renal function (creatinine clearance > 50 mL/min). However, for patients with creatinine clearances below 50 mL/min, significant deviations were observed. As a consequence, the MeroRisk Calculator should not be used in patients with (severe) renal impairment. In addition to the successful evaluation, the functionality of the MeroRisk Calculator was extended: based on CFR analysis and EUCAST-reported MIC value distributions, the risk of target non-attainment can now be assessed depending on the infecting pathogen, informing dosing decisions prior to MIC value determination. To conclude, the presented thesis contributed to an individualised and more rational antibiotic drug therapy in critically ill patients. While PK modelling was employed in Project I to exclude a clinically relevant adsorption of meropenem at the CytoSorb® filter, Project II and Project III represent successful examples of the development, implementation and evaluation of MIPD tools. As a next step, both the tabular model-informed dosing tool and the DoseCalculator should be prospectively evaluated at Charité-Universitätsmedizin Berlin. The results of this evaluation, and this thesis as a whole, demonstrate the potential of MIPD, provide comprehensive examples of how to develop, implement and evaluate model-informed dosing tools, and contribute to the accelerating implementation of MIPD in clinical practice.
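
    Lin's concordance correlation coefficient, used above to benchmark the MeroRisk Calculator against the compartmental PK model, can be computed as in the sketch below; the two risk series in the usage example are purely illustrative.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient: agreement between two
    measurement/prediction series (1 = perfect concordance)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()              # population variances (ddof=0)
    sxy = np.mean((x - mx) * (y - my))     # population covariance
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

# Toy check with two hypothetical risk series
a = np.array([0.10, 0.25, 0.40, 0.80])
b = np.array([0.12, 0.24, 0.41, 0.78])
print(round(lins_ccc(a, b), 3))
```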

    Kassiopeia: A Modern, Extensible C++ Particle Tracking Package

    Full text link
    The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority placed on calculation efficiency, customizability, extensibility, and ease of use for novice programmers. To solve Kassiopeia's target physics problem, the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion; continuous physics processes that may in part be modeled as terms perturbing that equation of motion; stochastic processes that occur in flight, such as bulk scattering and decay; and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia, it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
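
    Kassiopeia itself is a C++ package; purely as an illustration of the underlying tracking problem (integrating an equation of motion through electromagnetic fields), the following Python sketch propagates a charged particle under the Lorentz force with a classical RK4 integrator. The field configuration and step sizes are arbitrary example values, and the code does not reflect Kassiopeia's actual API.

```python
import numpy as np

def lorentz_acceleration(t, pos, vel, q_over_m, e_field, b_field):
    """Acceleration from the Lorentz force: a = (q/m) (E + v x B)."""
    return q_over_m * (e_field(pos, t) + np.cross(vel, b_field(pos, t)))

def rk4_track(pos, vel, q_over_m, e_field, b_field, dt, steps):
    """Classical fixed-step RK4 integration of a charged-particle trajectory."""
    traj = [pos.copy()]
    t = 0.0
    for _ in range(steps):
        def deriv(t_, p, v):
            return v, lorentz_acceleration(t_, p, v, q_over_m, e_field, b_field)
        k1p, k1v = deriv(t, pos, vel)
        k2p, k2v = deriv(t + dt/2, pos + dt/2*k1p, vel + dt/2*k1v)
        k3p, k3v = deriv(t + dt/2, pos + dt/2*k2p, vel + dt/2*k2v)
        k4p, k4v = deriv(t + dt,   pos + dt*k3p,   vel + dt*k3v)
        pos = pos + dt/6*(k1p + 2*k2p + 2*k3p + k4p)
        vel = vel + dt/6*(k1v + 2*k2v + 2*k3v + k4v)
        t += dt
        traj.append(pos.copy())
    return np.array(traj)

# Electron drifting in a uniform magnetic field (hypothetical fields, SI units)
q_over_m = -1.758820e11
traj = rk4_track(np.zeros(3), np.array([1e6, 0.0, 1e5]), q_over_m,
                 e_field=lambda p, t: np.zeros(3),
                 b_field=lambda p, t: np.array([0.0, 0.0, 1e-3]),
                 dt=1e-11, steps=1000)
```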

    Towards virtual machine energy-aware cost prediction in clouds

    Get PDF
    Pricing mechanisms employed by different service providers significantly influence the role of cloud computing within the IT industry. With the increasing cost of electricity, Cloud providers consider power consumption one of the major cost factors to be managed within their infrastructures. Consequently, modelling a new pricing mechanism that allows Cloud providers to determine the potential cost of resource usage and power consumption has attracted the attention of many researchers. Furthermore, predicting the future cost of Cloud services can help service providers offer customers suitable services that meet their requirements. This paper introduces an Energy-Aware Cost Prediction Framework to estimate the total cost of Virtual Machines (VMs) by considering both resource usage and power consumption. The VMs' workload is first predicted using an Autoregressive Integrated Moving Average (ARIMA) model; the power consumption is then predicted using regression models. The comparison between the predicted and actual results obtained in a real Cloud testbed shows that this framework is capable of predicting the workload, power consumption and total cost for different VMs with good prediction accuracy, e.g. a 0.06 absolute percentage error for the predicted total cost of a VM.
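
    A minimal sketch of the two-step prediction idea described above - ARIMA for the workload, regression for the power, and a simple tariff for the cost - might look as follows; the utilisation history, regression data, interval length and electricity price are all hypothetical, and the ARIMA order is chosen purely for illustration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.linear_model import LinearRegression

# Hypothetical history: VM CPU utilisation (%) sampled once per interval
cpu_history = np.array([20, 22, 25, 30, 28, 35, 40, 38, 42, 45, 50, 48], float)

# 1) Forecast the workload with an ARIMA model (order chosen for illustration)
workload_fc = ARIMA(cpu_history, order=(1, 1, 1)).fit().forecast(steps=3)

# 2) Map utilisation to power with a regression model trained on past observations
power_history = 100 + 1.5 * cpu_history \
    + np.random.default_rng(0).normal(0, 2, cpu_history.size)   # synthetic watts
reg = LinearRegression().fit(cpu_history.reshape(-1, 1), power_history)
power_fc = reg.predict(np.asarray(workload_fc).reshape(-1, 1))  # predicted watts

# 3) Convert predicted energy to cost (hypothetical interval length and tariff)
interval_h, price_per_kwh = 1.0, 0.20
energy_kwh = power_fc * interval_h / 1000.0
print("predicted cost per interval:", energy_kwh * price_per_kwh)
```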

    Improving paper machine clothing supplier's industrial internet offering with artificial intelligence

    Get PDF
    The overall amount of data has grown exponentially over the last few years. The increase in the availability of data has driven companies and countries towards digitalization at a growing pace, and industrial internet applications have become more successful than ever. These applications provide companies with more tools for data-driven decision-making. In the paper industry, paper machine original equipment manufacturers have started to utilize industrial internet capabilities at an increasing pace. Owing to the increasing competition, utilization of the possibilities offered by the industrial internet is today part of the target organization's (Valmet) main strategies. Thus, the paper machine clothing (PMC) unit of Valmet has commissioned this thesis work. The goal of this research was to improve Valmet's PMC unit's industrial internet offering. The improvement actions taken were to enhance the existing offering through customer feedback and to provide additional value with artificial intelligence. The approach was to review the existing theory behind the operational context of the fabrics, to discover possible development actions through prototyping, and to create value-adding AI models to support the offering. During the research process it became evident that the initial industrial internet applications would be well applicable to the pilot customer's daily routines. Although good development points were discovered during the prototyping phase, functionality issues of the initial industrial internet applications during the timeframe of this thesis limited the quality of the feedback. A more thorough customer feedback study should be conducted after the applications have been in daily use for a solid amount of time. This research provided two value-adding models for the industrial internet applications. The idea for the models sprang from the wishes of the target company. Previously, fabric delivery cycles had been defined more or less by hand. Thus, the first model illustrated a Monte Carlo simulation to optimize delivery cycles and to manage the risk of possible shortages. The second model aimed to enhance the first by estimating remaining fabric lifetime from the customer mill's process data. A neural network was chosen as the machine learning method for this model. Both models were tested with actual process data, and the results of the case study were mixed. The simulation model provided valid results, and first indications showed that it would bring true added value to the target organization. However, the results of the second model indicated that valid results could not be obtained with the available data. The results of this study indicate that artificial intelligence models can be utilized in the fabrics' industrial internet offering, but more emphasis should be placed on comparing different machine learning methods and on improving the quality and quantity of the available data.
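
    A minimal sketch of the first (Monte Carlo) model described above might look as follows: fabric lifetimes are drawn from an assumed normal distribution, and the longest delivery cycle whose shortage risk stays below a tolerated level is selected. The lifetime distribution, candidate cycles and risk threshold are all illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

def shortage_risk(delivery_cycle_days, life_mean, life_sd, n=100_000):
    """Monte Carlo estimate of the probability that a fabric wears out
    before its scheduled replacement arrives (hypothetical normal lifetime model)."""
    lifetimes = rng.normal(life_mean, life_sd, n)
    return np.mean(lifetimes < delivery_cycle_days)

# Scan candidate delivery cycles and pick the longest one that keeps
# the shortage risk below a tolerated level (all numbers illustrative).
candidates = range(30, 91, 5)
tolerated = 0.05
feasible = [d for d in candidates
            if shortage_risk(d, life_mean=75, life_sd=10) < tolerated]
print("longest acceptable delivery cycle:", max(feasible), "days")
```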