2,342 research outputs found

    A Review of Bayesian Methods in Electronic Design Automation

    The utilization of Bayesian methods has been widely acknowledged as a viable solution for tackling various challenges in electronic integrated circuit (IC) design under stochastic process variation, including circuit performance modeling, yield/failure rate estimation, and circuit optimization. As the post-Moore era brings about new technologies (such as silicon photonics and quantum circuits), many of the associated issues are similar to those encountered in electronic IC design and can be addressed with Bayesian methods. Motivated by this observation, we present a comprehensive review of Bayesian methods in electronic design automation (EDA). By doing so, we hope to equip researchers and designers with the ability to apply Bayesian methods to stochastic problems in electronic circuits and beyond. Comment: 24 pages, a draft version. We welcome comments and feedback, which can be sent to [email protected]

    Circuit Design

    Circuit Design = Science + Art! Designers need a skilled "gut feeling" about circuits and related analytical techniques, plus creativity, to solve all problems and to adhere to the specifications, the written and the unwritten ones. You must anticipate a large number of influences, like temperature effects, supply voltage changes, offset voltages, layout parasitics, and numerous kinds of technology variations, to end up with a circuit that works. This is challenging for analog, custom-digital, mixed-signal, and RF circuits, and researching new design methods in relevant journals, conference proceedings, and design tools unfortunately often gives the impression that just a "wild bunch" of "advanced techniques" exists. On the other hand, state-of-the-art tools nowadays indeed offer a good cockpit to steer the design flow, including clever statistical methods and optimization techniques. Actually, this almost represents a second breakthrough, like the introduction of circuit simulators 40 years ago! Users can now conveniently analyse all the problems (discover, quantify, verify), and even exploit them, for example for optimization purposes. Most designers are caught up in everyday problems, so we fit that "wild bunch" into a systematic approach for variation-aware design, a designer's field guide and more. That is where this book can help! Circuit Design: Anticipate, Analyze, Exploit Variations starts with best-practice manual methods and links them tightly to up-to-date automation algorithms. We provide many tractable examples and explain key techniques you have to know. We then enable you to select and set up suitable methods for each design task - knowing their prerequisites, advantages and, as too often overlooked, their limitations as well.
The good thing about computers is that you can often verify amazing things yourself with little effort, and you can use software not only to your direct advantage in solving a specific problem, but also to become a better skilled, more experienced engineer. Unfortunately, EDA design environments are not well suited for learning about advanced numerics. With this book we therefore also provide two apps for learning about statistics and optimization directly with circuit-related examples, in real time and without long simulation times. This helps to develop a healthy statistical gut feeling for circuit design. The book is written for engineers, students in engineering, and CAD/methodology experts. Readers should have some background in standard design techniques, like entering a design in schematic capture and simulating it, and should also know about major technology aspects.

    Design Methods for Reducing Failure Probabilities with Examples from Electrical Engineering

    This thesis addresses the quantification of uncertainty and optimization under uncertainty. We focus on uncertainties in the manufacturing process of devices, e.g. those caused by manufacturing imperfections, natural material deviations, or environmental influences. These uncertainties may lead to deviations in the geometry or the materials, which in turn may cause deviations in the operation of the device. The term yield refers to the fraction of realizations in a manufacturing process under uncertainty that fulfill all performance requirements. It is the counterpart of the failure probability (yield = 1 - failure probability) and serves as a measure of (un)certainty. The main goal of this work is to efficiently estimate and to maximize the yield. In this way, we increase the reliability of designs, which reduces rejects of devices due to malfunction and hence saves resources, money, and time. One main challenge in the field of yield estimation is reducing the computing effort while maintaining high accuracy. In this work we propose two hybrid yield estimation methods. Both are sampling-based and evaluate most of the sample points on a surrogate model, while only a small subset of so-called critical sample points is evaluated on the original high-fidelity model. The SC-Hybrid approach is based on stochastic collocation and adjoint error indicators. The non-intrusive GPR-Hybrid approach uses Gaussian process regression and allows surrogate model updates on the fly. For efficient yield optimization we propose the adaptive Newton-Monte-Carlo (Newton-MC) method, where the sample size is adaptively increased. Another topic is the optimization of problems with mixed gradient information, i.e., problems where the derivatives of the objective function are available with respect to some optimization variables, but not all. Using gradient-based solvers like the adaptive Newton-MC would then require the costly approximation of the missing derivatives.
We propose two methods for this case: the Hermite least squares and the Hermite BOBYQA optimization. Both are modifications of the originally derivative-free BOBYQA (Bound constrained Optimization BY Quadratic Approximation) method, but are able to handle derivative information and use least squares regression instead of interpolation. In addition, an advantage of the Hermite-type approaches is their robustness in the case of noisy objective functions. The global convergence of these methods is proven. In the context of yield optimization, the case of mixed gradient information is particularly relevant if - besides Gaussian-distributed uncertain optimization variables - there are deterministic or non-Gaussian-distributed uncertain optimization variables. The proposed methods can be applied to any design process affected by uncertainties. In this work, however, we focus on the design of electrotechnical devices. We evaluate the approaches on two benchmark problems, a rectangular waveguide and a permanent magnet synchronous machine (PMSM). Significant savings in computing effort can be observed in yield estimation and in single- and multi-objective yield optimization. This enables the application of design optimization under uncertainty in industry.
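
The yield definition above (yield = 1 - failure probability) can be illustrated with a minimal Monte Carlo estimator. This is only a sketch: the quadratic performance function, the Gaussian parameter distribution, and the specification threshold are hypothetical stand-ins for the expensive device simulations used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def performance(x):
    # Hypothetical performance function of one uncertain design parameter,
    # a stand-in for an expensive high-fidelity device simulation.
    return 10.0 - 0.5 * x**2

# Uncertain parameter: nominal value 1.0 with Gaussian manufacturing scatter.
samples = rng.normal(loc=1.0, scale=0.8, size=100_000)

# A realization fulfills the performance requirement if performance >= 8.
ok = performance(samples) >= 8.0

yield_estimate = ok.mean()                  # fraction of good realizations
failure_probability = 1.0 - yield_estimate  # yield = 1 - failure probability
```

The hybrid methods proposed in the thesis reduce the cost of exactly this loop by evaluating most samples on a surrogate instead of the expensive `performance` model.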

    A blackbox yield estimation workflow with Gaussian process regression applied to the design of electromagnetic devices

    In this paper, an efficient and reliable method for stochastic yield estimation is presented. Since one main challenge of uncertainty quantification is computational feasibility, we propose a hybrid approach where most of the Monte Carlo sample points are evaluated with a surrogate model, and only a few sample points are re-evaluated with the original high-fidelity model. Gaussian process regression, a non-intrusive method, is used to build the surrogate model. Without many prerequisites, this gives us not only an approximation of the function value, but also an error indicator that we can use to decide whether a sample point should be re-evaluated or not. For two benchmark problems, a dielectric waveguide and a lowpass filter, the proposed methods outperform classic approaches.

    Advancing process development for antibody-drug conjugates - Incorporation of high-throughput, analytical, and digital tools

    Antibody-drug conjugates (ADCs) have been designed as a combination of monoclonal antibody (mAb) therapy and chemotherapy, and thus have the potential to unite the advantages of both strategies in one molecule. mAbs have the ability to bind their target antigen specifically, focusing the effect on the target site of action. Due to their size and other biochemical properties, they have a good circulation half-life in the body, which is an important pharmacokinetic property. While mAbs are applied in various therapeutic fields, they form a highly important part of modern oncology. Here, mAbs are used to target antigens that are highly expressed on cancer cells, exhibiting different modes of action to fight the cancer. In order to increase their capacity for killing cancer cells, small cytotoxic molecules, as applied in chemotherapy, can be covalently attached to the mAbs, forming ADCs. Due to the decreased systemic exposure, drug molecules with higher cytotoxicity can be used. Motivated by this potential and the market approval of the first successful products in 2011 and 2013, ADCs have gained a lot of attention. By the end of 2019, there were already six products on the market and over 60 candidates in clinical trials. Substantial progress has been made in areas like the development of new cytotoxic drugs, linker chemistries, and conjugation strategies. Despite these successes, the development of new ADCs remains challenging. Unfavorable pharmacokinetic profiles caused by the hydrophobic nature of the drugs, as well as heterogeneity in the degree and site of conjugation, are factors that are being improved for current ADCs. Solutions include, for example, site-specific conjugation strategies. Still, the number of parameters for optimization is high for these complex hybrid molecules. Issues range from the choice of antibody, drug, and linker, through the attachment chemistry, to the optimal drug-to-antibody ratio (DAR).
In order to unlock the full potential of ADCs, efficient, knowledge-based process development is necessary. Looking at the current landscape of biopharmaceutical development, it is also evident that there is high pressure on process developers to efficiently deliver robust processes while gathering enhanced knowledge on process and product. One reason is the diversification of the product pipeline caused by emerging new modalities like ADCs, other antibody formats, or cell and gene therapy, which increases development efforts and hinders the use of platform approaches. In addition, time to market becomes more critical with rising development costs and growing global competition, for example from producers of so-called biosimilars. Finally, regulatory agencies like the U.S. Food and Drug Administration and the European Medicines Agency promote the implementation of the concept of quality by design (QbD) in pharmaceutical development. Its goal is for processes to be designed in such a way that the desired product performance is robustly achieved in a controlled fashion. It requires increased process understanding and the thorough characterization of the relationship between critical process parameters and critical quality attributes of the product. The goal of this thesis is to advance the process development of ADCs in the direction of more efficient, systematic, and knowledge-based approaches. As a strategy for realizing this objective, the establishment of high-throughput, analytical, and digital tools for ADC processes was investigated. High-throughput tools, especially in combination with design of experiments (DoE), can lead to a strong increase in efficiency regarding time as well as material consumption. In order to prevent an analytical bottleneck, high-throughput-compatible analytics are crucial. Analytical techniques for the on-line monitoring of processes are also of great benefit.
They are the basis for implementing process analytical technology (PAT) tools, which offer the opportunity for real-time monitoring and control of product quality attributes. Digital tools, such as methods for the mechanistic modeling and simulation of processes, offer many advantages for process development. Apart from granting a deeper understanding of the process fundamentals, mechanistic models can be efficient tools for process optimization and for characterization of the design space. The methods for ADC process development applied or developed in this work did not rely on the highly toxic drugs used in ADCs. Instead, nontoxic surrogate drug molecules were employed, similar to commonly used cytotoxic ADC drugs in relevant properties like size and hydrophobicity. The applied combination of cysteine-engineered mAb and maleimide conjugation chemistry is a strategy for site-specific conjugation with high relevance for ADC development. In the first part of this thesis, a high-throughput process development platform for site-specific conjugation processes was developed. The multi-step process of making ADCs from cysteine-engineered mAbs was successfully transferred to a robotic liquid handling station. This included a high-throughput buffer exchange step using cation-exchange batch adsorption and the subsequent automated protein quantification with process feedback. As high-throughput-compatible analytics, a reversed-phase ultra-high performance liquid chromatography (RP-UHPLC) method without sample preparation was developed, focusing on a short runtime for high efficiency. The final platform was used in a conjugation DoE, showing the capacity of the method for efficient process characterization. Finally, the comparability of the high-throughput results with experiments at a larger scale was demonstrated. The second part describes the establishment of an on-line monitoring approach for ADC conjugation reactions using UV/Vis spectroscopy.
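
As an illustration of how such a conjugation DoE can be laid out, the following builds a full-factorial design with a center point. The factor names and level ranges are hypothetical, since the abstract does not state which factors were actually studied.

```python
from itertools import product

# Hypothetical conjugation factors with low/high levels (illustrative only).
factors = {
    "pH": (6.5, 7.5),
    "drug_excess": (2.0, 6.0),       # molar equivalents of drug per mAb
    "temperature_C": (15.0, 25.0),
}

# Full-factorial 2^3 design: every combination of low/high levels.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Add a center point to allow a check for curvature in the response.
runs.append({name: (lo + hi) / 2 for name, (lo, hi) in factors.items()})

for run in runs:
    print(run)   # 2^3 + 1 = 9 experiments for the liquid handling station
```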
First, a spectral change caused by the conjugation of the maleimide-functionalized surrogate drug to the thiol group of the engineered cysteines was detected. Spectra were recorded during the reaction in two setups with different detectors. Subsequently, the spectral change was correlated to off-line concentration data measured by RP-UHPLC using partial least-squares (PLS) regression. The calibrated PLS models enabled the prediction of the amount of conjugated drug directly from UV/Vis spectra. Both external validation data sets and cross-validation were used for model validation. The successful prediction of the reaction progress was shown with two different surrogate drugs in both setups. After covering high-throughput tools, analytics, and process monitoring in the first and second parts, the third part focuses on applying mechanistic understanding to conjugation process development. In this section, a kinetic reaction model for the conjugation of ADCs was established, and the application of the mechanistic model to process development was investigated. Before model calibration, six model structures were set up based on different assumptions regarding the binding to the two available cysteines. All six models were fit to a calibration data set, and the best model was selected using cross-validation. The results suggest that the attachment of a first drug to the mAb influences the attachment to the second binding site. An external data set including data outside the calibration range was used for the successful validation of the model. The validated model was then applied to an in silico screening and optimization of the conjugation process, enabling the selection of conditions with efficient drug use and a high yield of the target component. Additional process understanding was generated by showing a positive effect of different salts on the reaction rate.
Finally, a combination of the kinetic model with the monitoring approach of the second part was investigated. While the previous parts are primarily concerned with the conjugation reaction itself, the fourth part deals with the subsequent purification of the ADCs. A mechanistic model was established for the separation of ADC species with different DAR using hydrophobic interaction chromatography (HIC). This separation also allows the target DAR to be set post-conjugation. For modeling the transport of solutes through the column and the adsorption equilibrium, the transport-dispersive model and a suitable adsorption isotherm were applied. First, a detailed characterization of the chromatography system and column was conducted, which served to calculate a number of model parameters. The remaining model parameters were determined by parameter estimation using numerical simulations. For the calibration, nine experiments with different linear and step gradients were run with varying load compositions. Peak positions as well as peak shapes were accurately described by the model for all components. Applying the final model to process optimization yielded step gradients with improved yield, DAR, and concentration in the pool. The successful prediction of yield and DAR in the pool of the optimized gradients was validated with external data. In a first in silico study, model-based process control was used to react to variations in the preceding unit operation, ensuring the robust achievement of a critical quality attribute, the target DAR. A second in silico study shows that linking the HIC model with the kinetic reaction model developed in the third part of this thesis can be profitably applied to process development. This ‘digital twin’ widens the system boundaries over two adjacent unit operations, which could enable the establishment of a flexible design space over more than one process step.
In conclusion, the present thesis helps to shape an ADC process development of the future that is able to cope with the challenges of a transforming biopharmaceutical industry. The whole process, from the preparation of the conjugation sites, through the conjugation reaction, to the purification of the conjugates, was covered. Efficient characterization of the design space was demonstrated by incorporating tools like high-throughput experimentation combined with DoE, as well as mechanistic modeling techniques. The implementation of QbD relies on the establishment of suitable tools for acquiring enhanced process knowledge and for process monitoring and control. To this end, a PAT method for conjugation monitoring based on multivariate data analysis, and mechanistic models for conjugation and purification, were developed. The presented studies showcase the realization of new ideas for exploiting the potential of digital tools for the specific challenges of ADC process development.

    A Data Mining Methodology for Vehicle Crashworthiness Design

    This study develops a systematic design methodology based on data mining theory for decision-making in the development of crashworthy vehicles. The new data mining methodology allows the exploration of a large crash simulation dataset to discover the underlying relationships among vehicle crash responses and design variables at multiple levels, and to derive design rules based on whole-vehicle safety requirements in order to make decisions about component-level and subcomponent-level design. The method addresses a major issue with existing design approaches for vehicle crashworthiness: their limited ability to extract information from large datasets, which may hamper decision-making in the design process. At the component level, two structural design approaches were implemented for detailed component design with the data mining method: a dimension-based approach and a node-based approach, handling structures with regular and irregular shapes, respectively. These two approaches were used to design a thin-walled vehicular structure, the S-shaped beam, against crash loading. A large number of design alternatives were created, and their responses under loading were evaluated by finite element simulations. The design variables and computed responses formed a large design dataset. This dataset was then mined to build a decision tree. Based on the decision tree, the interrelationships among the design parameters were revealed, and design rules were generated to produce a set of good designs. After the data mining, the critical design parameters were identified and the design space was reduced, which can simplify the design process. To partially replace the expensive finite element simulations, a surrogate model was used to model the relationships between design variables and responses. Four machine learning algorithms suitable for surrogate model development were compared.
Based on the results, Gaussian process regression was determined to be the most suitable technique in the present scenario, and an optimization process was developed to tune the algorithm’s hyperparameters, which govern the model structure and training process. To account for engineering uncertainty in the data mining method, a new decision tree for uncertain data was proposed based on the joint probability in uncertain spaces, and it was implemented to again design the S-beam structure. The findings show that the new decision tree can produce effective decision-making rules for engineering design under uncertainty. To evaluate the new approaches developed in this work, a comprehensive case study was conducted by designing a vehicle system against frontal crash. A publicly available vehicle model was simplified and validated. Using the newly developed approaches, new component designs for this vehicle were generated and integrated back into the vehicle model so that their crash behavior could be simulated. Based on the simulation results, the designs obtained with the new method outperform the original design in terms of mass, intrusion, and peak acceleration, confirming the performance of the new design methodology. The current study demonstrates that the new data mining method can be used in vehicle crashworthiness design and has the potential to be applied to other complex engineering systems with large amounts of design data.
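
A toy version of the decision-tree rule mining described above, using scikit-learn. The design variables (wall thickness and beam width), the "absorbed energy" function, and the pass/fail requirement are hypothetical stand-ins for the study's finite element dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)

# Hypothetical design dataset for an S-shaped beam: two design variables.
thickness = rng.uniform(1.0, 3.0, 500)   # wall thickness in mm
width = rng.uniform(20.0, 60.0, 500)     # beam width in mm
X = np.column_stack([thickness, width])

# Synthetic crash response: a design passes if the (toy) absorbed energy
# exceeds a requirement; in the study this comes from FE simulations.
energy = 2.0 * thickness + 0.05 * width + rng.normal(0, 0.1, 500)
y = (energy >= 6.0).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Human-readable design rules mined from the dataset.
rules = export_text(tree, feature_names=["thickness_mm", "width_mm"])
print(rules)
```

The printed tree directly yields if-then design rules (e.g. thresholds on thickness) of the kind used to reduce the design space.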

    Maximizing mRNA vaccine production with Bayesian optimization

    Messenger RNA (mRNA) vaccines are a new alternative to conventional vaccines with a prominent role in infectious disease control. These vaccines are produced in in vitro transcription (IVT) reactions, catalyzed by RNA polymerase in cascade reactions. To ensure an efficient and cost-effective manufacturing process, essential for large-scale production and an effective vaccine supply chain, the IVT reaction needs to be optimized. IVT is a complex reaction with a large number of variables that can affect its outcome. Traditional optimization relies on classic design of experiments methods, which are time-consuming and can suffer from human bias or rely on simplified assumptions. In this contribution, we propose the use of machine learning approaches to perform a data-driven optimization of an mRNA IVT reaction. A Bayesian optimization method and model interpretability techniques were used to automate experiment design, providing a feedback loop. Within 60 optimization runs, IVT reaction conditions were found that produced 12 g·L−1 of mRNA in only 2 h. The results outperform published industry standards and data reported in the literature in terms of both achievable reaction yield and reduction of production time. Furthermore, this shows the potential of Bayesian optimization as a cost-effective optimization tool in (bio)chemical applications.
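
A minimal Bayesian optimization loop in the spirit of this approach. The yield-vs.-temperature function is hypothetical (the actual IVT reaction has many more variables), and a Gaussian process with an expected improvement acquisition function stands in for the paper's specific method.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

def ivt_yield(temp):
    # Hypothetical mRNA yield (g/L) vs. reaction temperature; a stand-in
    # for actually running an IVT experiment at these conditions.
    return 12.0 * np.exp(-((temp - 37.0) ** 2) / 20.0)

grid = np.linspace(25.0, 50.0, 251).reshape(-1, 1)  # candidate conditions
X = rng.uniform(25.0, 50.0, size=(3, 1))            # initial experiments
y = ivt_yield(X).ravel()

for _ in range(10):                                 # optimization runs
    gpr = GaussianProcessRegressor(kernel=RBF(5.0), normalize_y=True)
    gpr.fit(X, y)
    mu, sigma = gpr.predict(grid, return_std=True)
    # Expected improvement over the best yield observed so far.
    best = y.max()
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    # Run the next "experiment" where expected improvement is largest.
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, ivt_yield(x_next[0]))

best_temp = X[np.argmax(y), 0]
```

Each iteration closes the feedback loop the abstract describes: the surrogate model proposes the next experiment, and its result refines the model.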