A Review of Bayesian Methods in Electronic Design Automation
The utilization of Bayesian methods has been widely acknowledged as a viable solution for tackling various challenges in electronic integrated circuit (IC) design under stochastic process variation, including circuit performance modeling, yield/failure rate estimation, and circuit optimization. As the post-Moore era brings about new technologies (such as silicon photonics and quantum circuits), many of the associated issues are similar to those encountered in electronic IC design and can be addressed using Bayesian methods. Motivated by this observation, we present a comprehensive review of Bayesian methods in electronic design automation (EDA). By doing so, we hope to equip researchers and designers with the ability to apply Bayesian methods in solving stochastic problems in electronic circuits and beyond.
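As a concrete illustration of the Bayesian treatment of yield/failure-rate estimation mentioned above, the following minimal sketch (not taken from the review; the prior hyper-parameters and pass/fail counts are made up for illustration) updates a conjugate Beta prior on a circuit failure rate with Monte Carlo pass/fail outcomes:

```python
# Minimal sketch: Bayesian estimation of a circuit failure rate from
# Monte Carlo pass/fail data via a conjugate Beta prior. The prior
# hyper-parameters and sample counts below are illustrative assumptions.
from scipy import stats

alpha0, beta0 = 1.0, 99.0      # assumed prior with mean failure rate ~1%
n_samples, n_fail = 5000, 12   # hypothetical Monte Carlo outcome

# Conjugate update: posterior is Beta(alpha0 + failures, beta0 + passes).
post = stats.beta(alpha0 + n_fail, beta0 + n_samples - n_fail)

print(f"posterior mean failure rate: {post.mean():.3e}")
print(f"95% credible interval: [{post.ppf(0.025):.3e}, {post.ppf(0.975):.3e}]")
```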
Statistical Rare Event Analysis with Elite Sample Selection Scheme
Accurately estimating the failure region of rare events for memory-cell and analog circuit blocks under process variations is a challenging task. As the first part of the thesis, the author proposes a new statistical method, called EliteScope, to estimate circuit failure rates in rare-event regions and to provide conditions on parameters to achieve targeted performance. The new method is based on the iterative blockade framework to reduce the number of samples, but it adds two new techniques to improve existing methods. First, the new approach employs an elite-learning sample selection scheme, which considers the effectiveness of samples and good coverage of the parameter space. As a result, it reduces additional simulation cost by pruning less effective samples while preserving the accuracy of the failure estimation. Second, EliteScope identifies the failure regions in terms of parameter spaces to provide design guidance for meeting the performance target. It applies variance-based feature selection to find the dominant parameters and then determines the in-spec boundaries of those parameters. We demonstrate the advantage of the proposed method using several memory and analog circuits with different numbers of process parameters. Experiments on four circuit examples show that EliteScope achieves a significant improvement in failure-region estimation, in terms of accuracy and simulation cost, over traditional approaches. The 16-bit 6T-SRAM column example also demonstrates that the new method scales to large problems with a large number of process variables.
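The variance-based feature selection step described in the abstract can be illustrated with a small sketch (not the thesis code; the toy performance model, sample counts and binned estimator are assumptions): rank the process parameters by the first-order share of output variance they explain and keep the dominant ones.

```python
# Minimal sketch of variance-based ranking of process parameters, as a
# stand-in for the feature-selection step above. The toy performance model
# and the binned estimator are illustrative assumptions, not the thesis code.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20_000, 6
x = rng.standard_normal((n, d))            # normalized process parameters

# Hypothetical performance metric dominated by parameters 0 and 2.
y = 1.5 * x[:, 0] - 0.8 * x[:, 2] + 0.1 * x[:, 4] + 0.05 * rng.standard_normal(n)

def first_order_variance(xi, y, bins=30):
    """Variance of the conditional mean of y given xi, estimated by binning."""
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi) - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    return np.average((cond_means - y.mean()) ** 2, weights=counts)

scores = np.array([first_order_variance(x[:, j], y) for j in range(d)]) / y.var()
print("share of variance explained per parameter:", np.round(scores, 3))
print("parameters ranked by dominance:", np.argsort(scores)[::-1])
```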
Rare Event Probability Learning by Normalizing Flows
A rare event is defined by a low probability of occurrence. Accurate estimation of such small probabilities is of utmost importance across diverse domains. Conventional Monte Carlo methods are inefficient, demanding an exorbitant number of samples to achieve reliable estimates. Inspired by the exact sampling capabilities of normalizing flows, we revisit this challenge and propose normalizing-flow-assisted importance sampling, termed NOFIS. NOFIS first learns a sequence of proposal distributions associated with predefined nested subset events by minimizing KL divergence losses. Next, it estimates the rare event probability by utilizing importance sampling in conjunction with the last proposal. The efficacy of our NOFIS method is substantiated through comprehensive qualitative visualizations, affirming the optimality of the learned proposal distribution, as well as a series of quantitative experiments encompassing distinct test cases, which highlight NOFIS's superiority over baseline approaches.
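The importance-sampling estimator that NOFIS builds on can be illustrated with a toy example. In the sketch below, the learned normalizing-flow proposal is replaced by a hand-picked shifted Gaussian, and the rare event is P[X > 4] for a standard normal X; both choices are assumptions for illustration, not part of the paper.

```python
# Minimal sketch of the importance-sampling estimator underlying NOFIS.
# A shifted Gaussian stands in for the learned normalizing-flow proposal;
# the rare event P[X > 4], X ~ N(0, 1), is a toy target with a known answer.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
threshold, n = 4.0, 100_000

p = stats.norm(0.0, 1.0)                   # nominal distribution
q = stats.norm(loc=threshold, scale=1.0)   # proposal centered on the rare region

x = q.rvs(size=n, random_state=rng)
weights = np.exp(p.logpdf(x) - q.logpdf(x))        # p(x) / q(x)
estimate = np.mean((x > threshold) * weights)

print(f"importance-sampling estimate: {estimate:.3e}")
print(f"exact tail probability      : {p.sf(threshold):.3e}")
```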
Robust and Efficient Uncertainty Quantification and Validation of RFIC Isolation
Modern communication and identification products impose demanding constraints on the reliability of components. As a result, statistical constraints increasingly enter the optimization formulations of electronic products. Yield constraints often require efficient sampling techniques to obtain uncertainty quantification also at the tails of the distributions. These sampling techniques should outperform standard Monte Carlo techniques, since the latter are normally not efficient enough to deal with tail probabilities. One such technique, Importance Sampling, has successfully been applied to optimize Static Random Access Memories (SRAMs) while guaranteeing very small failure probabilities, even going beyond 6-sigma variations of the parameters involved. Apart from this, emerging uncertainty quantification techniques offer expansions of the solution that serve as a response surface when doing statistics and optimization. To efficiently derive the coefficients in the expansions, one either has to solve a large number of problems or one huge combined problem. Here, parameterized Model Order Reduction (MOR) techniques can be used to reduce the workload. To also reduce the number of parameters, we identify those that affect the variance only in a minor way; these parameters can simply be set to a fixed value. The remaining parameters can be viewed as dominant. Preserving the variation also allows us to make statements about the approximation accuracy obtained by the parameter-reduced problem. This is illustrated on an RLC circuit. Additionally, the MOR technique used should not affect the variance significantly. Finally, we consider a methodology for reliable RFIC isolation using floor-plan modeling and isolation grounding. Simulations show good agreement with measurements.
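As a small illustration of the expansion-based response surface mentioned above, the sketch below fits a probabilists'-Hermite (polynomial-chaos-style) expansion of a toy response in a single normal parameter and reads the mean and variance off the coefficients; the toy response, expansion order and sample sizes are assumptions, not the paper's setup.

```python
# Minimal sketch of an expansion-based response surface: a least-squares
# probabilists'-Hermite fit of a toy response in one standard-normal
# parameter, with mean/variance recovered from the coefficients.
# The response function and expansion order are illustrative assumptions.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

rng = np.random.default_rng(1)

def response(x):                        # hypothetical circuit response
    return np.exp(0.3 * x) + 0.1 * x ** 2

order = 4
x_train = rng.standard_normal(500)      # samples of the normalized parameter
coeffs = He.hermefit(x_train, response(x_train), order)

# He_k are orthogonal w.r.t. the N(0,1) weight with E[He_k^2] = k!,
# so mean ~ c_0 and variance ~ sum_k k! * c_k^2 for k >= 1.
mean_pce = coeffs[0]
var_pce = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))

x_mc = rng.standard_normal(200_000)     # Monte Carlo reference
print(f"surrogate mean/var  : {mean_pce:.4f} / {var_pce:.4f}")
print(f"Monte Carlo mean/var: {response(x_mc).mean():.4f} / {response(x_mc).var():.4f}")
```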
Circuit Design
Circuit Design = Science + Art! Designers need a skilled "gut feeling" about circuits and related analytical techniques, plus creativity, to solve all problems and to adhere to the specifications, the written and the unwritten ones. You must anticipate a large number of influences, like temperature effects, supply voltage changes, offset voltages, layout parasitics, and numerous kinds of technology variations, to end up with a circuit that works. This is challenging for analog, custom-digital, mixed-signal or RF circuits, and researching new design methods in relevant journals, conference proceedings and design tools unfortunately often gives the impression that just a "wild bunch" of "advanced techniques" exists. On the other hand, state-of-the-art tools nowadays offer a good cockpit to steer the design flow, including clever statistical methods and optimization techniques. Actually, this almost presents a second breakthrough, like the introduction of circuit simulators 40 years ago! Users can now conveniently analyse all the problems (discover, quantify, verify), and even exploit them, for example for optimization purposes. Most designers are caught up in everyday problems, so we fit that "wild bunch" into a systematic approach for variation-aware design, a designer's field guide and more. That is where this book can help! Circuit Design: Anticipate, Analyze, Exploit Variations starts with best-practice manual methods and links them tightly to up-to-date automation algorithms. We provide many tractable examples and explain key techniques you have to know. We then enable you to select and set up suitable methods for each design task, knowing their prerequisites, advantages and, as too often overlooked, their limitations as well. The good thing about computers is that you can often verify amazing things yourself with little effort, and you can use software not only to your direct advantage in solving a specific problem, but also to become a better skilled, more experienced engineer. Unfortunately, EDA design environments are not well suited to learning about advanced numerics, so with this book we also provide two apps for learning about statistics and optimization directly with circuit-related examples, in real time and without long simulation times. This helps to develop a healthy statistical gut feeling for circuit design. The book is written for engineers, students in engineering and CAD/methodology experts. Readers should have some background in standard design techniques, like entering a design in schematic capture and simulating it, and also know about major technology aspects.
Adaptive Planning Search Algorithm for Analog Circuit Verification
Integrated circuit verification has gathered considerable interest in recent times. Since these circuits keep growing in complexity year after year, pre-silicon (pre-SI) verification becomes ever more important in order to ensure proper functionality. Thus, to reduce the time needed for manually verifying ICs, we propose a machine learning (ML) approach that requires fewer simulations. This method relies on an initial evaluation set of operating condition configurations (OCCs) used to train Gaussian process (GP) surrogate models. Using the surrogate models, we can propose further, more difficult OCCs. Repeating this procedure over several iterations yields better GP estimates of the circuit's responses, on both synthetic and real circuits, and thus a better chance of finding the worst case, or even failures, for certain circuit responses. We show that the proposed approach is able to provide OCCs closer to the specifications for all circuits and to identify a failure (specification violation) for one of the responses of a real circuit.
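A minimal sketch of the iterative GP-surrogate loop described above is given below; the one-dimensional "OCC" space, the toy circuit response and the lower-confidence-bound rule for picking the next OCC are assumptions for illustration, not necessarily the paper's exact acquisition criterion.

```python
# Minimal sketch of an adaptive GP-surrogate search for worst-case operating
# condition configurations (OCCs). The 1-D OCC space, toy response, and
# lower-confidence-bound acquisition are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulate(occ):                      # hypothetical circuit response (margin)
    return np.sin(3.0 * occ) + 0.5 * occ

grid = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)
x = rng.uniform(-2.0, 2.0, size=(5, 1))             # initial evaluation set
y = simulate(x).ravel()

for _ in range(10):
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(x, y)
    mu, sigma = gp.predict(grid, return_std=True)
    # Propose the OCC with the smallest predicted lower bound, i.e. the
    # candidate most likely to approach or violate the specification.
    nxt = grid[np.argmin(mu - 2.0 * sigma)].reshape(1, 1)
    x = np.vstack([x, nxt])
    y = np.append(y, simulate(nxt).ravel())

print(f"worst-case OCC found: {x[np.argmin(y)][0]:.3f}, response: {y.min():.3f}")
```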
Statistical Performance Modeling of SRAMs
Yield analysis is a critical step in memory design, considering a variety of performance constraints. Traditional circuit-level Monte Carlo simulation for yield estimation of a Static Random Access Memory (SRAM) cell is quite time-consuming due to its characteristically low failure rate, whereas statistical yield sensitivity analysis is attractive for its high efficiency.
This thesis proposes a novel statistical model to conduct yield sensitivity prediction on SRAM cells at the simulation level, which outperforms regular circuit simulation with a significant runtime speedup. Based on the theory of the Kriging method that is widely used in geostatistics, we develop a series of statistical model building and updating strategies to obtain satisfactory accuracy and efficiency in SRAM yield sensitivity analysis.
Generally, this model applies to yield and sensitivity evaluation with varying design parameters, under the constraints of most SRAM performance metrics. Moreover, it is potentially suitable for any designated distribution of the process variation, regardless of the sampling method.
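The Kriging interpolation idea underlying the proposed model can be sketched in a few lines; the squared-exponential correlation, length scale and toy yield curve below are assumptions, not the thesis model.

```python
# Minimal sketch of simple Kriging: interpolate a toy "yield vs. design
# parameter" curve from a handful of expensive evaluations. The correlation
# function, length scale and toy yield curve are illustrative assumptions.
import numpy as np

def corr(a, b, length=0.4):
    """Squared-exponential correlation matrix between point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def toy_yield(w):                        # hypothetical yield vs. device width
    return 1.0 / (1.0 + np.exp(-8.0 * (w - 0.5)))

x_train = np.linspace(0.0, 1.0, 8)       # a few "simulated" design points
y_train = toy_yield(x_train)
x_test = np.linspace(0.0, 1.0, 101)

# Kriging predictor: mean + k(x*, X) K^{-1} (y - mean), with a small jitter
# on the diagonal for numerical stability.
mean = y_train.mean()
K = corr(x_train, x_train) + 1e-10 * np.eye(len(x_train))
weights = np.linalg.solve(K, y_train - mean)
y_pred = mean + corr(x_test, x_train) @ weights

print(f"max interpolation error on the toy yield curve: "
      f"{np.max(np.abs(y_pred - toy_yield(x_test))):.4f}")
```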
Statistical library characterization using belief propagation across multiple technology nodes
In this paper, we propose a novel flow to enable computationally efficient statistical characterization of delay and slew in standard cell libraries. The distinguishing feature of the proposed method is the usage of a limited combination of output capacitance, input slew rate and supply voltage for the extraction of statistical timing metrics of an individual logic gate. The efficiency of the proposed flow stems from the introduction of a novel, ultra-compact, nonlinear, analytical timing model having only four universal regression parameters. This novel model facilitates the use of maximum-a-posteriori belief propagation to learn the prior distribution of the parameters for the target technology from past characterizations of library cells belonging to various other technologies, including older ones. The framework then utilizes Bayesian inference to extract the new timing model parameters using an ultra-small set of additional timing measurements from the target technology. The proposed method is validated and benchmarked on several production-level cell libraries, including a state-of-the-art 14-nm technology node and a variation-aware, compact transistor model. For the same accuracy as the conventional lookup-table approach, the new method achieves at least a 15x reduction in simulation runs.
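The maximum-a-posteriori flavor of this flow can be illustrated with a small sketch: treat parameter statistics from older technology nodes as a Gaussian prior and fit a tiny analytical delay model to an ultra-small set of new measurements. The two-parameter linear model, prior values and data below are assumptions; the paper's four-parameter model and belief-propagation machinery are not reproduced here.

```python
# Minimal sketch of MAP fitting with an informative prior learned from
# older technology nodes. The two-parameter delay model, prior statistics,
# and "measurements" are illustrative assumptions, not the paper's model.
import numpy as np
from scipy.optimize import minimize

# Hypothetical delay model: delay = a * C_load + b
c_load = np.array([1.0, 2.0, 4.0, 8.0])          # fF, ultra-small measurement set
delay  = np.array([12.1, 18.9, 32.5, 60.2])      # ps, on the target technology

prior_mean = np.array([7.0, 5.0])                # from older-node characterizations
prior_std  = np.array([2.0, 3.0])
noise_std  = 1.0                                 # assumed measurement noise (ps)

def neg_log_posterior(theta):
    resid = delay - (theta[0] * c_load + theta[1])
    log_lik = np.sum(resid ** 2) / (2.0 * noise_std ** 2)
    log_prior = np.sum(((theta - prior_mean) / prior_std) ** 2) / 2.0
    return log_lik + log_prior

a_map, b_map = minimize(neg_log_posterior, x0=prior_mean).x
print(f"MAP estimate: a = {a_map:.2f} ps/fF, b = {b_map:.2f} ps")
```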