102 research outputs found
A Review of Bayesian Methods in Electronic Design Automation
The utilization of Bayesian methods has been widely acknowledged as a viable
solution for tackling various challenges in electronic integrated circuit (IC)
design under stochastic process variation, including circuit performance
modeling, yield/failure rate estimation, and circuit optimization. As the
post-Moore era brings about new technologies (such as silicon photonics and
quantum circuits), many of the associated issues are similar to those
encountered in electronic IC design and can be addressed using Bayesian
methods. Motivated by this observation, we present a comprehensive review of
Bayesian methods in electronic design automation (EDA). By doing so, we hope to
equip researchers and designers with the ability to apply Bayesian methods in
solving stochastic problems in electronic circuits and beyond.
Comment: 24 pages, a draft version. We welcome comments and feedback, which can be sent to [email protected]
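One of the stochastic problems the review names, yield/failure-rate estimation under process variation, can be illustrated with a minimal Bayesian sketch: a Beta-Binomial conjugate update of a failure-rate estimate from Monte Carlo runs. The function names and numbers below are illustrative assumptions, not taken from the paper.

```python
# Minimal Bayesian failure-rate estimate for a circuit under process
# variation: Beta-Binomial conjugate update (illustrative numbers only).

def beta_posterior(failures, trials, a0=1.0, b0=1.0):
    """Update a Beta(a0, b0) prior on the failure rate with
    Monte Carlo results: `failures` fails out of `trials` runs."""
    return a0 + failures, b0 + (trials - failures)

def beta_mean(a, b):
    return a / (a + b)

# Suppose 3 failing samples are observed in 1000 Monte Carlo runs.
a, b = beta_posterior(failures=3, trials=1000)
print(beta_mean(a, b))  # posterior mean failure rate ~ 0.004
```

The posterior also yields credible intervals on the failure rate for free, which is one reason Bayesian treatments are attractive when each Monte Carlo sample (a circuit simulation) is expensive.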
Robust Decision Trees Against Adversarial Examples
Although adversarial examples and model robustness have been extensively
studied in the context of linear models and neural networks, research on this
issue in tree-based models and how to make tree-based models robust against
adversarial examples is still limited. In this paper, we show that tree-based
models are also vulnerable to adversarial examples and develop a novel
algorithm to learn robust trees. At its core, our method aims to optimize the
performance under the worst-case perturbation of input features, which leads to
a max-min saddle point problem. Incorporating this saddle point objective into
the decision tree building procedure is non-trivial due to the discrete nature
of trees: a naive approach to finding the best split according to this
saddle point objective will take exponential time. To make our approach
practical and scalable, we propose efficient tree building algorithms by
approximating the inner minimizer in this saddle point problem, and present
efficient implementations for classical information gain based trees as well as
state-of-the-art tree boosting models such as XGBoost. Experimental results on
real world datasets demonstrate that the proposed algorithms can substantially
improve the robustness of tree-based models against adversarial examples.
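The worst-case inner minimization described above can be sketched for a single decision stump: under an L-infinity budget eps, any point within eps of the split threshold can be pushed to either side, so the worst case counts it as misclassified. This is a deliberate simplification of the paper's approximation, not its exact algorithm, and all names and data below are illustrative.

```python
# Hedged sketch of the inner minimizer for one robust stump split:
# points within eps of the threshold can be moved across it by the
# adversary, so the worst case treats them as errors.

def robust_stump_errors(xs, ys, t, eps):
    """Worst-case 0/1 errors of the stump 'predict 1 if x > t'
    under per-point perturbation |delta| <= eps."""
    errors = 0
    for x, y in zip(xs, ys):
        if abs(x - t) <= eps:
            errors += 1            # adversary moves x across t
        else:
            pred = 1 if x > t else 0
            errors += int(pred != y)
    return errors

xs = [0.1, 0.2, 0.45, 0.55, 0.8, 0.9]
ys = [0,   0,   0,    1,    1,   1]
print(robust_stump_errors(xs, ys, t=0.5, eps=0.0))   # clean errors: 0
print(robust_stump_errors(xs, ys, t=0.5, eps=0.1))   # worst case: 2
```

Scoring every candidate threshold with such a worst-case objective, instead of clean information gain, is what makes the tree-building saddle point problem nontrivial.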
Characterization and Modeling of Chemical-Mechanical Polishing for Polysilicon Microstructures
Long the dominant method of wafer planarization in the integrated circuit (IC) industry, chemical-mechanical polishing (CMP) is starting to play an important role in microelectromechanical systems (MEMS). We present an experiment to characterize a polysilicon CMP process with the specific goal of examining MEMS-sized test structures. We utilize previously discussed models and examine whether the same assumptions from IC CMP can be made for MEMS CMP. We find that CMP at the MEMS scale is not just pattern-density dependent, but also partly dependent on feature size. Also, we find that new layout designs relevant to MEMS can negatively impact how well existing CMP models simulate polishing, motivating the need for further model development.
Singapore-MIT Alliance (SMA)
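The pattern-density dependence this abstract refers to is usually captured by the standard density-based CMP model, in which the up-area removal rate scales as the blanket rate divided by the local pattern density. The sketch below is a minimal illustration of that model under simplifying assumptions (contact stays on the up areas for the whole polish); the constants are illustrative, not fitted to the paper's data.

```python
# Hedged sketch of the pattern-density CMP model: before planarization,
# the pad rides on raised features, so the up-area removal rate scales
# as (blanket rate) / (local density rho). Illustrative constants only.

def up_area_thickness(t, blanket_rate, rho, z0):
    """Remaining up-area film thickness after polish time t
    (simplified: contact stays on up areas throughout)."""
    return z0 - (blanket_rate / rho) * t

# A 50%-dense region polishes twice as fast as a fully dense one:
print(up_area_thickness(t=10.0, blanket_rate=1.0, rho=0.5, z0=100.0))  # 80.0
print(up_area_thickness(t=10.0, blanket_rate=1.0, rho=1.0, z0=100.0))  # 90.0
```

The paper's finding is that at MEMS scales this density-only picture is incomplete: a feature-size term would also be needed, which such a model does not include.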
Semiconductor process design : representations, tools, and methodologies
Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1991. Vita. Includes bibliographical references (p. 267-278). By Duane S. Boning. Ph.D.
Multi-strata subsurface laser die singulation to enable defect-free ultra-thin stacked memory dies
We report the extension of multi-strata subsurface infrared (1.342 μm) pulsed laser die singulation to the fabrication of defect-free ultra-thin stacked memory dies. We exploit the multi-strata interactions between the generated thermal shockwaves and the preceding high-dislocation-density layers to initiate crack fractures that separate the individual dies from within the die interior. We show that optimized inter-strata distances between the high-dislocation-density layers, together with an effective laser energy dose, can compensate for wafers with high backside reflectance (up to ∼82%). This work demonstrates defect-free eight-die stacks of 25 μm thick mechanically functional and 46 μm thick electrically functional memory dies.
Sandisk SemiConductor Shanghai Co. Ltd.; Leaders for Global Operations Program; Noyce Foundation (Robert N. Noyce full scholarship)
A 12b 250 MS/s Pipelined ADC With Virtual Ground Reference Buffers
The virtual ground reference buffer (VGRB) technique is introduced as a means to improve the performance of switched-capacitor circuits. The technique enhances performance by improving the feedback factor of the op-amp without affecting the signal gain. The bootstrapping action of the level-shifting buffers relaxes key op-amp requirements, including unity-gain bandwidth, noise, open-loop gain, and offset, compared with conventional circuits. This reduces the design complexity and power consumption of op-amp-based circuits. Based on this technique, a 12 b pipelined ADC implemented in 65 nm CMOS achieves 67.0 dB SNDR at 250 MS/s and consumes 49.7 mW from a 1.2 V power supply.
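Why the feedback factor matters can be seen with a quick numeric sketch. In a conventional switched-capacitor stage the factor is beta = Cf / (Cf + Cs + Cp), and the closed-loop bandwidth is roughly beta times the op-amp's gain-bandwidth product; if buffers drive the sampling capacitors, the capacitance the op-amp must feed back around shrinks and beta rises. The capacitor values, GBW, and the simplified buffered-beta expression below are illustrative assumptions, not figures from the paper.

```python
# Hedged numeric sketch of feedback factor in a switched-capacitor
# stage (illustrative capacitor values; simplified model).

def feedback_factor(cf, cs, cp):
    """beta = Cf / (Cf + Cs + Cp) for a conventional stage."""
    return cf / (cf + cs + cp)

gbw = 1e9  # assumed op-amp gain-bandwidth product, Hz

beta_conv = feedback_factor(cf=1.0, cs=1.0, cp=0.5)  # conventional, 0.4
# If buffers drive the sampling capacitor, the op-amp's effective
# beta approaches Cf / (Cf + Cp) in this simplified picture:
beta_buf = feedback_factor(cf=1.0, cs=0.0, cp=0.5)   # ~0.67
print(beta_conv * gbw, beta_buf * gbw)  # approximate loop bandwidths
```

A higher beta buys bandwidth, lowers input-referred noise and gain-error contributions, and thus relaxes the op-amp, which is the mechanism the abstract describes.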
A 12b 50MS/s 2.1mW SAR ADC with redundancy and digital background calibration
A 12-bit 50 MS/s SAR ADC implemented in 65 nm CMOS technology is presented. The design employs redundancy to relax the DAC settling requirement and to provide sufficient room for errors, such that the static nonlinearity caused by capacitor mismatches can be digitally removed. The redundancy is incorporated into the design using a tri-level switching scheme and our modified split-capacitor array to achieve the highest switching efficiency while still preserving symmetry in error tolerance. A new code-density-based digital background calibration algorithm that requires no special calibration signals or additional analog hardware is also developed. The calibration uses the input signal itself as the stimulus, and its effectiveness is verified both in simulation and through measured data. The prototype achieves 67.4 dB SNDR at 50 MS/s while dissipating 2.1 mW from a 1.2 V supply, leading to a FoM of 21.9 fJ/conv.-step at the Nyquist frequency.
MIT Masdar Program
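The reported figure of merit can be sanity-checked from the stated measurements using the standard Walden definitions, FoM = P / (2^ENOB · fs) with ENOB = (SNDR − 1.76)/6.02. This is a generic check, not code from the paper.

```python
# Sanity-check the reported Walden FoM from the stated measurements.
import math

def walden_fom(power_w, sndr_db, fs_hz):
    """FoM in J/conversion-step from power, SNDR (dB), and sample rate."""
    enob = (sndr_db - 1.76) / 6.02
    return power_w / (2**enob * fs_hz)

fom = walden_fom(power_w=2.1e-3, sndr_db=67.4, fs_hz=50e6)
print(fom * 1e15)  # ~21.9 fJ/conversion-step, matching the abstract
```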
Rare Event Probability Learning by Normalizing Flows
A rare event is defined by a low probability of occurrence. Accurate
estimation of such small probabilities is of utmost importance across diverse
domains. Conventional Monte Carlo methods are inefficient, demanding an
exorbitant number of samples to achieve reliable estimates. Inspired by the
exact sampling capabilities of normalizing flows, we revisit this challenge and
propose normalizing flow assisted importance sampling, termed NOFIS. NOFIS
first learns a sequence of proposal distributions associated with predefined
nested subset events by minimizing KL divergence losses. Next, it estimates the
rare event probability by utilizing importance sampling in conjunction with the
last proposal. The efficacy of our NOFIS method is substantiated through
comprehensive qualitative visualizations, affirming the optimality of the
learned proposal distribution, as well as a series of quantitative experiments
encompassing distinct test cases, which highlight NOFIS's superiority over
baseline approaches.
Comment: 16 pages, 5 figures, 2 tables
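The importance-sampling step that NOFIS builds on can be sketched with a fixed Gaussian proposal instead of a learned flow: to estimate the tiny tail probability P(X > 4) for X ~ N(0, 1), sample from a proposal shifted to the threshold, N(4, 1), and reweight by the density ratio. Everything below (names, threshold, sample count) is an illustrative assumption, not the paper's method.

```python
# Hedged sketch of importance sampling for a rare tail probability:
# estimate P(X > t) for X ~ N(0, 1) using a shifted proposal N(t, 1).
import math
import random

random.seed(0)

def is_estimate(threshold=4.0, n=200_000):
    total = 0.0
    for _ in range(n):
        x = random.gauss(threshold, 1.0)          # sample from N(t, 1)
        if x > threshold:
            # density ratio N(0,1)/N(t,1) = exp(t^2/2 - t*x)
            total += math.exp(threshold**2 / 2 - threshold * x)
    return total / n

p_hat = is_estimate()
p_true = 0.5 * math.erfc(4.0 / math.sqrt(2.0))    # ~3.17e-5 exactly
print(p_hat, p_true)
```

Plain Monte Carlo would need on the order of millions of samples to see even a handful of hits on this event; the shifted proposal makes nearly half the samples informative, which is the efficiency gain NOFIS pursues with learned, multi-stage proposals.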