15 research outputs found

    Substrate noise analysis and techniques for mitigation in mixed-signal RF systems

    Get PDF
    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references (p. 151-158).
    Mixed-signal circuit design has historically been a challenge for several reasons. Parasitic interactions between analog and digital systems on a single die are one such challenge. Switching transients induced by digital circuits inject noise into the common substrate, creating substrate noise. Analog circuits lack the large noise margins of digital circuits, making them susceptible to substrate voltage variations. This problem is exacerbated at higher frequencies, where the effectiveness of standard isolation techniques diminishes considerably. Historically, substrate noise was not a problem because each system was fabricated in its own package, shielding it from such interactions. The work in this thesis spans all areas of substrate noise: generation, propagation, and reception. A set of guidelines for designing isolation structures was developed to assist designers in optimizing these structures for a particular application. Furthermore, the effect of substrate noise on two key components of the RF front end, the voltage-controlled oscillator (VCO) and the low-noise amplifier (LNA), was analyzed. Finally, a CAD tool (SNAT) was developed to efficiently simulate large digital designs and determine their substrate noise performance. Existing techniques have prohibitively long simulation times and are only suitable for final verification. Determining substrate noise coupling during the design phase would be extremely beneficial to circuit designers, who could incorporate the effect of the noise and redesign accordingly before fabrication. This would reduce the turnaround time for circuits and prevent costly redesign. SNAT can be used at any stage of the design cycle to accurately predict (with less than 12% error compared to measurements) the substrate noise performance of any digital circuit with a large degree of computational efficiency.
    by Nisha Checka. Ph.D.
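As a rough, back-of-the-envelope illustration of the kind of coupling this thesis analyzes (the sketch below is not taken from the thesis, and all component values are hypothetical), the snippet treats the substrate between a switching digital aggressor and an analog victim node as a lumped resistive network, with a guard ring modeled as a low-impedance shunt to ground:

```python
# Minimal lumped-element illustration of substrate noise coupling.
# All values are hypothetical and are NOT taken from the thesis.
I_inject = 1e-3   # substrate current injected by digital switching (A), assumed
R_guard  = 5.0    # guard-ring resistance to a quiet ground (ohm), assumed
R_victim = 1e3    # substrate resistance from injection point to victim node (ohm), assumed
R_sub    = 50.0   # victim node's local substrate resistance to ground (ohm), assumed

# Current divider: the injected current splits between the guard ring and the
# series path through the substrate toward the victim node.
I_victim_with_ring = I_inject * R_guard / (R_guard + R_victim + R_sub)
I_victim_no_ring   = I_inject  # without a guard ring, the full current takes the victim path

V_with_ring = I_victim_with_ring * R_sub   # noise voltage at the victim node
V_no_ring   = I_victim_no_ring * R_sub

print(f"noise at victim: {V_with_ring*1e3:.3f} mV with guard ring, "
      f"{V_no_ring*1e3:.1f} mV without")
```

Even this toy model makes the qualitative point: a low-impedance guard-ring path can attenuate the noise reaching the victim node by orders of magnitude, and its benefit erodes as the ring's impedance to ground rises, consistent with the abstract's observation that standard isolation becomes less effective at higher frequencies.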

    Learning Approaches to Analog and Mixed Signal Verification and Analysis

    Get PDF
    The increased integration and interaction of analog and digital components within a system have amplified the need for a fast, automated, combined analog and digital verification methodology. There are many automated characterization, test, and verification methods used in practice for digital circuits, but analog and mixed-signal circuits suffer from long simulation times brought on by transistor-level analysis. Due to the substantial number of simulations required to properly characterize and verify an analog circuit, many undetected issues manifest themselves in the manufactured chips. Creating behavioral models, circuit abstractions of analog components, assists in reducing simulation time, which allows for faster exploration of the design space. Traditionally, creating behavioral models for non-linear circuits is a manual process that relies heavily on design knowledge for proper parameter extraction and circuit abstraction. Manual modeling requires a high level of circuit knowledge and often fails to capture critical effects stemming from block interactions and second-order device effects. For this reason, it is of interest to extract the models directly from the SPICE-level descriptions so that these effects and interactions can be properly captured. As devices are scaled, process variations have a more profound effect on circuit behavior and performance. Creating behavioral models from SPICE-level descriptions, which include input parameters and a large process-variation space, is a non-trivial task. In this dissertation, we focus on addressing various problems related to the design automation of analog and mixed-signal circuits. Analog circuits are typically highly specialized and finely tuned to fit the desired specifications for a given system, reducing the reusability of circuits from design to design. This hinders the advancement of automating various aspects of analog design, test, and layout. At the core of many automation techniques, simulations or data collection are required. Unfortunately, for some complex analog circuits, a single simulation may take many days. This prohibits performing any type of behavior characterization or verification of the circuit. This leads us to the first fundamental problem in the automation of analog devices: how can we reduce the simulation cost while maintaining the robustness of transistor-level simulations? As analog circuits can vary vastly from one design to the next and are hardly ever composed of standard, library-based building blocks, the second fundamental question is how to create automated processes that are general enough to be applied to all or most circuit types. Finally, what circuit characteristics can we utilize to enhance the automation procedures? The objective of this dissertation is to explore these questions and provide suitable evidence that they can be answered. We begin by exploring machine learning techniques to model the design space using minimal simulation effort. Circuit partitioning is employed to reduce the complexity of the machine learning algorithms. Using the same partitioning algorithm, we further explore the behavior characterization of analog circuits undergoing process variation. The circuit partitioning is general enough to be used by any CMOS-based analog circuit. The ideas and insights gained from behavioral modeling during behavior characterization are used to improve the simulation through event propagation, input-space search, and complexity and information measurements.
    The reduction of the input space and the behavioral modeling of low-complexity, low-information primitive elements reduce the simulation time of large analog and mixed-signal circuits by 50-75%. The method is extended and applied to assist in analyzing analog circuit layout. All of the proposed methods are implemented on analog circuits ranging from small benchmark circuits to large, highly complex and specialized circuits. The proposed dependency-based partitioning of large analog circuits in the time domain allows for fast identification of highly sensitive transistors and provides a natural division of circuit components. Modeling analog circuits in the time domain with this partitioning technique and SVM learning algorithms allows for very fast transient behavior predictions, three orders of magnitude faster than traditional simulators, while maintaining 95% accuracy. Analog verification can be explored through a reduction of simulation time by utilizing the partitions, the information and complexity measures, and input-space reduction. Behavioral models are created using supervised learning techniques for detected primitive elements. We show the effectiveness of the method on four analog circuits, where the simulation time is decreased by 55-75%. Utilizing the reduced simulation method, critical nodes can be found quickly and efficiently. The nodes found using this method match those found by an experienced layout engineer, but are detected automatically given the design and input specifications. The technique is further extended to find the tolerance of transistors to both process variation and power supply fluctuation. This information allows for corrections in layout overdesign or guidance in placing noise-reducing components such as guard rings or decoupling capacitors. The proposed approaches significantly reduce the simulation time required compared to performing these tasks traditionally, maintain high accuracy, and can be automated.
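As a rough illustration of the SVM-based behavioral modeling described above (this sketch is not taken from the dissertation; the features, data, and hyperparameters are invented for demonstration), the snippet below fits an RBF-kernel support vector regressor to synthetic transient samples of a single hypothetical circuit partition and then uses it as a cheap stand-in for a transistor-level transient step:

```python
# Illustrative sketch only: the dissertation's actual partitioning and tooling
# are not reproduced here. Synthetic SPICE-like transient samples are used, and
# scikit-learn's SVR stands in for the SVM learning step described above.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical training data: each row holds a partition's input features
# (input voltage, bias, previous output sample); the target is the partition's
# output voltage at the next time step.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))                            # [v_in, v_bias, v_out_prev]
y = np.tanh(2.0 * X[:, 0]) * (0.5 + 0.1 * X[:, 1]) + 0.8 * X[:, 2]   # toy nonlinear response

# Fit an RBF-kernel SVR as the behavioral model of one primitive element.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X, y)

# Evaluating the trained model replaces a transistor-level transient step for
# this partition, which is where the reported speedups would come from.
print(model.predict([[0.3, 0.0, 0.1]]))
```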

    AI/ML Algorithms and Applications in VLSI Design and Technology

    Full text link
    An evident challenge ahead for the integrated circuit (IC) industry in the nanometer regime is the investigation and development of methods that can reduce the design complexity ensuing from growing process variations and curtail the turnaround time of chip manufacturing. Conventional methodologies employed for such tasks are largely manual and therefore time-consuming and resource-intensive. In contrast, the unique learning strategies of artificial intelligence (AI) provide numerous exciting automated approaches for handling complex and data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and machine learning (ML) algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process data within and across different abstraction levels via automated learning algorithms. This, in turn, improves IC yield and reduces the manufacturing turnaround time. This paper thoroughly reviews the AI/ML automated approaches introduced in the past for VLSI design and manufacturing. Moreover, we discuss the future scope of AI/ML applications at various abstraction levels to revolutionize the field of VLSI design, aiming for high-speed, highly intelligent, and efficient implementations.

    Engineering Education and Research Using MATLAB

    Get PDF
    MATLAB is a software package used primarily in the field of engineering for signal processing, numerical data analysis, modeling, programming, simulation, and computer graphic visualization. In the last few years it has become widely accepted as an efficient tool, and its use has therefore increased significantly in scientific communities and academic institutions. This book consists of 20 chapters presenting research works that use MATLAB tools. Chapters include techniques for programming and developing Graphical User Interfaces (GUIs), dynamic systems, electric machines, signal and image processing, power electronics, mixed-signal circuits, genetic programming, digital watermarking, control systems, time-series regression modeling, and artificial neural networks.

    A Problem-Oriented Approach for Dynamic Verification of Heterogeneous Embedded Systems

    Get PDF
    This work presents a virtual prototyping methodology for the design and verification of industrial devices at the field level of industrial automation systems. It demonstrates that virtual prototypes can help increase confidence in the correctness of a design thanks to a deeper understanding of the complex interactions between the hardware, software, and analog and mixed-signal components of embedded systems and the physical processes they interact with.

    Dynamics of Macrosystems; Proceedings of a Workshop, September 3-7, 1984

    Get PDF
    There is an increasing awareness of the important and pervasive role that instability and random, chaotic motion play in the dynamics of macrosystems. Further research in the field should aim at providing useful tools, and therefore the motivation should come from important questions arising in specific macrosystems. Such systems include biochemical networks, genetic mechanisms, biological communities, neural networks, cognitive processes, and economic structures. This list may seem heterogeneous, but there are similarities between evolution in the different fields. It is not surprising that mathematical methods devised in one field can also be used to describe the dynamics of another. IIASA is attempting to make progress in this direction. With this aim in view, this workshop was held at Laxenburg over the period 3-7 September 1984. These Proceedings cover a broad canvas, ranging from specific biological and economic problems to general aspects of dynamical systems and evolutionary theory.

    Applications

    Get PDF