77 research outputs found

    Pulse Jitter, Delay Spread, and Doppler Shift in Mode-Stirred Reverberation

    Uncertainty Quantification for Electromagnetic Analysis via Efficient Collocation Methods.

    Electromagnetic (EM) devices and systems are often fraught with uncertainty in their geometry, configuration, and excitation. These uncertainties (often termed “random variables”) strongly and nonlinearly affect the voltages and currents on mission-critical circuits or receivers (often termed “observables”). To ensure the functionality of such circuits or receivers, this dependency should be statistically characterized. In this thesis, efficient collocation methods for uncertainty quantification in EM analysis are presented. First, a Stroud-based stochastic collocation method is introduced to statistically characterize electromagnetic compatibility and interference (EMC/EMI) phenomena on electrically large and complex platforms. Second, a multi-element probabilistic collocation (ME-PC) method suitable for characterizing rapidly varying and/or discontinuous observables is presented; its applications to the statistical characterization of EMC/EMI phenomena on electrically large and complex platforms and of transverse magnetic wave propagation in complex mine environments are demonstrated. In addition, the ME-PC method is applied to the statistical characterization of EM wave propagation in complex mine environments with the aid of a novel fast multipole method and fast Fourier transform-accelerated surface integral equation solver -- the first-ever full-wave solver capable of characterizing EM wave propagation in mine tunnels that are hundreds of wavelengths long. Finally, an iterative high-dimensional model representation technique is proposed to statistically characterize EMC/EMI observables that involve a large number of random variables. The application of this technique to the genetic algorithm-based optimization of EM devices is presented as well.
    PhD, Electrical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/100086/1/acyucel_1.pd
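    The collocation idea underlying these methods can be illustrated with a minimal sketch: the observable is evaluated only at a set of chosen nodes in the random-variable space, and the node weights then yield its statistical moments. The sketch below uses a generic tensor-product Gauss-Hermite rule and a made-up scalar observable; both are illustrative assumptions, not the Stroud or ME-PC rules or the full-wave EM solvers described in the thesis.

```python
# Minimal stochastic-collocation sketch (illustrative assumptions only):
# two standard-normal random variables, a hypothetical observable standing
# in for a full-wave EM solver, and a tensor-product Gauss-Hermite rule.
import numpy as np
from itertools import product

def observable(x1, x2):
    # Hypothetical "received voltage" depending nonlinearly on two
    # normalized random inputs; a real application would run an EM solver.
    return np.cos(1.3 * x1) * np.exp(0.4 * x2)

# Probabilists' Gauss-Hermite nodes/weights (weight exp(-x^2/2)); dividing
# by sqrt(2*pi) makes the weights integrate against a standard normal density.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / np.sqrt(2.0 * np.pi)

mean = 0.0
second_moment = 0.0
for (x1, w1), (x2, w2) in product(zip(nodes, weights), repeat=2):
    value = observable(x1, x2)
    mean += w1 * w2 * value
    second_moment += w1 * w2 * value**2

variance = second_moment - mean**2
print(f"mean = {mean:.4f}, variance = {variance:.4f}")
```

    The Stroud and ME-PC methods described above replace this tensor-product grid with far fewer or locally refined nodes, which is what keeps the number of expensive solver runs manageable when the random variables are numerous or the observables are discontinuous.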

    Design of textile antennas and flexible WBAN sensor systems for body-worn localization using impulse radio ultra-wideband

    Learning Approaches to Analog and Mixed Signal Verification and Analysis

    The increased integration and interaction of analog and digital components within a system have amplified the need for a fast, automated, combined analog and digital verification methodology. Many automated characterization, test, and verification methods are used in practice for digital circuits, but analog and mixed signal circuits suffer from long simulation times brought on by transistor-level analysis. Because of the substantial number of simulations required to properly characterize and verify an analog circuit, many undetected issues manifest themselves in the manufactured chips. Creating behavioral models, circuit abstractions of analog components, helps reduce simulation time and allows faster exploration of the design space. Traditionally, creating behavioral models for non-linear circuits is a manual process that relies heavily on design knowledge for proper parameter extraction and circuit abstraction. Manual modeling requires a high level of circuit knowledge and often fails to capture critical effects stemming from block interactions and second-order device effects. For this reason, it is of interest to extract the models directly from the SPICE-level descriptions so that these effects and interactions can be properly captured. As devices are scaled down, process variations have a more profound effect on circuit behaviors and performance, and creating behavioral models from SPICE-level descriptions that include input parameters and a large process-variation space is a non-trivial task.
    In this dissertation, we address various problems related to the design automation of analog and mixed signal circuits. Analog circuits are typically highly specialized and fine-tuned to fit the desired specifications of any given system, which reduces the reusability of circuits from design to design and hinders the automation of analog design, test, and layout. At the core of many automation techniques, simulations or data collection are required; unfortunately, for some complex analog circuits a single simulation may take many days, which prohibits any kind of behavior characterization or verification of the circuit. This leads to the first fundamental problem in the automation of analog devices: how can we reduce the simulation cost while maintaining the robustness of transistor-level simulations? Because analog circuits vary vastly from one design to the next and are hardly ever composed of standard library-based building blocks, the second fundamental question is how to create automated processes general enough to be applied to all or most circuit types. Finally, what circuit characteristics can we exploit to enhance the automation procedures? The objective of this dissertation is to explore these questions and provide evidence that they can be answered. We begin by exploring machine learning techniques to model the design space using minimal simulation effort, employing circuit partitioning to reduce the complexity of the machine learning algorithms. Using the same partitioning algorithm, we further explore the behavior characterization of analog circuits undergoing process variation; the partitioning is general enough to be applied to any CMOS-based analog circuit. The insights gained from behavioral modeling during behavior characterization are then used to speed up simulation through event propagation, input-space search, and complexity and information measurements.
    Reducing the input space and behaviorally modeling low-complexity, low-information primitive elements cuts the simulation time of large analog and mixed signal circuits by 50-75%. The method is extended and applied to assist in analyzing analog circuit layout. All of the proposed methods are demonstrated on analog circuits ranging from small benchmark circuits to large, highly complex and specialized circuits. The proposed dependency-based partitioning of large analog circuits in the time domain allows fast identification of highly sensitive transistors and provides a natural division of circuit components. Modeling analog circuits in the time domain with this partitioning technique and SVM learning algorithms allows very fast transient behavior predictions, three orders of magnitude faster than traditional simulators, while maintaining 95% accuracy. Analog verification is made tractable by reducing simulation time through the partitions, the information and complexity measures, and the input-space reduction; behavioral models are created with supervised learning techniques for the detected primitive elements, and the effectiveness of the method is shown on four analog circuits whose simulation time is decreased by 55-75%. Using the reduced simulation method, critical nodes can be found quickly and efficiently; the nodes found match those identified by an experienced layout engineer but are detected automatically from the design and input specifications. The technique is further extended to find the tolerance of transistors to both process variation and power-supply fluctuation, information that guides corrections to layout overdesign or the placement of noise-reducing components such as guard rings or decoupling capacitors. The proposed approaches significantly reduce the simulation time that these tasks traditionally require, maintain high accuracy, and can be automated
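    As a rough illustration of the learning step described above (not the dissertation's actual tool flow), the sketch below fits a support vector regressor to samples of one partitioned block's transient response so that the block can later be evaluated without transistor-level simulation. The partition_response function, its input ranges, and the SVR hyperparameters are illustrative assumptions standing in for SPICE data and tuned settings.

```python
# Minimal sketch: train an SVM regressor as a behavioral model for one
# partitioned analog block. Synthetic data stands in for SPICE output.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

def partition_response(v_in, v_dd, t):
    # Hypothetical nonlinear response of one circuit partition, standing in
    # for a transistor-level transient simulation of that block.
    return np.tanh(3.0 * v_in) * (v_dd / 1.8) + 0.05 * np.sin(2e9 * t)

# Training samples: input voltage, supply voltage (models supply spread),
# and time points within the transient window of interest.
X = np.column_stack([
    rng.uniform(-1.0, 1.0, 2000),   # input voltage (V)
    rng.uniform(1.6, 2.0, 2000),    # supply voltage (V)
    rng.uniform(0.0, 5e-9, 2000),   # time (s)
])
y = partition_response(X[:, 0], X[:, 1], X[:, 2])

# One behavioral model per partition; evaluating it replaces the slow
# transistor-level simulation of that block during system-level analysis.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=1e-3))
model.fit(X, y)

test = np.array([[0.25, 1.8, 1e-9]])
print("predicted output:", model.predict(test)[0])
print("reference output:", partition_response(0.25, 1.8, 1e-9))
```

    In practice one such model would be trained per detected primitive element, and the system-level simulation would call these models in place of the slow transistor-level blocks.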

    Novel Approaches for Nondestructive Testing and Evaluation

    Nondestructive testing and evaluation (NDT&E) is one of the most important techniques for determining the quality and safety of materials, components, devices, and structures. NDT&E technologies include ultrasonic testing (UT), magnetic particle testing (MT), magnetic flux leakage testing (MFLT), eddy current testing (ECT), radiation testing (RT), penetrant testing (PT), and visual testing (VT), all of which are widely used throughout modern industry. However, some NDT processes, such as those for cleaning specimens and removing paint, cause environmental pollution and must be carried out under constrained conditions of time, space, and sensor selection; NDT&E is therefore classified as a typical 3D (dirty, dangerous, and difficult) job. In addition, NDT operators judge the presence of damage based on experience and subjective judgment, so in some cases a flaw may go undetected during the test. To obtain clearer test results, operators should therefore be given a means of identifying flaws more easily, and the test results should be organized systematically so that the cause of an abnormality in the test specimen can be identified and the progression of the damage quantified

    Air Force Institute of Technology Research Report 2003

    This report summarizes the research activities of the Air Force Institute of Technology’s Graduate School of Engineering and Management. It describes research interests and faculty expertise; lists student theses/dissertations; identifies research sponsors and contributions; and outlines the procedures for contacting the school. Included in the report are: faculty publications, conference presentations, consultations, and funded research projects. Research was conducted in the areas of Aeronautical and Astronautical Engineering, Electrical Engineering and Electro-Optics, Computer Engineering and Computer Science, Systems and Engineering Management, Operational Sciences, and Engineering Physics

    Low-profile antenna systems for the Next-Generation Internet of Things applications

    18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems: Proceedings

    Proceedings of the 18th IEEE Workshop on Nonlinear Dynamics of Electronic Systems, held in Dresden, Germany, 26 – 28 May 2010. Contents: Welcome Address (page I); Table of Contents (page III); Symposium Committees (page IV); Special Thanks (page V); Conference Program, including page numbers of papers (page VI); Conference Papers: Invited Talks (page 1), Regular Papers (page 14), Wednesday, 26 May 2010 (page 15), Thursday, 27 May 2010 (page 110), Friday, 28 May 2010 (page 210); Author Index (page XII)

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predicting and analysing natural and complex systems in science and engineering. As their level of abstraction is raised to give a better discernment of the domain at hand, their representations become increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest to these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications