
    Edge Detection and 3D Reconstruction Based on the Shape-from-Focus

    The work stems from an industrial project whose aim is to build a highly precise micro-component assembly machine. The components are positioned by locating edges in the image. An overview of edge detection techniques is presented together with the design of a Shape-From-Focus algorithm for the microscopic environment. The images used were captured with telecentric optics with a shallow depth of field. The Shape-From-Focus algorithm is developed together with a 3D convolutional mask for edge detection and an approximation of the surface in textureless areas; the developed 3D convolutional filter is based on the second derivative of the image function. Various edge detection techniques are used in experiments to calibrate the camera and to refocus the optics. The experiments also show the surface reconstruction obtained by the Shape-From-Focus algorithm.
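    The abstract describes the algorithm only at a high level. As a rough, hedged sketch of the general Shape-From-Focus idea (a second-derivative focus measure evaluated per pixel across a focal stack, with the best-focused slice giving the depth index), the following snippet uses a windowed absolute Laplacian as a stand-in focus measure; the actual 3D convolutional mask from the thesis is not reproduced, and all names here are illustrative.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def shape_from_focus(stack, window=9):
    """Estimate a per-pixel depth index from a focal stack.

    stack  : ndarray of shape (n_focus, H, W), one image per focus position
    window : side of the local window used to aggregate the focus measure
    """
    n_focus = stack.shape[0]
    focus = np.empty(stack.shape, dtype=float)
    for i in range(n_focus):
        # second-derivative response, averaged over a local window
        focus[i] = uniform_filter(np.abs(laplace(stack[i].astype(float))), size=window)
    depth_index = np.argmax(focus, axis=0)   # best-focused slice per pixel
    confidence = np.max(focus, axis=0)       # low values flag textureless areas
    return depth_index, confidence
```

    In textureless regions the confidence map stays low; the surface approximation described in the thesis could be mimicked, for example, by interpolating the depth map from high-confidence neighbours.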

    The discriminative functional mixture model for a comparative analysis of bike sharing systems

    Bike sharing systems (BSSs) have become a means of sustainable intermodal transport and are now proposed in many cities worldwide. Most BSSs also provide open access to their data, particularly to real-time status reports on their bike stations. The analysis of the mass of data generated by such systems is of particular interest to BSS providers to update system structures and policies. This work was motivated by interest in analyzing and comparing several European BSSs to identify common operating patterns in BSSs and to propose practical solutions to avoid potential issues. Our approach relies on the identification of common patterns between and within systems. To this end, a model-based clustering method for time series (or, more generally, functional data), called FunFEM, is developed. It is based on a functional mixture model that allows the clustering of the data in a discriminative functional subspace. In this context, the model has the advantage of being parsimonious and of allowing the visualization of the clustered systems. Numerical experiments confirm the good behavior of FunFEM, particularly compared to state-of-the-art methods. The application of FunFEM to BSS data from JCDecaux and the Transport for London Initiative allows us to identify 10 general patterns, including pathological ones, and to propose practical improvement strategies based on the system comparison. The visualization of the clustered data within the discriminative subspace turns out to be particularly informative regarding system efficiency. The proposed methodology is implemented in a package for the R software, named funFEM, which is available on CRAN. The package also provides a subset of the data analyzed in this work. Comment: Published at http://dx.doi.org/10.1214/15-AOAS861 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
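    FunFEM itself is an R package on CRAN; as a hedged illustration of the general model-based idea (expand each station's occupancy curve in a basis and cluster the coefficients with a mixture model), the sketch below uses a truncated Fourier basis and a Gaussian mixture. It is not the discriminative functional subspace of FunFEM, and names such as `profiles` are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def cluster_station_profiles(profiles, n_clusters=10, n_coeffs=8):
    """Cluster station occupancy time series.

    profiles : array of shape (n_stations, T), one loading curve per station
    Keeps the first few Fourier coefficients of each curve (a crude basis
    expansion), then fits a Gaussian mixture to those coefficients.
    """
    coeffs = np.fft.rfft(profiles, axis=1)[:, :n_coeffs]
    features = np.hstack([coeffs.real, coeffs.imag])   # real-valued design matrix
    gmm = GaussianMixture(n_components=n_clusters, covariance_type="full", random_state=0)
    return gmm.fit_predict(features)                   # cluster label per station
```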

    Course development in IC manufacturing

    A traditional curriculum in electrical engineering separates semiconductor processing courses from courses in circuit design. As a result, manufacturing topics involving yield management and the study of random process variations impacting circuit behaviour are usually treated only vaguely. This paper reports on a course developed at Texas A&M University, USA, to compensate for the aforementioned shortcoming. The course attempts to link the technological process and circuit design domains by emphasizing aspects such as process disturbance modeling, yield modeling, and defect-induced fault modeling. In a rapidly changing environment where high-end technologies are evolving towards submicron features and high transistor integration, these aspects are key factors in design for manufacturability. The paper presents the course's syllabus, a description of its main topics, and results on selected project assignments carried out during a normal academic semester.
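    The yield-modeling topics named above map onto standard defect-density formulas; as a small, hedged illustration (not material taken from the course itself), the snippet below compares the Poisson and negative-binomial yield models for an assumed die area and defect density.

```python
import math

def poisson_yield(area_cm2, defect_density):
    """Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-area_cm2 * defect_density)

def negative_binomial_yield(area_cm2, defect_density, alpha=2.0):
    """Negative-binomial (clustered defects): Y = (1 + A*D0/alpha)**(-alpha)."""
    return (1.0 + area_cm2 * defect_density / alpha) ** (-alpha)

# illustrative numbers: a 1 cm^2 die with 0.5 defects/cm^2
A, D0 = 1.0, 0.5
print(f"Poisson yield:           {poisson_yield(A, D0):.3f}")
print(f"Negative-binomial yield: {negative_binomial_yield(A, D0):.3f}")
```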

    A Review of Bayesian Methods in Electronic Design Automation

    The utilization of Bayesian methods has been widely acknowledged as a viable solution for tackling various challenges in electronic integrated circuit (IC) design under stochastic process variation, including circuit performance modeling, yield/failure rate estimation, and circuit optimization. As the post-Moore era brings about new technologies (such as silicon photonics and quantum circuits), many of the associated issues are similar to those encountered in electronic IC design and can be addressed using Bayesian methods. Motivated by this observation, we present a comprehensive review of Bayesian methods in electronic design automation (EDA). By doing so, we hope to equip researchers and designers with the ability to apply Bayesian methods to solving stochastic problems in electronic circuits and beyond. Comment: 24 pages, a draft version. We welcome comments and feedback, which can be sent to [email protected]
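    As one minimal, hedged example of a problem named in this abstract (failure-rate estimation from a limited number of Monte Carlo samples under process variation), the snippet below applies a Beta-Binomial conjugate update. It illustrates Bayesian inference in general rather than any specific method surveyed in the review.

```python
from scipy import stats

def failure_rate_posterior(n_fail, n_sims, a_prior=1.0, b_prior=1.0):
    """Beta-Binomial update for a circuit failure probability.

    With a Beta(a, b) prior and n_fail failures observed in n_sims Monte
    Carlo samples of the process variation, the posterior is
    Beta(a + n_fail, b + n_sims - n_fail).
    """
    post = stats.beta(a_prior + n_fail, b_prior + n_sims - n_fail)
    lo, hi = post.interval(0.95)          # 95% credible interval
    return post.mean(), (lo, hi)

# e.g. 3 failing samples out of 1000 simulations
print(failure_rate_posterior(3, 1000))
```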

    Index to NASA Tech Briefs, 1975

    This index contains abstracts and four indexes--subject, personal author, originating Center, and Tech Brief number--for 1975 Tech Briefs.

    An introduction to low-level analysis methods of DNA microarray data

    This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the considerable noise, and hence random features, in the data. For all of these components, access to a flexible and efficient statistical computing environment is an essential aspect.
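    Of the low-level steps listed above, normalization of fluorescence intensities is the easiest to illustrate. The hedged sketch below shows quantile normalization across arrays, one common choice among the many methods the article covers, assuming an intensity matrix with genes in rows and arrays in columns; ties in rank are ignored for simplicity.

```python
import numpy as np

def quantile_normalize(intensities):
    """Quantile-normalize a (genes x arrays) intensity matrix.

    Every array is forced onto the same empirical distribution: rank each
    value within its column, average the sorted columns rank by rank, then
    map every value to the mean of its rank.
    """
    ranks = np.argsort(np.argsort(intensities, axis=0), axis=0)  # within-column ranks
    mean_by_rank = np.sort(intensities, axis=0).mean(axis=1)     # reference distribution
    return mean_by_rank[ranks]

# toy example: 4 genes measured on 3 arrays (log2 intensities)
x = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
print(quantile_normalize(x))
```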

    Multi-level analysis of atomic layer deposition barrier coatings on additively manufactured plastics for high vacuum applications

    While hardware innovations in micro/nano electronics and photonics are heavily patented, the rise of the open-source movement has significantly shifted focus to the importance of obtaining low-cost, functional and easily modifiable research equipment. This thesis provides a foundation for open-source development of equipment to aid the micro/nano electronics and photonics fields. First, the widespread adoption of the open-source Arduino microcontroller has aided the development of control systems with a wide variety of uses. Here it is used to develop an open-source dual-axis gimbal system, which is used to characterize the optoelectronic properties of thin transparent films at varying angles. Conventionally, the ubiquity of vacuum systems in semiconductor fabrication has precluded open-source development in the "fab" environment, which therefore carries high foundational and operational costs. To make vacuum systems and their components cost-effective in a research environment, there has been a paradigm shift towards refurbishing and repairing legacy systems instead of replacing them. These legacy systems were built, and operate, on the premise that the vacuum industry is small, and hence that only a small number of sizes and types of parts may be used to reduce costs. That assumption is no longer valid: the semiconductor industry alone, which is a subset of the vacuum industry, was worth over USD 481 billion and increasing. Hence, there is a need to introduce not only new methods but also new materials for these systems. Additive manufacturing is a low-waste, low-capital-cost way to make custom equipment. The most popular materials used in additive manufacturing processes are polymer blends, and 3-D printing using Fused Filament Fabrication (FFF) methods has been used to create custom objects for laboratories. However, polymer-based materials are conspicuously absent from vacuum systems, especially those used for semiconductor fabrication. Two major problems arise when polymeric materials are used to make vacuum systems: preventing outgassing (which can subsequently lead to contamination), and sealing them so that they can hold a vacuum. This work demonstrates how an inorganic barrier layer introduced via Atomic Layer Deposition (ALD) can alleviate outgassing to a large extent at high vacuum levels (1E-6 to 1E-7 torr). Recognizing the importance of ALD alumina in back-end-of-the-line (BEOL) semiconductor processing, films were deposited on 3-D printed polymer-based substrates with differing constituents. These samples were tested in a bespoke gas analysis chamber for outgassing characterization. Surface and bulk characterization was completed using tools such as scanning electron microscopy (SEM), energy dispersive x-ray analysis (EDX), x-ray photoelectron spectroscopy (XPS), attenuated total reflectance - Fourier transform infrared spectroscopy (ATR-FTIR) and others. Additionally, spectroscopic ellipsometry (SE) was used to show how the thickness of a film deposited on a porous polymer-based sample does not correlate directly with its conventional definition. An effort is also made to understand the mechanism of ALD alumina deposition on porous plastic surfaces; it was concluded that this deposition is a complex amalgamation of the physical and chemical properties of both the polymer and the precursor gases. Finally, recommendations are made for AM materials to be used in vacuum systems.
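    The vacuum arithmetic behind the outgassing problem described above can be sketched with the usual steady-state relation P ≈ Q/S, where Q is the total gas load and S the effective pumping speed. The numbers below are illustrative placeholders, not measurements from the thesis.

```python
def base_pressure_torr(area_cm2, outgassing_rate, pump_speed_l_per_s):
    """Ultimate pressure estimate P = Q / S.

    area_cm2           : exposed surface area of the material (cm^2)
    outgassing_rate    : specific outgassing rate (Torr*L / s / cm^2)
    pump_speed_l_per_s : effective pumping speed at the chamber (L/s)
    """
    q_total = area_cm2 * outgassing_rate      # total gas load (Torr*L/s)
    return q_total / pump_speed_l_per_s

# illustrative comparison: bare 3-D printed part vs an ALD-coated one
# (outgassing rates are placeholder orders of magnitude, not thesis data)
print(base_pressure_torr(500, 1e-7, 300))     # ~1.7e-7 Torr, uncoated
print(base_pressure_torr(500, 1e-9, 300))     # ~1.7e-9 Torr, with barrier layer
```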

    Fault Classification of Nonlinear Small Sample Data through Feature Sub-Space Neighbor Vote

    The fault classification of a small sample of high dimension is challenging, especially for a nonlinear and non-Gaussian manufacturing process. In this paper, a similarity-based feature selection and sub-space neighbor vote method is proposed to solve this problem. To capture the dynamics, nonlinearity, and non-Gaussianity in the irregular time series data, high-order spectral features and fractal dimension features are extracted, selected, and stacked in a regular matrix. To address the problem of a small sample, all labeled fault data are used for the similarity decision for a specific fault type. The distances between the new data and all fault types are calculated in their feature subspaces, and the new data are classified to the nearest fault type by majority probability voting of the distances. Meanwhile, the selected features, from the respective measured variables, indicate the cause of the fault. The proposed method is evaluated on a publicly available benchmark of a real semiconductor etching dataset. It is demonstrated that, by using the high-order spectral features and fractal dimensionality features, the proposed method achieves more than 84% fault recognition accuracy. The resulting feature subspace can be used to match any new fault data to the fingerprint feature subspace of each fault type, and hence can pinpoint the root cause of a fault in a manufacturing process.
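    The classification step described above (distances from new data to each fault type computed inside that type's own feature subspace, followed by a vote) can be sketched as follows. The feature extraction and similarity-based feature selection are assumed to have already produced a feature matrix and per-fault subspace indices; the vote here is a simplified mean-of-nearest-distances rule rather than the paper's exact majority probability voting, and all names are illustrative.

```python
import numpy as np

def classify_fault(x_new, train_features, train_labels, subspaces, k=5):
    """Nearest-fault-type vote in per-class feature subspaces.

    x_new          : 1-D feature vector of the new sample
    train_features : (n_samples, n_features) matrix of labeled fault data
    train_labels   : array of fault-type labels, one per row of train_features
    subspaces      : dict mapping fault type -> indices of its selected features
    k              : number of nearest labeled samples used in the vote
    """
    votes = {}
    for fault, idx in subspaces.items():
        members = train_features[train_labels == fault][:, idx]
        d = np.linalg.norm(members - x_new[idx], axis=1)   # distances in this fault's subspace
        votes[fault] = np.sort(d)[:min(k, len(d))].mean()  # small mean distance = strong vote
    return min(votes, key=votes.get)                        # fault type with the closest neighbours
```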