
    Innovative Techniques for Testing and Diagnosing SoCs

    We rely on the continued functioning of many electronic devices for our everyday welfare, most of them embedding integrated circuits that are becoming ever cheaper and smaller while offering improved features. Nowadays, microelectronics can integrate a working computer with CPU, memories, and even GPUs on a single die, namely a System-on-Chip (SoC). SoCs are also employed in automotive safety-critical applications, where they must be tested thoroughly to comply with reliability standards, in particular the ISO 26262 functional safety standard for road vehicles. The goal of this Ph.D. thesis is to improve SoC reliability by proposing innovative techniques for testing and diagnosing their internal modules: CPUs, memories, peripherals, and GPUs. The proposed approaches, in the order in which they appear in this thesis, are the following.
    1. Embedded Memory Diagnosis: memories are dense and complex circuits that are susceptible to design and manufacturing errors, so it is important to understand how faults occur in the memory array. In practice, the logical and physical array representations differ because of design optimizations, notably scrambling. This part proposes an accurate memory diagnosis flow built around a software tool able to analyze test results, unscramble the memory array, map failing syndromes to cell locations, perform cumulative analyses, and elaborate a final fault model hypothesis. Several SRAM failing syndromes, gathered from an industrial automotive 32-bit SoC developed by STMicroelectronics, were analyzed as case studies. The tool displayed the defects virtually, and the results were confirmed by microscope photographs.
    2. Functional Test Pattern Generation: the key to a successful test is the pattern applied to the device. Patterns can be structural or functional: the former usually rely on embedded test modules targeting manufacturing defects and are effective only before the component is shipped to the client; the latter can be applied in mission mode with minimal impact on performance, but are penalized by long generation times. Functional test patterns, however, can serve different goals in mission mode. Part III of this thesis proposes three functional test pattern generation methods for CPU cores embedded in SoCs, each targeting a different test purpose:
    a. Functional Stress Patterns: suitable for maximizing functional stress during operational-life tests and burn-in screening, enabling an optimal device reliability characterization;
    b. Functional Power-Hungry Patterns: suitable for determining the functional peak power, used to strictly limit the power of structural patterns during manufacturing tests, thus reducing premature device over-kill while delivering high test coverage;
    c. Software-Based Self-Test (SBST) Patterns: combine the potential of structural patterns with functional ones, allowing periodic execution in mission mode. In addition, an external hardware module communicating with the devised SBST was proposed; it increases fault coverage by 3% by testing critical hardly functionally testable faults not covered by conventional SBST patterns.
    An automatic functional test pattern generation flow exploiting an evolutionary algorithm that maximizes metrics related to stress, power, and fault coverage was employed in all the above approaches to quickly generate the desired patterns. The approaches were evaluated on two industrial cases developed by STMicroelectronics: an 8051-based SoC and a 32-bit Power Architecture SoC. Results show that generation time was reduced by up to 75% compared with older methodologies while significantly improving the targeted metrics.
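    As a rough illustration of the evolutionary generation loop mentioned above, the sketch below evolves random instruction sequences against a placeholder fitness function; the instruction alphabet, the fitness definition, and all parameters are hypothetical stand-ins for the simulator-reported stress, power, or coverage metrics used in the actual flow.

```python
import random

# Hypothetical instruction alphabet standing in for a subset of the CPU's ISA.
INSTRUCTIONS = ["ADD", "SUB", "MUL", "LOAD", "STORE", "XOR", "NOP"]

def random_program(length=32):
    """Generate a random candidate test program (sequence of opcodes)."""
    return [random.choice(INSTRUCTIONS) for _ in range(length)]

def fitness(program):
    """Placeholder metric: in a real flow this would be switching activity,
    peak power, or fault coverage reported by a simulator."""
    return sum(1 for a, b in zip(program, program[1:]) if a != b)

def mutate(program, rate=0.1):
    """Randomly replace some instructions to explore the search space."""
    return [random.choice(INSTRUCTIONS) if random.random() < rate else op
            for op in program]

def evolve(pop_size=20, generations=50):
    population = [random_program() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]       # keep the fittest half
        children = [mutate(random.choice(parents)) for _ in parents]
        population = parents + children             # next generation
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", fitness(best))
```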
    3. Fault Injection in GPGPUs: fault injection mechanisms in semiconductor devices are suitable for generating structural patterns, testing and activating mitigation techniques, and validating robust hardware and software applications. GPGPUs are known for fast parallel computation and are used in high-performance computing and advanced driver assistance systems, where reliability is the key point. Moreover, GPGPU manufacturers do not provide design description code, for reasons of confidentiality; commercial fault injectors that rely on a GPGPU model are therefore unfeasible, leaving radiation tests as the only available resource, albeit a costly one. In the last part of this thesis, we propose a software-implemented fault injector able to inject bit-flips into memory elements of a real GPGPU. It exploits a software debugger and the C-CUDA grammar to select fault spots and apply bit-flip operations to program variables. The goal is to validate robust parallel algorithms by studying fault propagation or by activating the redundancy mechanisms they may embed. The effectiveness of the tool was evaluated on two robust applications: redundant parallel matrix multiplication and floating-point Fast Fourier Transform.
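    The core operation of such a software-implemented injector is the bit-flip itself. The following minimal sketch, which is not the thesis tool, applies a bit-flip to the IEEE-754 encoding of a single-precision value on the host; a debugger-driven injector as described above would perform the equivalent write on a variable resident in GPU memory.

```python
import struct

def flip_bit_float32(value, bit):
    """Flip one bit (0..31) in the IEEE-754 single-precision encoding of `value`."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", value))
    corrupted = as_int ^ (1 << bit)
    (result,) = struct.unpack("<f", struct.pack("<I", corrupted))
    return result

if __name__ == "__main__":
    x = 3.14
    for bit in (0, 23, 30):   # mantissa LSB, exponent LSB, high exponent bit
        print(f"bit {bit:2d}: {x} -> {flip_bit_float32(x, bit)}")
```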

    EEG-based cognitive control behaviour assessment: an ecological study with professional air traffic controllers

    Several models defining different types of cognitive human behaviour are available. For this work, we selected the Skill, Rule and Knowledge (SRK) model proposed by Rasmussen in 1983, which is currently widely used in safety-critical domains such as aviation. At present, there are no tools able to assess the level of cognitive control at which the operator is dealing with the task at hand, that is, whether he or she is performing the task as an automated routine (skill level), as a procedure-based activity (rule level), or as a problem-solving process (knowledge level). Several studies have tried to model SRK behaviours from a human-factors perspective; despite this, such behaviours have not yet been evaluated from a neurophysiological point of view, for example by considering brain activity variations across the different SRK levels. Therefore, the proposed study investigated the use of neurophysiological signals to assess cognitive control behaviours according to the SRK taxonomy. The results of the study, performed on 37 professional Air Traffic Controllers, demonstrated that specific brain features can characterize and discriminate the different SRK levels, thus enabling an objective assessment of the degree of cognitive control behaviour in a realistic setting.
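    As a purely illustrative sketch of how brain features could be used to discriminate the three SRK levels, the snippet below cross-validates an off-the-shelf classifier on synthetic placeholder data; the feature set, labels, and model choice are assumptions, not the pipeline used in the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic placeholder data: rows are trials, columns are EEG band-power
# features (e.g. frontal theta, parietal alpha); labels are the three
# SRK levels (0 = skill, 1 = rule, 2 = knowledge).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))
y = rng.integers(0, 3, size=120)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```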

    AI/ML Algorithms and Applications in VLSI Design and Technology

    An evident challenge ahead for the integrated circuit (IC) industry in the nanometer regime is the investigation and development of methods that can reduce the design complexity ensuing from growing process variations and curtail the turnaround time of chip manufacturing. Conventional methodologies employed for such tasks are largely manual and therefore time-consuming and resource-intensive. In contrast, the unique learning strategies of artificial intelligence (AI) provide numerous exciting automated approaches for handling complex and data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and machine learning (ML) algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process the data within and across different abstraction levels via automated learning algorithms. This, in turn, improves IC yield and reduces the manufacturing turnaround time. This paper thoroughly reviews the AI/ML automated approaches introduced so far for VLSI design and manufacturing. Moreover, we discuss the scope of AI/ML applications at various abstraction levels in the future to revolutionize the field of VLSI design, aiming for high-speed, highly intelligent, and efficient implementations.

    Exploring the Mysteries of System-Level Test

    System-level test, or SLT, is an increasingly important process step in today's integrated circuit testing flows. Broadly speaking, SLT aims at executing functional workloads in operational modes. In this paper, we consolidate the available knowledge about what SLT precisely is and why it is used despite its considerable costs and complexities. We discuss the types of failures covered by SLT, and outline approaches to quality assessment, test generation, and root-cause diagnosis in the context of SLT. Observing that the theoretical understanding of all these questions has not yet reached the level of maturity of the more conventional structural and functional test methods, we outline new and promising directions for methodical developments leveraging recent findings from software engineering.

    NOVEL STRATEGIES FOR THE MORPHOLOGICAL AND BIOMECHANICAL ANALYSIS OF THE CARDIAC VALVES BASED ON VOLUMETRIC CLINICAL IMAGES

    This work focused on the morphological and biomechanical analysis of the heart valves exploiting volumetric clinical data. Novel methods were implemented to segment cardiac valve structures and sub-structures by defining long-axis planes evenly rotated around the long axis of the valve. These methods were exploited to successfully reconstruct the 3D geometry of the mitral, tricuspid, and aortic valve structures. The reconstructed models were first used for morphological analysis, providing a detailed description of the geometry of the valve structures and computing novel indexes that could improve the description of the valvular apparatus and help its clinical assessment. Additionally, the models obtained for the mitral valve complex were adopted to develop a novel biomechanical approach to simulate the systolic closure of the valve, relying on highly efficient mass-spring models and thus obtaining a good trade-off between the accuracy and the computational cost of the numerical simulations. Specifically:
    • First, an innovative, semi-automated method was implemented to generate the 3D model of the aortic valve and of its calcifications, to quantitatively describe its 3D morphology, and to compute the anatomical aortic valve area (AVA) based on multi-detector computed tomography images. The comparison of the obtained results with effective AVA measurements showed a good correlation. Additionally, these methods accounted for asymmetries and anatomical derangements, which would be difficult to capture correctly through either effective AVA or planimetric AVA.
    • Second, a tool was developed to quantitatively assess the geometry of the tricuspid valve during the cardiac cycle using multi-detector CT, focusing in particular on the 3D spatial relationship between the tricuspid annulus and the right coronary artery. The morphological analysis of the annulus and leaflets confirmed data reported in the literature. The qualitative and quantitative analysis of this spatial relationship could standardize the analysis protocol and be pivotal in planning the percutaneous implantation of devices that interact with the tricuspid annulus.
    • Third, we simulated the systolic closure of three patient-specific mitral valve models, derived from CMR datasets, by means of the mass-spring model approach. The results were compared with finite element analyses (considered the gold standard) while tuning the parameters of the mass-spring model, so as to obtain the best trade-off between computational expense and accuracy. A configuration mismatch between the two models lower than twice the in-plane resolution of the starting imaging data was obtained with a mass-spring model set-up that requires, on average, only ten minutes to simulate valve closure.
    • Finally, in the last chapter, we performed a comprehensive analysis aimed at exploring the morphological and mechanical changes induced by myxomatous pathologies in the mitral valve tissue. The analysis of mitral valve thickness confirmed the data and patterns reported in the literature, while the mechanical tests accurately described the behavior of the pathological tissue. A preliminary implementation of these data into finite element simulations suggested that a more reliable patient-specific and pathology-specific characterization of the model could improve the realism and accuracy of the biomechanical simulations.
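    For intuition on the mass-spring approach mentioned above, here is a minimal, self-contained sketch of the explicit time integration at its core, applied to a toy two-node spring; the stiffness, damping, and time step are arbitrary placeholders unrelated to the patient-specific valve models.

```python
import numpy as np

def step(positions, velocities, edges, rest_lengths, k, mass, dt, damping=0.1):
    """One semi-implicit Euler step of a mass-spring system.
    positions, velocities: (N, 3) arrays; edges: list of (i, j) index pairs."""
    forces = np.zeros_like(positions)
    for (i, j), L0 in zip(edges, rest_lengths):
        d = positions[j] - positions[i]
        length = np.linalg.norm(d)
        if length > 0:
            f = k * (length - L0) * d / length   # Hooke's law along the spring
            forces[i] += f
            forces[j] -= f
    forces -= damping * velocities               # simple viscous damping
    velocities = velocities + dt * forces / mass
    positions = positions + dt * velocities
    return positions, velocities

# Toy example: two nodes connected by one stretched spring settle near
# the 1.0 rest length after enough damped steps.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros_like(pos)
for _ in range(2000):
    pos, vel = step(pos, vel, [(0, 1)], [1.0], k=10.0, mass=1.0, dt=0.01, damping=0.5)
print("final spring length:", np.linalg.norm(pos[1] - pos[0]))
```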

    NDE: An effective approach to improved reliability and safety. A technology survey

    Technical abstracts are presented for about 100 significant documents relating to nondestructive testing of aircraft structures or related structural testing and to the reliability of the more commonly used evaluation methods. Particular attention is directed toward acoustic emission, liquid penetrant, magnetic particle, ultrasonics, eddy current, and radiography. The introduction of the report includes an overview of the state of the art represented in the abstracted documents.

    Aeronautical Engineering: A special bibliography with indexes, supplement 64, December 1975

    This bibliography lists 288 reports, articles, and other documents introduced into the NASA scientific and technical information system in November 1975.

    A new perspective for the training assessment: Machine learning-based neurometric for augmented user's evaluation

    Inappropriate training assessment might have high social costs and economic impacts, especially in high-risk categories such as pilots, air traffic controllers, or surgeons. One of the current limitations of standard training assessment procedures is the lack of information about the amount of cognitive resources the user needs for the correct execution of the proposed task. In fact, even if the task is accomplished with maximum performance, standard training assessment methods cannot gather and evaluate information about the cognitive resources that remain available for dealing with unexpected events or emergency conditions. Therefore, a metric based on brain activity (a neurometric) able to provide the instructor with this kind of information would be very valuable. As a first step in this direction, the electroencephalogram (EEG) and the performance of 10 participants were collected over a 3-week training period while they learned to execute a new task. Specific indexes were estimated from the behavioral data and the EEG signals to objectively assess the users' training progress. Furthermore, we proposed a neurometric based on a machine learning algorithm to quantify the user's training level within each session by considering the level of task execution and both the behavioral and cognitive stabilities between consecutive sessions. The results demonstrated that the proposed methodology and neurometric can quantify and track the users' progress and provide the instructor with information for a more objective evaluation and better tailoring of training programs. © 2017 Borghini, Aricò, Di Flumeri, Sciaraffa, Colosimo, Herrero, Bezerianos, Thakor and Babiloni
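    As a toy illustration only (not the machine-learning neurometric proposed in the paper), the snippet below combines a normalized performance score with behavioral and EEG stability indexes between consecutive sessions into a single training-level value; the weights and the stability definition are assumptions.

```python
import numpy as np

def stability(curr, prev):
    """Similarity of two per-session index vectors in [0, 1]
    (1 = identical, 0 = maximally different). Placeholder definition."""
    curr, prev = np.asarray(curr, float), np.asarray(prev, float)
    return 1.0 - np.mean(np.abs(curr - prev) / (np.abs(curr) + np.abs(prev) + 1e-9))

def training_level(performance, behav_curr, behav_prev, eeg_curr, eeg_prev,
                   weights=(0.5, 0.25, 0.25)):
    """Toy score: weighted sum of normalized task performance, behavioral
    stability, and EEG-feature stability between consecutive sessions."""
    w_perf, w_behav, w_eeg = weights
    return (w_perf * performance
            + w_behav * stability(behav_curr, behav_prev)
            + w_eeg * stability(eeg_curr, eeg_prev))

# Example: session n vs. session n-1 (all values synthetic).
print(training_level(performance=0.9,
                     behav_curr=[12.1, 0.80], behav_prev=[12.4, 0.75],
                     eeg_curr=[0.62, 1.9], eeg_prev=[0.60, 1.7]))
```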

    On the test of single via related defects in digital VLSI designs

    Vias are critical for digital circuit manufacturing, as they represent a common defect location, and a general DfM rule suggests replicating every via instance for redundancy. When this is not achievable, it is mandatory that the remaining single vias be tested. We propose an automated method for generating tests and accurately evaluating test coverage of such defects, ready for use in any digital implementation flow and for integration within EDA tools, and also providing a useful quality metric. A prototype tool implementation and experimental results for an industrial case study are presented.
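    A minimal sketch of the bookkeeping behind such a flow, under the simplifying assumption that vias are identified by (net, layer-pair, index) tuples and that redundant vias share the same identifier: single vias are extracted as test targets and coverage is the fraction of them detected by the generated patterns. This is illustrative only, not the proposed tool.

```python
from collections import Counter

def single_vias(via_locations):
    """Vias that are not duplicated at the same location are 'single'
    and must be targeted by the test set (toy model of the DfM rule)."""
    counts = Counter(via_locations)
    return [v for v, n in counts.items() if n == 1]

def coverage(single, detected):
    """Fraction of single-via defects detected by the generated patterns."""
    targets = set(single)
    return len(targets & set(detected)) / len(targets) if targets else 1.0

# Toy example: via identifiers are (net, layer-pair, index) tuples.
vias = [("net_a", "M1-M2", 0), ("net_a", "M1-M2", 0),   # duplicated (redundant) via
        ("net_b", "M2-M3", 0), ("net_c", "M3-M4", 0)]   # single vias
singles = single_vias(vias)
print("single vias:", singles)
print("coverage:", coverage(singles, detected=[("net_b", "M2-M3", 0)]))
```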