572 research outputs found

    AI/ML Algorithms and Applications in VLSI Design and Technology

    An evident challenge ahead for the integrated circuit (IC) industry in the nanometer regime is the investigation and development of methods that can reduce the design complexity ensuing from growing process variations and curtail the turnaround time of chip manufacturing. Conventional methodologies employed for such tasks are largely manual and thus time-consuming and resource-intensive. In contrast, the unique learning strategies of artificial intelligence (AI) provide numerous exciting automated approaches for handling complex and data-intensive tasks in very-large-scale integration (VLSI) design and testing. Employing AI and machine learning (ML) algorithms in VLSI design and manufacturing reduces the time and effort needed to understand and process the data within and across different abstraction levels via automated learning algorithms. This, in turn, improves IC yield and reduces the manufacturing turnaround time. This paper thoroughly reviews the AI/ML automated approaches previously introduced for VLSI design and manufacturing. Moreover, we discuss the scope of AI/ML applications at various abstraction levels in the future to revolutionize the field of VLSI design, aiming for high-speed, highly intelligent, and efficient implementations.

    NOVELTY DETECTION FOR PREDICTIVE MAINTENANCE

    Since the advent of Industry 4.0, significant research has been conducted to apply machine learning to the vast array of Internet of Things (IoT) data produced by industrial machines. One such topic is Predictive Maintenance. Unlike some other machine learning domains, such as NLP and computer vision, Predictive Maintenance is a relatively new area of focus. Most of the published work demonstrates the effectiveness of supervised classification for predictive maintenance. Some of the challenges highlighted in the literature are the cost and difficulty of obtaining labelled samples for training. Novelty detection is a branch of machine learning that, after being trained on normal operations, detects whether new data comes from the same process or is different, eliminating the requirement to label data points. This thesis applies novelty detection to both a public data set and one that was specifically collected to demonstrate its application to predictive maintenance. The Local Outlier Factor showed better performance than a One-Class SVM on the public data. It was then applied to data from a 3-D printer and was able to detect faults it had not been trained on, showing a slight lift over a random classifier.
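
    The thesis trains detectors on normal operating data only and flags departures from it. Purely as an illustration of that setup (not the thesis's actual pipeline, features, or data), a minimal scikit-learn sketch comparing a Local Outlier Factor in novelty mode against a One-Class SVM could look like the following; the sensor features and fault regime are invented for the example.

```python
# Illustrative novelty-detection sketch (not the thesis's pipeline or data).
# Both detectors are fitted on "normal" operation data only; no labels are needed.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_normal = rng.normal(loc=0.0, scale=1.0, size=(500, 4))      # hypothetical sensor features
X_new = np.vstack([rng.normal(0.0, 1.0, size=(20, 4)),        # more normal operation
                   rng.normal(5.0, 1.0, size=(5, 4))])        # simulated fault regime

# Local Outlier Factor in novelty mode: fit on normal data, score unseen points.
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(X_normal)

# One-Class SVM baseline for comparison.
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_normal)

# predict() returns +1 for inliers (normal) and -1 for novelties (potential faults).
print("LOF flags:   ", lof.predict(X_new))
print("OC-SVM flags:", ocsvm.predict(X_new))
```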

    How to Do Machine Learning with Small Data? -- A Review from an Industrial Perspective

    Artificial intelligence has experienced a technological breakthrough in science, industry, and everyday life in recent decades. The advancements can be credited to the ever-increasing availability and miniaturization of computational resources that resulted in exponential data growth. However, because of the insufficient amount of data in some cases, employing machine learning to solve complex tasks is not straightforward or even possible. As a result, machine learning with small data is of rising importance in data science and in applications across several fields. The authors focus on interpreting the general term "small data" and its role in engineering and industrial applications. They give a brief overview of the most important industrial applications of machine learning and small data. Small data is defined in terms of various characteristics compared to big data, and a machine learning formalism is introduced. Five critical challenges of machine learning with small data in industrial applications are presented: unlabeled data, imbalanced data, missing data, insufficient data, and rare events. Based on those definitions, an overview of the considerations in domain representation and data acquisition is given, along with a taxonomy of machine learning approaches in the context of small data.
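
    As a generic illustration of one of those challenges (imbalanced data with few samples), and not an example taken from the review itself, class weighting is a common mitigation; the sketch below uses scikit-learn with an invented, skewed data set.

```python
# Generic small-data illustration (imbalanced classes); not an example from the review.
# Class weighting rescales the loss so the rare class is not ignored by the classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                                  # small, hypothetical feature set
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 1.5).astype(int)   # rare positive class (~7%)

# class_weight="balanced" weights classes inversely to their frequency in y.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print("balanced accuracy per fold:", np.round(scores, 3))
```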

    A Multimodal Deep Learning-Based Fault Detection Model for a Plastic Injection Molding Process

    The authors of this work propose a deep learning-based fault detection model that can be implemented in the field of plastic injection molding. Compared to conventional approaches to fault detection in this domain, recent deep learning approaches prove useful for on-site problems involving complex underlying dynamics with a large number of variables. In addition, the advent of advanced sensors that generate data in multiple modalities prompts the need for multimodal learning with deep neural networks to detect faults, which can integrate information from the various modalities in an end-to-end learning fashion. The proposed deep learning-based approach opts for an early fusion scheme, in which the low-level feature representations of the modalities are combined. A case study involving real-world data, obtained from a car parts company and related to a car window side molding process, validates that the proposed model outperforms late fusion methods and conventional models in solving the problem.
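
    Purely as a sketch of the early fusion idea (invented modality dimensions and layer sizes, not the authors' architecture), low-level features from two modalities can be encoded separately and concatenated before a shared classification head, as in the following PyTorch snippet.

```python
# Minimal early-fusion sketch in PyTorch (hypothetical dimensions; not the paper's model).
import torch
import torch.nn as nn

class EarlyFusionNet(nn.Module):
    def __init__(self, dim_sensor=16, dim_image=64, hidden=32, n_classes=2):
        super().__init__()
        # One small encoder per modality, producing low-level feature representations.
        self.enc_sensor = nn.Sequential(nn.Linear(dim_sensor, hidden), nn.ReLU())
        self.enc_image = nn.Sequential(nn.Linear(dim_image, hidden), nn.ReLU())
        # Early fusion: concatenate the low-level features, then classify jointly.
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, n_classes))

    def forward(self, x_sensor, x_image):
        fused = torch.cat([self.enc_sensor(x_sensor), self.enc_image(x_image)], dim=1)
        return self.head(fused)

# Hypothetical batch: 8 samples of process-sensor features and flattened image features.
model = EarlyFusionNet()
logits = model(torch.randn(8, 16), torch.randn(8, 64))
print(logits.shape)  # torch.Size([8, 2]) -> normal vs. fault
```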

    Business analytics in industry 4.0: a systematic review

    Recently, the term “Industry 4.0” has emerged to characterize several Information and Communication Technology (ICT) adoptions in production processes (e.g., Internet-of-Things, implementation of digital production support information technologies). Business Analytics is often used within Industry 4.0, incorporating its data intelligence (e.g., statistical analysis, predictive modelling, optimization) expert system component. In this paper, we perform a Systematic Literature Review (SLR) on the usage of Business Analytics within the Industry 4.0 concept, covering a selection of 169 papers obtained from six major scientific publication sources from 2010 to March 2020. The selected papers were first classified into three major types, namely Practical Application, Reviews and Framework Proposal. Then, we analysed in more detail the practical application studies, which were further divided into the three main categories of the Gartner analytical maturity model: Descriptive Analytics, Predictive Analytics and Prescriptive Analytics. In particular, we characterized the distinct analytics studies in terms of the industry application and data context used, impact (in terms of their Technology Readiness Level) and selected data modelling method. Our SLR analysis provides a mapping of how data-based Industry 4.0 expert systems are currently used, while also disclosing research gaps and future research opportunities. The work of P. Cortez was supported by FCT - Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020. We would like to thank the three anonymous reviewers for their helpful suggestions.

    A Review of Bayesian Methods in Electronic Design Automation

    The utilization of Bayesian methods has been widely acknowledged as a viable solution for tackling various challenges in electronic integrated circuit (IC) design under stochastic process variation, including circuit performance modeling, yield/failure rate estimation, and circuit optimization. As the post-Moore era brings about new technologies (such as silicon photonics and quantum circuits), many of the associated issues are similar to those encountered in electronic IC design and can be addressed using Bayesian methods. Motivated by this observation, we present a comprehensive review of Bayesian methods in electronic design automation (EDA). By doing so, we hope to equip researchers and designers with the ability to apply Bayesian methods in solving stochastic problems in electronic circuits and beyond.
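
    As a toy illustration of one workflow such a review covers (not an example from the paper), a Gaussian-process surrogate can stand in for an expensive circuit simulation so that the failure rate under process variation is estimated by cheap Monte Carlo sampling; the metric, parameters, and spec below are all invented.

```python
# Toy Bayesian-surrogate sketch (invented circuit metric and spec; not from the paper):
# fit a Gaussian process to a few expensive "simulations", then Monte Carlo the surrogate
# over process variation to estimate a failure rate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

def circuit_metric(x):
    # Placeholder for an expensive simulation (e.g., a delay or gain measurement).
    return 1.0 + 0.8 * x[:, 0] - 0.5 * x[:, 1] ** 2 + 0.05 * rng.normal(size=len(x))

# A handful of "simulations" at sampled process-parameter points (e.g., Vth/tox shifts).
X_train = rng.normal(size=(40, 2))
y_train = circuit_metric(X_train)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

# Monte Carlo over process variation using the cheap surrogate instead of the simulator.
X_mc = rng.normal(size=(100_000, 2))
pred = gp.predict(X_mc)
spec = 0.0                          # hypothetical spec: metric must stay above 0
print("estimated failure rate:", float(np.mean(pred < spec)))
```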