Combining business process and failure modelling to increase yield in electronics manufacturing
The prediction and capture of defects in low-volume assembly of electronics is
a technical challenge and a prerequisite for design for manufacturing (DfM) and business
process improvement (BPI) to increase first-time yields and reduce production costs. Failures
at the component level (component defects) and the system level (such as defects in design and
manufacturing) have not been incorporated into combined prediction models. BPI efforts should
have predictive capability while supporting flexible production and changes in business models.
This research was aimed at the integration of enterprise modelling (EM) and failure models (FM)
to support business decision making by predicting system-level defects. An enhanced business
modelling approach which provides a set of accessible failure models at a given business process
level is presented in this article. This model-driven approach allows the evaluation of product
and process performance and hence provides feedback to design and manufacturing activities,
improving first-time yield and product quality. A case study in the low-volume, high-complexity electronics
assembly industry shows how the approach leverages standard modelling techniques
and facilitates the understanding of the causes of poor manufacturing performance using a
set of surface mount technology (SMT) process failure models. A prototype application tool
was developed and tested at a collaborator's site to evaluate the integration of business process
models with execution entities such as software tools, business databases, and simulation
engines. The proposed concept was tested on defect data collection and prediction in the
described case study.
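The kind of yield prediction the abstract describes can be illustrated with a standard first-pass-yield calculation (a minimal sketch, not the authors' actual model): given defects-per-unit (DPU) contributions from component-level and SMT process-level failure modes, the Poisson approximation gives FPY ≈ e^(−DPU). The failure-mode names and rates below are invented for illustration.

```python
import math

# Hypothetical defect rates (defects per unit) for illustration only;
# the article's SMT process failure models are far more detailed.
component_dpu = {"resistor_open": 0.002, "capacitor_short": 0.001}
process_dpu = {"solder_bridge": 0.010, "tombstoning": 0.004, "misalignment": 0.003}

def first_pass_yield(*dpu_sources):
    """Poisson approximation of first-pass yield: FPY = exp(-total DPU)."""
    total_dpu = sum(sum(rates.values()) for rates in dpu_sources)
    return math.exp(-total_dpu)

fpy = first_pass_yield(component_dpu, process_dpu)
print(f"Predicted first-time yield: {fpy:.3f}")  # total DPU = 0.020 -> 0.980
```

Combining component-level and process-level rates in one total DPU is what lets design and manufacturing feedback share a single yield figure, as the integrated approach intends.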
A Hierarchical, Fuzzy Inference Approach to Data Filtration and Feature Prioritization in the Connected Manufacturing Enterprise
The current big data landscape is one in which the technology and capability to capture and store data have preceded and outpaced the corresponding capability to analyze and interpret it. This has led naturally to the development of elegant and powerful algorithms for data mining, machine learning, and artificial intelligence to harness the potential of the big data environment. A competing reality, however, is that limitations exist in how and to what extent human beings can process complex information. The convergence of these realities creates a tension between the technical sophistication or elegance of a solution and its transparency or interpretability for the human data scientist or decision maker. This dissertation, contextualized in the connected manufacturing enterprise, presents an original Fuzzy Approach to Feature Reduction and Prioritization (FAFRAP) designed to assist the data scientist in filtering and prioritizing data for inclusion in supervised machine learning models. A set of sequential filters reduces the initial set of independent variables, and a fuzzy inference system outputs a crisp numeric value associated with each feature to rank-order and prioritize features for inclusion in model training. Additionally, the fuzzy inference system outputs a descriptive label to assist in the interpretation of each feature's usefulness with respect to the problem of interest. Model testing is performed using three publicly available datasets from an online machine learning data repository and later applied to a case study in electronic assembly manufacture. Consistency of model results is experimentally verified using Fisher's Exact Test, and results of filtered models are compared to results obtained by the unfiltered sets of features using a proposed novel metric, the performance-size ratio (PSR).
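The core mechanism the abstract describes, a fuzzy inference step that turns a feature-relevance score into a crisp priority plus a descriptive label, can be sketched as follows. This is a generic Mamdani-style illustration, not the FAFRAP system itself: the membership functions, rule outputs, and feature scores are invented for demonstration.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def prioritize(relevance):
    """Map a feature-relevance score in [0, 1] to a crisp priority and a label."""
    # Fuzzify: degree of membership in three overlapping relevance sets.
    low = tri(relevance, -0.5, 0.0, 0.5)
    med = tri(relevance, 0.0, 0.5, 1.0)
    high = tri(relevance, 0.5, 1.0, 1.5)
    # Defuzzify: weighted average of per-rule output levels (illustrative values).
    crisp = (low * 0.2 + med * 0.5 + high * 0.9) / (low + med + high)
    # Descriptive label from the strongest-firing rule.
    label = max((low, "weak"), (med, "moderate"), (high, "strong"))[1]
    return crisp, label

# Hypothetical relevance scores, e.g. from upstream sequential filters.
features = {"x1": 0.85, "x2": 0.40, "x3": 0.10}
for name in sorted(features, key=lambda f: prioritize(features[f])[0], reverse=True):
    crisp, label = prioritize(features[name])
    print(f"{name}: priority={crisp:.2f} ({label})")
```

The crisp value supports rank-ordering features for model training, while the label ("weak"/"moderate"/"strong" here) plays the interpretability role the abstract assigns to the descriptive output.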